Essentials of Deep Learning

The Essentials of Deep Learning course (MAS5005) is a 3-credit course that requires a prerequisite in Machine Learning. It aims to provide students with knowledge in machine learning basics, optimization of deep models, and understanding of various neural network architectures. The course includes lectures, practical experiments, and assessments, with a focus on both foundational and advanced deep learning methods.


Course Title: Essentials of Deep Learning
Course Type: LP
Course Code: MAS5005
Credits: 3
Prerequisite: Machine Learning
Course Objectives:
1. Gain knowledge of Machine Learning basics
2. Understand and apply optimization on deep models and networks
3. Understand and analyze recurrent and recursive networks
4. Understand the representation of neural networks in machine learning
Course Outcomes (CO):

At the end of the course, students should be able to:

CO1. Explore the fundamentals of Machine Learning algorithms.
CO2. Analyze deep learning mathematical models.
CO3. Elucidate deep feedforward networks.
CO4. Apply knowledge of optimization on deep models and convolutional networks.
CO5. Elucidate recurrent and recursive networks, and Natural Language Processing.
Correlation of COs with POs

CO \ PO   CKL   PO1   PO2   PO3   PSO1  PSO2
PKL             3     5     6     3     3
CO1       2     3     2     1     3     3
CO2       2     3     2     2     3     3
CO3       2     3     2     2     3     3
CO4       3     3     2     2     3     3
CO5       3     3     2     2     3     3
CO Topics to be discussed Hrs.

CO1 (06 Hrs.) Introduction: Historical Trends in Deep Learning. Linear Algebra: Scalars - Vectors - Matrices - Tensors - Norms - Eigendecomposition. Probability and Information Theory: Random Variables and Probability Distributions - Bayes' Rule - Information Theory and Structured Probabilistic Models. Machine Learning: Introduction to supervised and unsupervised learning. Numerical Computation: Overflow and Underflow - Gradient-based Optimization - Constrained Optimization. Learning Algorithms: Capacity - Overfitting - Underfitting - Bayesian Classification - Supervised and Unsupervised Algorithms - Building a Machine Learning Algorithm.
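The gradient-based optimization topic above can be sketched in a few lines of Python. The one-dimensional objective, learning rate, and step count below are illustrative choices, not part of the syllabus:

```python
# A minimal sketch of gradient-based optimization: gradient descent on the
# one-dimensional objective f(w) = (w - 3)^2, whose minimum lies at w = 3.

def grad(w):
    # derivative of f(w) = (w - 3)**2
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # step against the gradient
    return w

w_star = gradient_descent(w0=0.0)
print(round(w_star, 4))  # converges toward 3.0
```

The same update rule, applied to a loss over network weights rather than a single scalar, is what the "Basic Algorithms" of CO4 generalize.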
CO2 (08 Hrs.) Fundamental Deep Learning Methods: Artificial Neural Networks (ANN): Perceptron, learning laws, layers, backpropagation - the scope of learning, popular architectures, an overview of Parallel Distributed Processing (PDP), linear associative models, and stochastic networks. Convolutional Neural Networks (CNN): convolution, filters, pooling, stride, dropout, layers and applications. Recurrent Neural Networks (RNN): unfolding, Backpropagation Through Time (BPTT), LSTM models, bidirectional networks, encoder, decoder and attention models.
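A minimal sketch of the classic perceptron learning rule (one of the "learning laws" listed above), trained on the linearly separable AND function. The dataset, learning rate, and epoch count are illustrative choices, not from the syllabus:

```python
# Perceptron learning rule on the AND function (linearly separable).

def perceptron_train(samples, lr=1.0, epochs=10):
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = target - y
            # perceptron update: nudge weights toward misclassified targets
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = perceptron_train(AND_DATA)

def predict(x1, x2):
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in AND_DATA])  # [0, 0, 0, 1]
```

Backpropagation, covered next in the course, extends this idea of error-driven weight updates to multi-layer networks via the chain rule.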
CO3 (08 Hrs.) Advanced Deep Learning Methods: Variational Autoencoder (VAE), Deep Autoencoder (DAE), Generative Adversarial Network (GAN), Deep Boltzmann Machines, Deep Neural Network applications for multimedia, sequence and streaming data, Long Short-Term Memory networks (LSTMs), Deep Belief Networks (DBNs) and Multilayer Perceptrons (MLPs).
CO4 (08 Hrs.) Optimization on Deep Models: Optimization for Training Deep Models: Challenges in Neural Network Optimization - Basic Algorithms - Algorithms with Adaptive Learning Rates - Approximate Second-Order Methods - Optimization Strategies and Meta-Algorithms. Convolutional Networks: Motivation - Structured Output - Unsupervised Features - Neuroscientific Basis for Convolutional Networks.
CO5 (08 Hrs.) Recurrent and Recursive Networks: Computational Graphs - Recurrent Neural Networks - Bidirectional RNNs - Deep Recurrent Networks - Echo State Networks. Practical Methodology and Applications: Large-Scale Deep Learning, case studies in classification, regression and deep networks.
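The computational-graph view of recurrent networks mentioned above can be sketched by unrolling a single-unit RNN over a sequence: the same weights are reused at every time step. The weight values below are arbitrary illustrative choices:

```python
import math

# Unrolling a one-unit recurrent network over a short sequence:
# h_t = tanh(w_in * x_t + w_rec * h_{t-1} + b), with shared weights.

def rnn_forward(xs, w_in=0.5, w_rec=0.8, b=0.0):
    """Return the hidden state at each time step."""
    h = 0.0
    states = []
    for x in xs:
        h = math.tanh(w_in * x + w_rec * h + b)
        states.append(h)
    return states

# A single impulse followed by zeros: the first input's influence
# decays through the recurrent weight at each subsequent step.
states = rnn_forward([1.0, 0.0, 0.0])
print(states)
```

Backpropagation Through Time (BPTT) differentiates exactly this unrolled graph, which is why long sequences with |w_rec| < 1 suffer vanishing gradients.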
Guest Lecture on Contemporary Topics (02 Hrs.)
Total Lectures: 40
Mode of Teaching and Learning:
Flipped classroom, activity-based teaching/learning, and digital/computer-based models wherever possible, to augment lectures with practice/tutorial sessions, plus a minimum of 2 hours of lectures by industry experts on contemporary topics.
Mode of Evaluation and Assessment:
The assessment and evaluation components may consist of unannounced open-book examinations, quizzes, student portfolio generation and assessment, and any other innovative assessment practices followed by faculty, in addition to the Continuous Assessment Tests and Term End Examinations.
An indicative list of experiments (the following software experiments can be performed using the Python platform):

• Based on Artificial Neural Network
• Based on Convolutional Neural Network
• Based on Recurrent Neural Network
• Based on Autoencoder
• Based on Generative Adversarial Network
• Based on Deep Belief Network
Text Book(s):
1. Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep Learning, MIT Press, 2016.
2. Michael Nielsen, Neural Networks and Deep Learning, Determination Press, 2015.
Reference Book(s):
 Deng & Yu, Deep Learning: Methods and Applications, Now Publishers, 2013. 2. Russell.
 S and Norvig, N. Artificial Intelligence: A Modern Approach. Prentice-Hall Series in Artificial
Intelligence. 2003.
 Bishop, C. M. Neural Networks for Pattern Recognition. Oxford University Press. 1995.

Recommendation by the Board of Studies on 9.1.2023.


Approval by Academic council on: Yet to be approved
Compiled by: Dr. Pon Harshavardhanan
