Deep Learning Specialization on Coursera by deeplearning.ai
- Welcome
- What is a neural network?
- Supervised Learning with Neural Networks
- Why is Deep Learning taking off?
- Geoffrey Hinton interview
Learn to set up a machine learning problem with a neural network mindset. Learn to use vectorization to speed up your models (a short vectorized-vs-loop sketch follows the topic list below).
- Binary Classification
- Logistic Regression
- Logistic Regression Cost Function
- Gradient Descent
- Derivatives, More Derivative Examples
- Computation Graph, Derivatives with a Computation Graph
- Logistic Regression Gradient Descent
- Gradient Descent on m Examples
- Vectorization, More Vectorization Examples
- Vectorizing Logistic Regression
- Vectorizing Logistic Regression's Gradient Output
- Broadcasting in Python
- A note on python/numpy vectors
- Explanation of logistic regression cost function
- Pieter Abbeel interview
- Assignment: Logistic Regression with a Neural Network mindset
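To make the vectorization point above concrete, here is a minimal sketch (my own toy code with made-up sizes, not the assignment notebook) comparing a per-example loop with the fully vectorized logistic regression forward pass:

```python
import numpy as np

# Toy comparison (assumed sizes, illustrative only) of looping over examples
# versus vectorizing the forward pass sigmoid(w^T x + b) over all m examples.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n, m = 4, 5                      # n features, m examples (toy sizes)
X = rng.standard_normal((n, m))  # one example per column, as in the course
w = rng.standard_normal((n, 1))
b = 0.0

# Loop version: one example at a time.
a_loop = np.zeros((1, m))
for i in range(m):
    z_i = w[:, 0] @ X[:, i] + b  # scalar dot product for example i
    a_loop[0, i] = sigmoid(z_i)

# Vectorized version: all m examples in a single matrix product.
a_vec = sigmoid(w.T @ X + b)     # shape (1, m)

assert np.allclose(a_loop, a_vec)
```

On realistic sizes the vectorized version is typically orders of magnitude faster, because the per-example loop moves into optimized linear algebra routines.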
Learn to build a neural network with one hidden layer, using forward propagation and backpropagation (a toy forward-pass sketch follows the list below).
- Neural Networks Overview, Neural Network Representation
- Computing a Neural Network's Output
- Vectorizing across multiple examples
- Activation functions
- Why do you need non-linear activation functions?, Derivatives of activation functions
- Gradient descent for Neural Networks
- Backpropagation intuition
- Random Initialization
- Assignment: Planar data classification with one hidden layer
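As a companion to the topics above, here is a toy forward pass for a one-hidden-layer network (illustrative code under assumed sizes, not the Planar data assignment itself), using the course's convention of one example per column:

```python
import numpy as np

# Sketch of the forward pass for a network with one hidden layer:
# tanh units in the hidden layer, a sigmoid output unit.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_x, n_h, n_y, m = 2, 4, 1, 3               # input, hidden, output sizes; m examples

X  = rng.standard_normal((n_x, m))
W1 = rng.standard_normal((n_h, n_x)) * 0.01  # small random init, as in the lectures
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h)) * 0.01
b2 = np.zeros((n_y, 1))

Z1 = W1 @ X + b1        # (n_h, m)
A1 = np.tanh(Z1)        # hidden activations
Z2 = W2 @ A1 + b2       # (n_y, m)
A2 = sigmoid(Z2)        # predicted probabilities, one per example
print(A2.shape)         # (1, 3)
```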
Understand the key computations underlying deep learning, use them to build and train deep neural networks, and apply them to computer vision (a shape-checking sketch follows the list below).
- Deep L-layer neural network
- Forward Propagation in a Deep Network
- Getting your matrix dimensions right
- Why deep representations?
- Building blocks of deep neural networks
- Forward and Backward Propagation
- Parameters vs Hyperparameters
- Assignment: Building your Deep Neural Network: Step by Step
- Assignment: Deep Neural Network for Image Classification: Application
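The "Getting your matrix dimensions right" lecture boils down to a few shape rules: W[l] is (n[l], n[l-1]), b[l] is (n[l], 1), and each layer maps (n[l-1], m) to (n[l], m). The sketch below (toy layer sizes of my choosing, not the assignment code) checks them by running a generic L-layer forward pass:

```python
import numpy as np

# Generic L-layer forward pass, to verify the dimension rules layer by layer.

def relu(z):
    return np.maximum(0, z)

rng = np.random.default_rng(2)
layer_sizes = [5, 4, 3, 1]     # n[0]=5 inputs, two hidden layers, 1 output
m = 6                          # number of examples

params = {}
for l in range(1, len(layer_sizes)):
    params[f"W{l}"] = rng.standard_normal((layer_sizes[l], layer_sizes[l-1])) * 0.01
    params[f"b{l}"] = np.zeros((layer_sizes[l], 1))

A = rng.standard_normal((layer_sizes[0], m))    # A0 = X
for l in range(1, len(layer_sizes)):
    Z = params[f"W{l}"] @ A + params[f"b{l}"]   # (n[l], m)
    A = relu(Z)
    print(f"layer {l}: A has shape {A.shape}")  # (n[l], m) at every layer
```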
- Train / Dev / Test sets
- Bias / Variance
- Basic Recipe for Machine Learning
- Regularization
- Why regularization reduces overfitting?
- Dropout Regularization
- Understanding Dropout
- Other regularization methods
- Normalizing inputs
- Vanishing / Exploding gradients
- Weight Initialization for Deep Networks
- Numerical approximation of gradients
- Gradient checking
- Assignment: Gradient Checking from scratch
- Assignment: Initialization (different types) of a Neural Network from scratch
- Assignment: Regularization (different types) of a Neural Network from scratch
- Mini-batch gradient descent
- Understanding mini-batch gradient descent
- Exponentially weighted averages
- Understanding exponentially weighted averages
- Bias correction in exponentially weighted averages
- Gradient descent with momentum
- RMSprop
- Adam optimization algorithm
- Learning rate decay
- The problem of local optima
- Assignment: Optimization Methods for a Neural Network from scratch
- Tuning process
- Using an appropriate scale to pick hyperparameters
- Hyperparameters tuning in practice: Pandas vs. Caviar
- Normalizing activations in a network
- Fitting Batch Norm into a neural network
- Why does Batch Norm work?
- Batch Norm at test time
- Softmax Regression
- Training a softmax classifier
- Deep learning frameworks
- TensorFlow
- Assignment: Neural network using TensorFlow
- Why ML Strategy
- Orthogonalization
- Single number evaluation metric
- Satisficing and Optimizing metric
- Train/dev/test distributions
- Size of the dev and test sets
- When to change dev/test sets and metrics
- Why human-level performance?
- Avoidable bias
- Understanding human-level performance, Surpassing human-level performance
- Improving your model performance
- Quiz (case study): Bird recognition in the city of Peacetopia
- Carrying out error analysis
- Cleaning up incorrectly labeled data
- Build your first system quickly, then iterate
- Training and testing on different distributions
- Bias and Variance with mismatched data distributions
- Addressing data mismatch
- Transfer learning
- Multi-task learning
- What is end-to-end deep learning?
- Whether to use end-to-end deep learning
- Quiz (case study): Autonomous driving
Learn to implement the foundational layers of CNNs (pooling, convolutions) and to stack them properly in a deep network to solve multi-class image classification problems. A minimal convolution sketch follows the list below.
- Computer Vision
- Edge Detection Example
- More Edge Detection
- Padding
- Strided Convolutions
- Convolutions Over Volume
- One Layer of a Convolutional Network
- Simple Convolutional Network Example
- Pooling Layers
- CNN Example
- Why Convolutions?
- Assignment: Convolutional Neural Networks: Step by Step (from scratch)
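As a minimal illustration of the convolution operation this course builds on (my own toy code, not the notebook's implementation), the sketch below applies the vertical-edge-detection filter from the lectures to a synthetic image, with "valid" padding and stride 1:

```python
import numpy as np

# Single-channel 2D convolution: slide the kernel over the image and take the
# sum of the elementwise products at each position.

def conv2d_single_channel(image, kernel):
    H, W = image.shape
    f, _ = kernel.shape
    out = np.zeros((H - f + 1, W - f + 1))     # "valid" padding, stride 1
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+f, j:j+f] * kernel)
    return out

# Vertical-edge detector from the "Edge Detection Example" lecture.
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)
image = np.zeros((6, 6))
image[:, :3] = 10.0                            # bright left half, dark right half
print(conv2d_single_channel(image, kernel))    # strong response at the edge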
Learn about the practical tricks and methods used in deep CNNs, straight from the research papers (a residual-block sketch follows the list below).
- Classic Networks
- ResNets
- Why ResNets Work?
- Networks in Networks and 1x1 Convolutions
- Inception Network Motivation
- Inception Network
- Using Open-Source Implementation
- Transfer Learning
- Data Augmentation
- State of Computer Vision
- Assignment: Keras tutorial - Emotion Detection in Images of Faces
- Assignment: Residual Networks
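To illustrate why ResNets work, here is a stripped-down sketch of the identity-block idea using dense matrices instead of convolutions (a deliberate simplification of mine; the assignment builds the real blocks in Keras):

```python
import numpy as np

# The shortcut adds the block's input back to its transformed output, so the
# main path only has to learn a residual on top of the identity.

def relu(z):
    return np.maximum(0, z)

rng = np.random.default_rng(3)
n = 8
W1 = rng.standard_normal((n, n)) * 0.01
W2 = rng.standard_normal((n, n)) * 0.01

def identity_block(a):
    shortcut = a                 # save the input
    z = relu(W1 @ a)             # "main path" transformations
    z = W2 @ z
    return relu(z + shortcut)    # add the shortcut before the final activation

a = rng.standard_normal((n, 1))
out = identity_block(a)
# With near-zero weights the block is close to the identity mapping, which is
# part of why very deep ResNets remain easy to optimize.
print(np.allclose(out, relu(a), atol=1e-2))    # True
```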
Learn how to apply your knowledge of CNNs to one of the toughest but hottest fields of computer vision: object detection. An IoU sketch follows the list below.
- Object Localization
- Landmark Detection
- Object Detection
- Convolutional Implementation of Sliding Windows
- Bounding Box Predictions
- Intersection Over Union
- Non-max Suppression
- Anchor Boxes
- YOLO Algorithm
- Assignment: Autonomous driving - Car detection
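Here is a minimal sketch of Intersection over Union (IoU), the overlap measure behind non-max suppression and the car-detection assignment. The (x1, y1, x2, y2) corner convention is an assumption for this toy example:

```python
# IoU: intersection area divided by union area of two axis-aligned boxes.

def iou(box_a, box_b):
    # Corners of the intersection rectangle (empty if boxes don't overlap).
    xi1, yi1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xi2, yi2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, xi2 - xi1) * max(0.0, yi2 - yi1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7 ≈ 0.143
```

Non-max suppression then repeatedly keeps the highest-scoring box and discards any remaining box whose IoU with it exceeds a threshold.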
Discover how CNNs can be applied to multiple fields, including art generation and face recognition. Implement your own algorithm to generate art and recognize faces! A triplet-loss sketch follows the list below.
- What is face recognition?
- One Shot Learning
- Siamese Network
- Triplet Loss
- Face Verification and Binary Classification
- What is neural style transfer?
- What are deep ConvNets learning?
- Cost Function
- Content Cost Function
- Style Cost Function
- 1D and 3D Generalizations
- Assignment: Face Recognition, Face Verification
- Assignment: Deep Learning & Art: Neural Style Transfer
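Here is a toy sketch of the triplet loss from the face recognition lectures (random stand-in embeddings, not real face-encoder outputs): the anchor-positive distance should beat the anchor-negative distance by at least a margin alpha:

```python
import numpy as np

# Triplet loss: max(||a - p||^2 - ||a - n||^2 + alpha, 0), hinged at zero so
# triplets that already satisfy the margin contribute no gradient.

def triplet_loss(anchor, positive, negative, alpha=0.2):
    pos_dist = np.sum((anchor - positive) ** 2)   # squared L2 distance
    neg_dist = np.sum((anchor - negative) ** 2)
    return max(pos_dist - neg_dist + alpha, 0.0)

rng = np.random.default_rng(4)
a, p, n = (rng.standard_normal(128) for _ in range(3))  # 128-d stand-in embeddings
print(triplet_loss(a, p, n))
```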
Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants, including LSTMs, GRUs and Bidirectional RNNs, which you are going to learn about in this section; a basic RNN-cell sketch follows the list below.
- Why sequence models
- Notation
- Recurrent Neural Network Model
- Backpropagation through time
- Different types of RNNs
- Language model and sequence generation
- Sampling novel sequences
- Vanishing gradients with RNNs
- Gated Recurrent Unit (GRU)
- Long Short Term Memory (LSTM)
- Bidirectional RNN
- Deep RNNs
- Assignment: Building your Recurrent Neural Network - Step by Step (from scratch)
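Here is a minimal sketch of the basic RNN cell unrolled over a short sequence, following the lecture formula a&lt;t&gt; = tanh(Waa a&lt;t-1&gt; + Wax x&lt;t&gt; + ba); all sizes are toy values picked for illustration:

```python
import numpy as np

# One RNN cell applied step by step: the same weights are reused at every
# time step, and the hidden state carries information forward.

rng = np.random.default_rng(5)
n_a, n_x, m = 4, 3, 2            # hidden units, input features, batch size

Waa = rng.standard_normal((n_a, n_a)) * 0.01
Wax = rng.standard_normal((n_a, n_x)) * 0.01
ba  = np.zeros((n_a, 1))

a = np.zeros((n_a, m))           # initial hidden state a<0>
xs = [rng.standard_normal((n_x, m)) for _ in range(3)]   # a length-3 sequence

for x_t in xs:
    a = np.tanh(Waa @ a + Wax @ x_t + ba)   # a<t> from a<t-1> and x<t>
print(a.shape)                   # (n_a, m): one hidden state per example
```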
Natural language processing with deep learning is an important combination. Using word vector representations and embedding layers, you can train recurrent neural networks with outstanding performance in a wide variety of industries. Example applications are sentiment analysis, named entity recognition, and machine translation. A word-analogy sketch follows the list below.
- Word Representation
- Using word embeddings
- Properties of word embeddings
- Embedding matrix
- Learning word embeddings
- Word2Vec
- Negative Sampling
- GloVe word vectors
- Sentiment Classification
- Debiasing word embeddings
- Assignment: Operations on word vectors
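Here is a toy sketch of the word-vector operations covered above: with a small, entirely fabricated embedding table, the analogy man : woman :: king : ? reduces to vector arithmetic plus cosine similarity (the assignment uses pretrained GloVe vectors instead):

```python
import numpy as np

# Fabricated 4-d embeddings, chosen only so the analogy works out; real word
# vectors are learned from large corpora.
emb = {
    "man":   np.array([ 1.0, 0.0, 0.2, 0.1]),
    "woman": np.array([-1.0, 0.0, 0.2, 0.1]),
    "king":  np.array([ 1.0, 0.9, 0.1, 0.0]),
    "queen": np.array([-1.0, 0.9, 0.1, 0.0]),
}

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# e_king - e_man + e_woman should land near e_queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w not in ("king", "man", "woman")),
           key=lambda w: cosine(emb[w], target))
print(best)   # queen
```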
Sequence models can be augmented using an attention mechanism. This algorithm will help your model understand where it should focus its attention given a sequence of inputs (a minimal attention sketch follows the list below).
- Basic Models
- Picking the most likely sentence
- Beam Search
- Refinements to Beam Search
- Error analysis in beam search
- Bleu Score (optional)
- Attention Model Intuition
- Attention Model
- Speech recognition
- Trigger Word Detection
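Here is a minimal sketch of the attention idea described above (dot-product scoring is a simplification chosen here; the lectures compute the scores with a small learned network): score each encoder state against the decoder state, softmax the scores into weights, and take the weighted sum as the context vector:

```python
import numpy as np

# Attention in three lines: scores -> softmax weights -> weighted sum.

rng = np.random.default_rng(6)
T, n = 5, 8                          # input sequence length, hidden size
encoder_states = rng.standard_normal((T, n))   # one state per input position
decoder_state  = rng.standard_normal(n)        # current decoder hidden state

scores = encoder_states @ decoder_state            # one score per input position
alphas = np.exp(scores) / np.sum(np.exp(scores))   # softmax: weights sum to 1
context = alphas @ encoder_states                  # (n,) weighted sum of states

print(alphas.round(3), alphas.sum())   # the attention distribution over inputs
```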