Deep Learning Exam Notes

The document outlines key applications of deep learning in Natural Language Processing, including machine translation and sentiment analysis. It compares various optimization methods for deep learning, highlighting their pros and cons. Additionally, it discusses deep network intuition, variants of RNN architectures, and methods for bootstrapping and cross-validation.

Topics covered

  • Data Splitting
  • Training Models
  • Bidirectional RNN
  • Deep Learning
  • Sentiment Analysis
  • RMSProp
  • Performance Evaluation
  • Recurrent Neural Networks
  • Gradient Updates
  • Data Resampling

Deep Learning Exam Preparation Notes (8-Mark Questions)

1. Applications of Deep Learning in Natural Language Processing (NLP)

Key Applications:

- Machine Translation - Translating text between languages (e.g., English to French) using Seq2Seq and Transformer models.

- Text Summarization - Creating concise summaries of long texts using abstractive models.

- Sentiment Analysis - Classifying the emotional tone of text (positive/negative/neutral).

- Speech Recognition - Converting speech to text using RNNs, LSTMs, and attention models.

- Chatbots and Virtual Assistants - Contextual responses using deep conversational models (e.g., Siri, Alexa).

- Question Answering - Extracting precise answers using models like BERT.

- Named Entity Recognition (NER) - Identifying names, places, dates in text.

2. Comparison of Optimization Methods for Deep Learning

SGD: Updates weights with the raw gradient each step. Pros: simple, low memory. Cons: slow convergence, sensitive to the learning rate.

Momentum: Accumulates an exponential average of past gradients to smooth updates. Pros: faster convergence, damps oscillations. Cons: extra hyperparameter to tune.

RMSProp: Divides each update by a running average of squared gradients. Pros: effective for RNNs and non-stationary objectives. Cons: may generalize poorly.

Adam: Combines Momentum and RMSProp with bias correction. Pros: fast, adaptive per-parameter rates. Cons: more memory use.

Adagrad: Per-parameter learning rates scaled by accumulated squared gradients. Pros: good for sparse data. Cons: effective learning rate decays too fast, so learning can stall.
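The update rules above can be sketched on a toy problem. This is an illustrative comparison on f(w) = w^2 (gradient 2w) starting from w = 5.0; the hyperparameter values are assumptions, not recommendations.

```python
from math import sqrt

def grad(w):
    return 2.0 * w          # gradient of f(w) = w^2

def sgd(w, lr=0.1, steps=300):
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum(w, lr=0.1, beta=0.9, steps=300):
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)          # accumulate past gradients
        w -= lr * v
    return w

def rmsprop(w, lr=0.1, beta=0.9, eps=1e-8, steps=300):
    s = 0.0
    for _ in range(steps):
        g = grad(w)
        s = beta * s + (1 - beta) * g * g   # running avg of squared grads
        w -= lr * g / (sqrt(s) + eps)
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=300):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g           # Momentum term
        v = b2 * v + (1 - b2) * g * g       # RMSProp term
        m_hat = m / (1 - b1 ** t)           # bias correction
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (sqrt(v_hat) + eps)
    return w

for opt in (sgd, momentum, rmsprop, adam):
    print(f"{opt.__name__}: |w| = {abs(opt(5.0)):.4f}")
```

Note that RMSProp and Adam take near-constant-size steps close to the optimum (the normalized gradient approaches ±1), which is why a decaying learning rate is used in practice.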

3. Step-by-Step Intuition Building for Deep Networks

- Understand Single Neuron - Basics of perceptron and logistic regression.

- Multi-Layer Perceptrons (MLPs) - Non-linear decision boundaries.

- Activation Functions - Introduce non-linearity (ReLU, Sigmoid, Tanh).

- Forward Propagation - Input flows to generate output.


- Backpropagation - Learning via gradient calculation.

- Convolution Layers - Image feature extraction.

- Recurrent Layers - Sequence learning.

- Regularization Techniques - Dropout, BatchNorm.

- Loss Functions - Guide learning (Cross-Entropy, MSE).

- Practice - Apply in real-world tasks.
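The forward-propagation and backpropagation steps above can be combined into one minimal example. This is a sketch, not a production implementation: a tiny 2-4-1 MLP with sigmoid activations and MSE loss trained on XOR, with illustrative learning rate and iteration count.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward propagation: input flows through hidden layer to output
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backpropagation: chain rule on MSE loss and sigmoid derivatives
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(np.round(out.ravel(), 2))   # should approach [0, 1, 1, 0]
```

XOR is the classic example of a problem a single neuron cannot solve: the hidden layer's non-linearity is what makes the decision boundary non-linear.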

4. Variants of Recurrent Neural Network Architectures

- Standard RNN: Short-term memory, vanishing gradients.

- LSTM: Long-term memory via gating.

- GRU: Simpler LSTM, fast training.

- Bidirectional RNN: Processes sequence forward & backward.

- Deep RNN: Stacked layers for complex sequences.

- Attention-based RNNs: Focus on important input parts.
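The gating difference between a standard RNN and a GRU can be shown with single-step cells. This is a minimal numpy sketch with assumed dimensions (input size 3, hidden size 4) and random weights; a real implementation would learn the weights.

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, d_h = 3, 4

def init(shape):
    return rng.normal(0, 0.1, shape)

# Vanilla RNN cell: h_t = tanh(W x_t + U h_{t-1} + b)
W, U, b = init((d_h, d_in)), init((d_h, d_h)), np.zeros(d_h)
def rnn_step(x, h):
    return np.tanh(W @ x + U @ h + b)

# GRU cell: update gate z and reset gate r control how much of the
# previous state is kept vs overwritten, easing vanishing gradients.
Wz, Uz = init((d_h, d_in)), init((d_h, d_h))
Wr, Ur = init((d_h, d_in)), init((d_h, d_h))
Wh, Uh = init((d_h, d_in)), init((d_h, d_h))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h):
    z = sigmoid(Wz @ x + Uz @ h)            # update gate
    r = sigmoid(Wr @ x + Ur @ h)            # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h)) # candidate state
    return (1 - z) * h + z * h_cand         # interpolate old and new

# Run both cells over a short random sequence.
xs = rng.normal(size=(5, d_in))
h_rnn = np.zeros(d_h)
h_gru = np.zeros(d_h)
for x in xs:
    h_rnn = rnn_step(x, h_rnn)
    h_gru = gru_step(x, h_gru)
print(h_rnn.shape, h_gru.shape)
```

The LSTM adds a separate cell state and three gates (input, forget, output); the GRU merges these into two gates, which is why it trains faster with similar accuracy on many tasks.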

5. Bootstrapping and Cross-Validation

Bootstrapping:

- Resample data with replacement.

- Train models on bootstrapped sets.

- Evaluate on original or out-of-bag data.
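The bootstrapping steps above can be sketched as follows; this illustrative example estimates the standard error of a sample mean on synthetic data (the sample size and resample count are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=200)   # synthetic sample

# Resample with replacement and recompute the statistic each time.
n_boot = 1000
boot_means = []
for _ in range(n_boot):
    idx = rng.integers(0, len(data), size=len(data))  # with replacement
    boot_means.append(data[idx].mean())
    # Points never drawn (out-of-bag) could be used for evaluation:
    # oob = np.setdiff1d(np.arange(len(data)), idx)

se = float(np.std(boot_means))
print(round(se, 3))   # close to data.std() / sqrt(len(data))
```

The spread of the bootstrapped statistics is the uncertainty estimate; each resample leaves out roughly 37% of the points, which form the out-of-bag evaluation set.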

Cross-Validation (k-Fold):

- Split data into k parts.

- Train on k-1 and validate on the kth.

- Repeat k times and average results.
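The k-fold procedure above can be sketched in a few lines. This is an assumed toy setup: a one-parameter least-squares model on synthetic linear data, scored by test-fold MSE.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=100)
y = 3.0 * X + rng.normal(scale=0.1, size=100)   # synthetic linear data

k = 5
idx = rng.permutation(len(X))        # shuffle before splitting
folds = np.array_split(idx, k)       # split data into k parts

scores = []
for i in range(k):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    # Train on k-1 folds: closed-form least-squares slope.
    w = (X[train_idx] @ y[train_idx]) / (X[train_idx] @ X[train_idx])
    # Validate on the held-out fold.
    mse = float(np.mean((w * X[test_idx] - y[test_idx]) ** 2))
    scores.append(mse)

print(round(float(np.mean(scores)), 4))   # average test MSE over k folds
```

Every point is used for validation exactly once, so the averaged score is a less noisy estimate of generalization error than a single train/test split.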


Comparison:

- Bootstrapping is mainly used to estimate the uncertainty (variance) of a statistic or model.

- Cross-validation is mainly used to estimate generalization performance for model selection.
