Getting Started

Below is the course content outlined under the parent section “Deep Learning Course Content”.

Course Content


Unit 1: Foundations & Applied Math (8 Hours)

  • Introduction and History: Motivation for Deep Learning; Historical trends; Success stories.
  • Linear Algebra & Probability: Tensors, Eigendecomposition, Information Theory, and Numerical Optimization.
  • Bayesian Decision Theory: Making optimal decisions under uncertainty, inference vs. decision, and loss functions for classification/regression.
  • Machine Learning Basics: Capacity, Overfitting/Underfitting, Hyperparameters, and the Bias-Variance tradeoff.
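
As a taste of the Bayesian decision theory topic above, here is a minimal sketch of choosing the action that minimizes expected loss under a posterior. The spam-filter scenario, loss values, and function name are illustrative assumptions, not part of the syllabus.

```python
# Illustrative sketch: Bayes-optimal decision under a posterior distribution.
# All names and numbers below are hypothetical.

def bayes_optimal_action(posterior, loss):
    """posterior: dict class -> P(class | x).
    loss: dict (action, true_class) -> cost of that action when true_class holds.
    Returns the action with minimal expected loss."""
    actions = {a for (a, _) in loss}
    def expected_loss(a):
        return sum(posterior[c] * loss[(a, c)] for c in posterior)
    return min(actions, key=expected_loss)

# Example: misclassifying ham as spam costs 10x more than the reverse.
posterior = {"spam": 0.7, "ham": 0.3}
loss = {
    ("mark_spam", "spam"): 0.0, ("mark_spam", "ham"): 10.0,
    ("keep", "spam"): 1.0,      ("keep", "ham"): 0.0,
}
print(bayes_optimal_action(posterior, loss))  # -> keep (0.7 < 3.0 expected loss)
```

Even though P(spam) = 0.7, the asymmetric loss makes "keep" the optimal decision, which is the inference-vs-decision distinction the unit covers.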

Unit 2: Deep Networks & Training Optimization (12 Hours)

  • Deep Feedforward Networks: Multilayer Perceptrons (MLP); Gradient-Based Learning; Backpropagation and the Chain Rule.
  • Modern Regularization: L1/L2 penalties, Dropout, Early Stopping, and Dataset Augmentation.
  • Optimization & Normalization: SGD, Momentum, Adam Optimizer; Batch Normalization and Layer Normalization.
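
The optimizer topics above can be previewed with a minimal SGD-with-momentum sketch on a 1-D quadratic; the learning rate, momentum coefficient, and function names are illustrative assumptions.

```python
# Illustrative sketch: SGD with momentum on f(w) = (w - 3)^2,
# whose gradient is f'(w) = 2 * (w - 3). Hyperparameters are hypothetical.

def grad(w):
    return 2.0 * (w - 3.0)

def sgd_momentum(w0, lr=0.1, beta=0.9, steps=200):
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v + grad(w)   # decaying running sum of gradients ("velocity")
        w = w - lr * v           # step along the velocity, not the raw gradient
    return w

print(sgd_momentum(0.0))  # converges toward the minimum at w = 3
```

Momentum damps oscillation across steep directions and accelerates progress along shallow ones; Adam extends this idea with a per-parameter adaptive step size.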

Unit 3: Convolutional Networks & Computer Vision (10 Hours)

  • The Convolution Operation: Motivation, Pooling, and the Neuroscientific basis for CNNs.
  • Modern Vision Architectures: Residual Networks (ResNets), Inception, and Deep CNN variants.
  • Advanced Vision Tasks: Object Detection (YOLO/SSD), Semantic Segmentation, and the U-Net architecture.
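
To make the convolution-and-pooling topic concrete, here is a plain-Python sketch of the 2-D “convolution” used in CNNs (technically cross-correlation) followed by 2x2 max pooling; the edge-detector kernel and image are illustrative.

```python
# Illustrative sketch: 2-D cross-correlation (the CNN "convolution") + max pooling.

def conv2d(img, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1   # valid padding
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(ow)] for i in range(oh)]

def max_pool2x2(fmap):
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

# A vertical-edge detector on a 4x4 image whose right half is bright.
img = [[0, 0, 9, 9]] * 4
edge = [[-1, 1]]                   # responds where intensity jumps left-to-right
fmap = conv2d(img, edge)           # 4x3 feature map, peaking at the edge
print(max_pool2x2(fmap))           # -> [[9], [9]]
```

The pooled output keeps the strong edge response while shrinking the feature map, illustrating the translation-robustness motivation for pooling.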

Unit 4: Sequence Modeling & The Attention Revolution (10 Hours)

  • Recurrent Neural Networks: RNNs, the Vanishing Gradient problem, and Gated Units (LSTM and GRU).
  • The Attention Mechanism: Self-Attention, Multi-head Attention, and the “Attention is All You Need” paradigm.
  • The Transformer Blueprint: Encoder-Decoder architecture, Positional Encoding, and scaling to Large Language Models (LLMs).
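
The attention mechanism above can be sketched directly from its formula, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, for a single head; the toy matrices below are illustrative assumptions.

```python
import math

# Illustrative sketch: scaled dot-product attention for one head.

def softmax(xs):
    m = max(xs)                          # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    d_k = len(K[0])
    out = []
    for q in Q:                          # one output row per query position
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)        # attention distribution over positions
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Two positions, d_k = 2: each query matches one key most strongly.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(Q, K, V)
print([[round(x, 2) for x in row] for row in out])
```

Each output row is a weighted mixture of the value vectors, weighted by query-key similarity; multi-head attention simply runs several such maps in parallel and concatenates them.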

Unit 5: Frontiers: Generative & Graph Models (8 Hours)

  • Autoencoders & Latent Spaces: Undercomplete autoencoders and Representation Learning.
  • Generative AI: Variational Autoencoders (VAEs) and Diffusion Models.
  • Graph Neural Networks: Message Passing, Node Embeddings, and Graph Convolutional Networks (GCNs).
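
Message passing, the core idea behind GCNs listed above, can be sketched as a single propagation step where each node averages its own feature with its neighbors' (a simplified, weight-free GCN-style update; the graph and names are illustrative).

```python
# Illustrative sketch: one message-passing step on a graph.
# Each node's new embedding = mean of its own and its neighbors' features.

def message_pass(features, adj):
    """features: dict node -> feature vector; adj: dict node -> neighbor list."""
    new = {}
    for node, feat in features.items():
        msgs = [features[n] for n in adj[node]] + [feat]   # include self-loop
        new[node] = [sum(col) / len(msgs) for col in zip(*msgs)]
    return new

# A 3-node path graph A - B - C with 1-D features.
features = {"A": [0.0], "B": [1.0], "C": [2.0]}
adj = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
print(message_pass(features, adj))  # -> {'A': [0.5], 'B': [1.0], 'C': [1.5]}
```

Stacking such steps lets information flow across longer paths in the graph; a real GCN additionally applies a learned linear map and nonlinearity at each step.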

Notation