Week 1: Neural Networks, Backpropagation & The Deep Learning Revolution
Build neural networks from scratch, master CNNs, RNNs, and Transformers, and train deep models using modern techniques in PyTorch.
- Implement a neural network forward and backward pass from scratch (see the sketch after this list)
- Understand and apply batch normalization and dropout
- Build and train a CNN for image classification
- Apply transfer learning to a real problem
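To preview the first objective, here is a minimal sketch of a from-scratch forward and backward pass for a one-hidden-layer MLP in plain NumPy. The layer sizes, learning rate, and toy data are illustrative assumptions, not values specified by the course.

```python
import numpy as np

# Minimal one-hidden-layer MLP trained with a manual backward pass.
# All sizes and hyperparameters below are illustrative placeholders.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 3

W1 = rng.normal(0, 0.1, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out))
b2 = np.zeros(n_out)

def forward(X):
    # Linear -> ReLU -> Linear -> softmax
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0.0)                 # ReLU
    z2 = a1 @ W2 + b2
    z2 = z2 - z2.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(z2) / np.exp(z2).sum(axis=1, keepdims=True)
    return z1, a1, p

def backward(X, y, z1, a1, p):
    # Gradients of the mean cross-entropy loss, via the chain rule.
    n = X.shape[0]
    dz2 = p.copy()
    dz2[np.arange(n), y] -= 1.0              # dL/dz2 = softmax - one_hot(y)
    dz2 /= n
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0)
    da1 = dz2 @ W2.T
    dz1 = da1 * (z1 > 0)                     # ReLU gradient mask
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    return dW1, db1, dW2, db2

# One gradient-descent step on a toy batch.
X = rng.normal(size=(16, n_in))
y = rng.integers(0, n_out, size=16)
z1, a1, p = forward(X)
dW1, db1, dW2, db2 = backward(X, y, z1, a1, p)
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```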
This first lecture establishes the foundations of deep learning. By the end of the session, you will have the conceptual grounding and a practical starting point for the rest of the course.
Key Concepts
The lecture introduces the four main pillars of this course:
- Perceptrons & MLP from Scratch
- Backpropagation Mathematics
- CNNs: Convolution & Pooling
- Regularization: Dropout & BatchNorm
Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.
This Week's Focus
Focus on mastering Perceptrons & MLP from Scratch and Backpropagation Mathematics; these are the prerequisites for everything in Week 2. The concepts build on each other, so do not skip the practice exercises.
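To make the backpropagation mathematics concrete, here is a minimal worked derivation for a single linear layer; the notation is ours, not taken from the lecture slides.

```latex
% Backpropagation through one linear layer z = Wx + b.
% Given the upstream gradient \partial L / \partial z, the chain rule gives:
\[
z = Wx + b, \qquad
\frac{\partial L}{\partial W} = \frac{\partial L}{\partial z}\, x^{\top}, \qquad
\frac{\partial L}{\partial b} = \frac{\partial L}{\partial z}, \qquad
\frac{\partial L}{\partial x} = W^{\top}\, \frac{\partial L}{\partial z}.
\]
```

Stacking these per-layer rules from the output back to the input is exactly what the from-scratch exercise asks you to implement.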
DS301 Project 1: Image Classifier from Scratch
Train a convolutional neural network on CIFAR-10 from scratch in PyTorch. Achieve >80% validation accuracy using techniques taught in this course.
- PyTorch CNN implementation
- Training loop with validation and early stopping (see the sketch after this list)
- Ablation study: architecture choices vs accuracy
- Confusion matrix and error analysis
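For the training-loop deliverable, here is a minimal sketch of validation-driven early stopping in PyTorch. The `model`, `train_loader`, and `val_loader` objects, the patience value, and the checkpoint file name are placeholder assumptions to adapt to your own project.

```python
import torch

# Sketch of a training loop with a validation pass and early stopping.
# Hyperparameters are illustrative defaults, not prescribed values.
def train(model, train_loader, val_loader, epochs=50, patience=5, lr=1e-3):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = torch.nn.CrossEntropyLoss()
    best_acc, epochs_without_improvement = 0.0, 0

    for epoch in range(epochs):
        model.train()
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()

        # Validation pass: accuracy on held-out data.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in val_loader:
                x, y = x.to(device), y.to(device)
                preds = model(x).argmax(dim=1)
                correct += (preds == y).sum().item()
                total += y.numel()
        val_acc = correct / total

        # Early stopping: halt when validation accuracy stops improving.
        if val_acc > best_acc:
            best_acc, epochs_without_improvement = val_acc, 0
            torch.save(model.state_dict(), "best_model.pt")  # checkpoint
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break
    return best_acc
```

Checkpointing on the best validation accuracy (rather than the final epoch) is what lets you recover the strongest model after stopping.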
Practice Questions
These questions represent the style and difficulty of what you'll see on the midterm and final. Start thinking about them now.
- Explain the vanishing gradient problem and describe two techniques that mitigate it.
- What does batch normalization do mathematically, and why does it help training?
- Write PyTorch code defining a Conv2D block with batch normalization, ReLU, and max pooling.
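For reference on the second question, recall that batch normalization computes y = γ(x − μ_B)/√(σ²_B + ε) + β using the mean and variance of each mini-batch. For the third, a block of the kind asked for might be sketched as follows; the channel counts and kernel sizes are illustrative, not a model answer.

```python
import torch.nn as nn

# Conv -> BatchNorm -> ReLU -> MaxPool block; channel counts are placeholders.
conv_block = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),
    nn.ReLU(inplace=True),
    nn.MaxPool2d(kernel_size=2, stride=2),
)
```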