🎓 University of America — Course Portal
📊 DS301 Data Science · Week 1 of 14 · BSc · Y3 S1 · ⏱ ~50 min

Week 1: Neural Networks, Backpropagation & The Deep Learning Revolution

Build neural networks from scratch, master CNNs, RNNs, and Transformers, and train deep models using modern techniques in PyTorch.

🎬 Lecture video: DS301 — Lecture 1 · MIT OpenCourseWare (CC BY-NC-SA)
🎯 Learning Objectives
  • Implement a neural network forward and backward pass from scratch
  • Understand and apply batch normalization and dropout
  • Build and train a CNN for image classification
  • Apply transfer learning to a real problem
Topics Covered This Lecture
Perceptrons & MLP from Scratch
Backpropagation Mathematics
CNNs: Convolution & Pooling
Regularization: Dropout & BatchNorm
📖 Lecture Overview

This first lecture establishes the foundational framework for Deep Learning. By the end of this session, you will have the conceptual grounding and practical starting point needed for the rest of the course.

Why this matters: This lecture sets up everything that follows in the 14-week sequence, from building networks from scratch through CNNs, RNNs, and Transformers in PyTorch. Make sure you understand the core concepts before proceeding to Week 2.

Key Concepts

The lecture introduces the four main pillars of this course: Perceptrons & MLP from Scratch, Backpropagation Mathematics, CNNs: Convolution & Pooling, Regularization: Dropout & BatchNorm. Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.

# Quick Start: verify your environment is ready for DS301
import sys
print(f"Python {sys.version}")

# Check key libraries are installed
try:
    import numpy, pandas, matplotlib
    print("✅ Core libraries ready")
except ImportError as e:
    print(f"❌ Missing: {e} — run: pip install numpy pandas matplotlib")

This Week's Focus

Focus on mastering: Perceptrons & MLP from Scratch and Backpropagation Mathematics. These are the prerequisites for everything in Week 2. The concepts build on each other — do not skip the practice exercises.
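To make "from scratch" concrete, here is a minimal NumPy sketch of this week's two focus topics: a forward pass through a small MLP and the matching backward pass via the chain rule. The XOR-style toy data, layer sizes, and learning rate are illustrative choices, not course-mandated values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer MLP: 2 inputs -> 4 hidden (ReLU) -> 1 output (sigmoid)
W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)

# XOR-style toy data: not linearly separable, so the hidden layer matters
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([[0.], [1.], [1.], [0.]])

lr, losses = 0.5, []
for _ in range(2000):
    # Forward pass
    z1 = X @ W1 + b1
    h = np.maximum(z1, 0.0)                    # ReLU
    y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    losses.append(float(-np.mean(t * np.log(y + 1e-9)
                                 + (1 - t) * np.log(1 - y + 1e-9))))
    # Backward pass (chain rule); BCE + sigmoid gives dL/dz2 = y - t
    dz2 = (y - t) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)              # ReLU gradient mask
    dW1, db1 = X.T @ dz1, dz1.sum(0)
    # Gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The key pattern to internalize is the symmetry: every forward operation (matmul, ReLU, sigmoid) contributes one factor to the backward product.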

📋 Project 1 of 3 · Projects: 50% of Final Grade

DS301 Project 1: Image Classifier from Scratch

Train a convolutional neural network on CIFAR-10 from scratch in PyTorch. Achieve >80% validation accuracy using techniques taught in this course.

  • PyTorch CNN implementation
  • Training loop with validation and early stopping
  • Ablation study: architecture choices vs accuracy
  • Confusion matrix and error analysis
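The "early stopping" deliverable above can be framework-agnostic. A minimal sketch of the stopping logic, where `val_losses` stands in for the per-epoch validation losses your real training loop would compute (the function name and `patience` default are illustrative):

```python
# Early-stopping skeleton: stop when validation loss has not improved
# for `patience` consecutive epochs, keeping the best checkpoint seen.
def train_with_early_stopping(val_losses, patience=3):
    """Return (best_epoch, best_loss) for a sequence of validation losses."""
    best_loss = float("inf")
    best_epoch = 0
    waited = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss, best_epoch = loss, epoch
            waited = 0            # improvement: reset the patience counter
        else:
            waited += 1
            if waited >= patience:
                break             # no improvement for `patience` epochs
    return best_epoch, best_loss

# Example: loss improves, then plateaus, so training stops after 3 flat epochs
print(train_with_early_stopping([1.0, 0.8, 0.7, 0.71, 0.72, 0.73, 0.74]))
# (2, 0.7)
```

In the real project the same counter wraps your PyTorch epoch loop, and "keeping the best checkpoint" means saving model weights at each improvement.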
Grade Breakdown
  • 3 Projects: 50%
  • Midterm Exam: 20%
  • Final Exam: 30%
📝 Sample Exam Questions

These represent the style and difficulty of questions you'll see on the midterm and final. Start thinking about them now.

Conceptual Short Answer

Explain the vanishing gradient problem and describe two techniques that mitigate it.
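One way to build intuition for this question numerically: through a chain of sigmoid layers, the backward signal contains a product of derivatives, and each sigmoid derivative is at most 0.25, so the product shrinks geometrically with depth. This toy calculation (not a trained network) shows the effect:

```python
import numpy as np

def sigmoid_deriv(z):
    # sigma'(z) = sigma(z) * (1 - sigma(z)), with maximum 0.25 at z = 0
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

z = 0.0   # best case: sigma'(0) = 0.25
for depth in (1, 5, 10, 20):
    grad = sigmoid_deriv(z) ** depth   # chain rule: product over layers
    print(f"depth {depth:2d}: gradient factor ~ {grad:.2e}")
```

Even in this best case the factor at depth 20 is below 1e-12, which is why techniques such as ReLU activations and residual connections are the usual answers to the second half of the question.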

Analysis Short Answer

What does batch normalization do mathematically, and why does it help training?
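As a study aid, the training-mode computation behind this question is short enough to write out directly: per feature, subtract the batch mean, divide by the batch standard deviation (with a small eps for stability), then apply the learnable scale gamma and shift beta. A NumPy sketch (batch size and feature count are arbitrary):

```python
import numpy as np

# Batch normalization, training mode, per feature over the batch:
#   x_hat = (x - mean) / sqrt(var + eps),   y = gamma * x_hat + beta
def batch_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(5.0, 3.0, size=(32, 4))          # batch of 32, 4 features
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0).round(6), y.std(axis=0).round(3))  # ~0 mean, ~1 std
```

With gamma = 1 and beta = 0 the output has (approximately) zero mean and unit variance per feature, which is the normalization half of the exam answer; the "why it helps" half concerns better-conditioned activations and gradients during training.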

Applied Code / Proof

Write PyTorch code defining a Conv2D block with batch normalization, ReLU, and max pooling.
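One shape a full-credit answer might take, using the standard torch.nn modules; the kernel size, padding, and channel counts here are illustrative choices, not the only acceptable ones:

```python
import torch
import torch.nn as nn

# Sketch: Conv2d -> BatchNorm2d -> ReLU -> MaxPool2d
def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),       # BN's beta makes the conv bias redundant
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=2),  # halves the spatial resolution
    )

block = conv_block(3, 16)
x = torch.randn(8, 3, 32, 32)         # e.g. a batch of CIFAR-10-sized images
print(block(x).shape)                 # torch.Size([8, 16, 16, 16])
```

Setting `bias=False` on the convolution is a common idiom when batch normalization follows immediately, since BN's learnable shift absorbs any bias.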