🎓 University of Aliens — Course Portal
🤖 Artificial Intelligence · Week 1 of 14 · BSc Y2 · ⏱ ~50 min

Week 1: Perceptrons, Backpropagation & Learning Dynamics

Understand the building blocks of neural networks from first principles: perceptrons, backpropagation mathematics, activation functions, and optimization dynamics.

University of Aliens
AI203 — Lecture 1 · BSc Y2
🎬 CC Licensed Lecture
📺 MIT OpenCourseWare (CC BY-NC-SA)
🎯 Learning Objectives
  • Derive the backpropagation algorithm from chain rule calculus
  • Implement a 2-layer network in pure NumPy
  • Understand the role of different activation functions
  • Debug common neural network training problems
Topics Covered This Lecture
Perceptron Learning Algorithm
MLP: Forward & Backward Pass
Activation Functions: ReLU, Sigmoid, Tanh
Optimizer Dynamics: SGD, Momentum, Adam
📖 Lecture Overview

This first lecture establishes the foundational framework for Neural Networks Fundamentals. By the end of this session, you will have the conceptual grounding and practical starting point needed for the rest of the course.

Why this matters: Perceptrons, backpropagation, activation functions, and optimization dynamics are the building blocks that everything else in this course rests on. This lecture sets up everything that follows, so make sure you understand the core concepts before proceeding to Week 2.

Key Concepts

The lecture introduces the four main pillars of this course: Perceptron Learning Algorithm, MLP: Forward & Backward Pass, Activation Functions: ReLU, Sigmoid, Tanh, Optimizer Dynamics: SGD, Momentum, Adam. Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.
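The first pillar, the perceptron learning rule, fits in a few lines of NumPy. A minimal sketch (the toy data, labels in {-1, +1}, and function name are illustrative, not the course's reference code):

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Classic perceptron rule: update weights only on misclassified points.
    Labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified (or on the boundary)
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data: AND function with +/-1 labels
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)
```

Because the data is linearly separable, the perceptron convergence theorem guarantees this loop stops updating after finitely many mistakes.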

# Quick Start: verify your environment is ready for AI203
import sys
print(f"Python {sys.version}")

# Check key libraries are installed
try:
    import numpy, pandas, matplotlib
    print("✅ Core libraries ready")
except ImportError as e:
    print(f"❌ Missing: {e} — run: pip install numpy pandas matplotlib")

This Week's Focus

Focus on mastering the Perceptron Learning Algorithm and the MLP forward and backward pass. These are the prerequisites for everything in Week 2. The concepts build on each other, so do not skip the practice exercises.
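This week's two core ideas can be sketched together: one forward pass through a 2-layer network and the matching backward pass via the chain rule. All shapes, names, and the MSE loss below are illustrative assumptions, not the project's required architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer MLP: input -> hidden (ReLU) -> output (identity), MSE loss
n_in, n_hid, n_out, n_batch = 3, 4, 2, 5
W1 = rng.normal(0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_hid, n_out)); b2 = np.zeros(n_out)
X = rng.normal(size=(n_batch, n_in))
Y = rng.normal(size=(n_batch, n_out))

# Forward pass
Z1 = X @ W1 + b1
H = np.maximum(0.0, Z1)          # ReLU
Yhat = H @ W2 + b2
loss = 0.5 * np.mean(np.sum((Yhat - Y) ** 2, axis=1))

# Backward pass: chain rule applied layer by layer, output to input
dYhat = (Yhat - Y) / n_batch     # dL/dYhat for the mean-squared error
dW2 = H.T @ dYhat
db2 = dYhat.sum(axis=0)
dH = dYhat @ W2.T
dZ1 = dH * (Z1 > 0)              # ReLU passes gradient only where Z1 > 0
dW1 = X.T @ dZ1
db1 = dZ1.sum(axis=0)

# One SGD step
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Note the pattern: each layer's weight gradient is (input to that layer)ᵀ times the gradient arriving from above. That pattern is what generalizes to arbitrary depth in Project 1.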

📋 Project 1 of 3 · Projects count for 50% of Final Grade

AI203 Project 1: Neural Network from Pure NumPy

Build a full multi-layer neural network using only NumPy: forward pass, backpropagation, mini-batch SGD. Train on MNIST and achieve >95% accuracy. No PyTorch or TensorFlow allowed.

  • NumPy neural network (2+ layers, configurable)
  • Backpropagation implementation with gradient checking
  • Training curves: loss and accuracy vs epoch
  • Architecture ablation study (width, depth, activation)
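For the gradient-checking deliverable, the standard technique is a central finite difference compared against your analytic gradient. A sketch on a simple linear-regression loss (the helper name numerical_grad and all data here are hypothetical, for illustration only):

```python
import numpy as np

def numerical_grad(f, w, eps=1e-6):
    """Central-difference estimate of the gradient of scalar f at w."""
    g = np.zeros_like(w)
    for idx in np.ndindex(*w.shape):
        old = w[idx]
        w[idx] = old + eps; fp = f(w)
        w[idx] = old - eps; fm = f(w)
        w[idx] = old                      # restore the parameter
        g[idx] = (fp - fm) / (2 * eps)
    return g

# Check the analytic gradient of a linear-layer MSE loss
rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))
y = rng.normal(size=(8,))
w = rng.normal(size=(3,))

loss = lambda w: 0.5 * np.mean((X @ w - y) ** 2)
analytic = X.T @ (X @ w - y) / len(y)     # hand-derived gradient
numeric = numerical_grad(loss, w.copy())
rel_err = np.linalg.norm(analytic - numeric) / (
    np.linalg.norm(analytic) + np.linalg.norm(numeric))
```

A relative error below about 1e-6 is the usual sign that a backprop implementation is correct; errors near 1e-2 almost always mean a bug.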
Grading Breakdown
  • 3 Projects: 50%
  • Midterm Exam: 20%
  • Final Exam: 30%
📝 Sample Exam Questions

These represent the style and difficulty of questions you'll see on the midterm and final. Start thinking about them now.

Conceptual Short Answer

Derive ∂L/∂W for a fully connected layer using the chain rule. Show all intermediate steps.
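One common way this derivation is structured, as a sketch using the row-major convention Z = XW + b (lecture notation may differ):

```latex
\text{Layer: } Z = XW + b, \qquad Z_{ij} = \sum_{k} X_{ik} W_{kj} + b_j \\[4pt]
\text{Local derivative: } \frac{\partial Z_{ij'}}{\partial W_{kj}} = X_{ik}\,\delta_{j'j} \\[4pt]
\text{Chain rule over every element of } Z \text{ that } W_{kj} \text{ touches:} \\
\frac{\partial L}{\partial W_{kj}}
  = \sum_{i,j'} \frac{\partial L}{\partial Z_{ij'}}\,\frac{\partial Z_{ij'}}{\partial W_{kj}}
  = \sum_{i} \frac{\partial L}{\partial Z_{ij}}\, X_{ik} \\[4pt]
\text{In matrix form: } \frac{\partial L}{\partial W} = X^{\top}\,\frac{\partial L}{\partial Z},
\qquad \frac{\partial L}{\partial b} = \sum_{i} \frac{\partial L}{\partial Z_{i,:}}
```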

Analysis Short Answer

Why does the sigmoid activation function cause vanishing gradients in deep networks?
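A quick numeric way to see this: σ'(z) = σ(z)(1 − σ(z)) never exceeds 0.25, so each sigmoid layer in a chain multiplies the backpropagated gradient by at most 0.25. A sketch of the best-case decay:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)   # maximum value 0.25, attained at z = 0

# Gradient signal through a stack of sigmoid layers shrinks at least
# geometrically: best case is a factor of 0.25 per layer (at z = 0).
z = 0.0
signal = 1.0
depths = {}
for depth in range(1, 21):
    signal *= sigmoid_grad(z)
    depths[depth] = signal
```

Even in this best case, 10 layers leave a factor of 0.25¹⁰ ≈ 1e-6 of the original gradient, which is why early layers in deep sigmoid networks barely learn.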

Applied Code / Proof

Compare SGD, SGD+momentum, and Adam. What problem does each optimizer address?
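To start thinking about this question, the three update rules can be written side by side. This is a single-parameter sketch with illustrative hyperparameters, not a full training loop:

```python
import numpy as np

def sgd_step(w, g, lr=0.1):
    # Plain SGD: step against the raw gradient
    return w - lr * g

def momentum_step(w, g, v, lr=0.1, beta=0.9):
    # Momentum: velocity accumulates consistent gradient directions,
    # damping oscillation across steep, narrow valleys
    v = beta * v + g
    return w - lr * v, v

def adam_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: per-parameter step size from running gradient moments
    m = b1 * m + (1 - b1) * g            # first moment (mean)
    v = b2 * v + (1 - b2) * g * g        # second moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)            # bias correction for zero init
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize f(w) = w^2 (gradient 2w) with each optimizer
w_sgd = w_mom = w_adam = 5.0
v_mom = m_a = v_a = 0.0
for t in range(1, 101):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd)
    w_mom, v_mom = momentum_step(w_mom, 2 * w_mom, v_mom)
    w_adam, m_a, v_a = adam_step(w_adam, 2 * w_adam, m_a, v_a, t)
```

Roughly: momentum addresses slow progress along consistent directions and oscillation across curved ones; Adam additionally addresses gradients whose scale varies wildly between parameters by normalizing each step.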