Week 1: Probability Spaces, Random Variables & Bayesian Inference
Deep dive into probability: random variables, joint distributions, Bayesian inference, and stochastic processes used throughout ML.
- Define probability spaces, events, and axioms formally
- Work with joint, marginal, and conditional distributions
- Apply Bayes' theorem to posterior inference (see the statement after this list)
- Understand the law of large numbers and the Central Limit Theorem (CLT)
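As a reference for the Bayes objective above, here is a standard statement of the theorem for a discrete parameter θ and observed data D (the notation is conventional, not necessarily the one used in lecture):

```latex
\underbrace{P(\theta \mid D)}_{\text{posterior}}
  = \frac{\overbrace{P(D \mid \theta)}^{\text{likelihood}}\;
          \overbrace{P(\theta)}^{\text{prior}}}
         {\underbrace{P(D)}_{\text{evidence}}},
\qquad
P(D) = \sum_{\theta'} P(D \mid \theta')\, P(\theta').
```

The evidence P(D) normalizes the numerator so the posterior sums to one; conjugate priors (as in this week's project) let you skip computing it explicitly because the posterior then has a closed form.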
This first lecture establishes the foundational framework for Probability Theory. By the end of this session, you will have the conceptual grounding and practical starting point needed for the rest of the course.
Key Concepts
The lecture introduces the four main pillars of this course: Probability Axioms & Spaces, Conditional Probability & Bayes, Joint & Marginal Distributions, Limit Theorems & CLT. Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.
This Week's Focus
Focus on mastering Probability Axioms & Spaces and Conditional Probability & Bayes; these are the prerequisites for everything in Week 2. The concepts build on each other, so do not skip the practice exercises.
STAT201 Project 1: Bayesian Inference Simulation
Implement a Bayesian updating simulation for a coin-flip experiment. Visualize prior, likelihood, and posterior distributions as evidence accumulates. A starting-point sketch follows the deliverables list below.
- Bayesian update implementation in Python
- Animated visualization of prior→posterior shift
- Comparison of 3 different prior distributions
- Written explanation of the Bayesian workflow
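A minimal sketch of the core update, assuming a conjugate Beta prior on the coin's bias. The true bias, flip count, and Beta(2, 2) prior are illustrative choices, not part of the assignment spec:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Hypothetical setup: true bias, number of flips, and prior are all
# illustrative choices, not requirements of the assignment.
rng = np.random.default_rng(0)
true_p = 0.7
flips = (rng.random(50) < true_p).astype(int)

# Conjugate Beta-Binomial update: a Beta(a, b) prior combined with a
# Bernoulli likelihood yields a Beta(a + heads, b + tails) posterior.
a_prior, b_prior = 2.0, 2.0
heads = flips.sum()
tails = len(flips) - heads
a_post, b_post = a_prior + heads, b_prior + tails

grid = np.linspace(0, 1, 500)
plt.plot(grid, stats.beta.pdf(grid, a_prior, b_prior), label="prior Beta(2, 2)")
plt.plot(grid, stats.beta.pdf(grid, a_post, b_post),
         label=f"posterior after {len(flips)} flips")
plt.axvline(true_p, linestyle="--", color="gray", label="true bias")
plt.xlabel("p (probability of heads)")
plt.ylabel("density")
plt.legend()
plt.show()
```

For the animated deliverable, the same update can be replayed one flip at a time and redrawn with matplotlib.animation.FuncAnimation; swapping in different (a, b) values covers the three-prior comparison.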
The following questions represent the style and difficulty of what you'll see on the midterm and final. Start thinking about them now.
- State Bayes' theorem and explain each term (prior, likelihood, posterior, evidence).
- What does the Central Limit Theorem guarantee, and why is it important in practice?
- Two fair dice are rolled. What is the probability that the sum equals 7, given that the first die shows 3?