Week 1: Vectors, Matrices & Linear Transformations
Explore vectors, matrices, eigenvalues, and their applications — the language that every major ML algorithm is written in.
- Perform matrix multiplication and inversion
- Compute eigenvalues and eigenvectors
- Apply SVD to understand PCA
- Interpret linear transformations geometrically
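The objectives above can all be exercised in a few lines of NumPy. A warm-up sketch — the matrix values here are illustrative, not from the course:

```python
import numpy as np

# A small invertible 2x2 matrix (illustrative values)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Matrix multiplication and inversion
B = A @ A                     # matrix product
A_inv = np.linalg.inv(A)      # inverse; A @ A_inv is the identity
print(np.allclose(A @ A_inv, np.eye(2)))   # True

# Eigenvalues and eigenvectors satisfy A v = lambda v
eigvals, eigvecs = np.linalg.eig(A)
v = eigvecs[:, 0]
print(np.allclose(A @ v, eigvals[0] * v))  # True
```

`np.linalg.eig` returns the eigenvectors as columns of the second array, so `eigvecs[:, i]` pairs with `eigvals[i]`.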
This first lecture establishes the foundational framework for the course. By the end of this session, you will have the conceptual grounding and practical starting point needed for everything that follows.
Key Concepts
The lecture introduces the four main pillars of this course: Vectors & Matrix Operations; Elimination & Row Reduction; Determinants; and Eigenvalues & PCA. Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.
This Week's Focus
Focus on mastering: Vectors & Matrix Operations and Elimination & Row Reduction. These are the prerequisites for everything in Week 2. The concepts build on each other — do not skip the practice exercises.
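Elimination is also what NumPy runs under the hood when it solves a linear system: `np.linalg.solve` calls LAPACK's LU factorization with partial pivoting, which is Gaussian elimination in matrix form. A minimal sketch on an illustrative 3×3 system (the values below are made up for the example):

```python
import numpy as np

# Solve A x = b. np.linalg.solve uses LU factorization with
# partial pivoting -- Gaussian elimination in matrix form.
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

x = np.linalg.solve(A, b)
print(x)                       # the unique solution, since det(A) != 0
print(np.allclose(A @ x, b))   # True
```

Row-reducing the augmented matrix [A | b] by hand gives the same solution; doing at least one system manually before reaching for `solve` is the point of this week's exercises.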
MATH102 Project 1: PCA on Image Data
Implement Principal Component Analysis from scratch using NumPy SVD on the MNIST digits dataset. Visualize explained variance and reconstruct images from different numbers of components.
- NumPy SVD-based PCA implementation
- Explained variance curve (scree plot)
- Image reconstruction at 5, 20, 50, 100 components
- Written explanation of the SVD-PCA connection
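The deliverables above can be sketched end to end in a few lines. This uses synthetic data as a stand-in for MNIST (the array shape and seed are placeholders — swap in the real digit matrix when you implement the project):

```python
import numpy as np

# PCA via SVD, sketched on synthetic data standing in for MNIST
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))      # 200 samples, 64 features (8x8 "images")

X_centered = X - X.mean(axis=0)     # PCA requires mean-centered data
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Explained variance ratio -- this is what the scree plot shows
explained_var = S**2 / np.sum(S**2)

# Reconstruction from the top-k components
k = 20
X_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]
print(explained_var[:k].sum())      # fraction of variance captured by k PCs
```

The SVD–PCA connection in one line: the rows of `Vt` are the principal directions, and `S**2 / (n - 1)` are the variances along them — exactly what diagonalizing the covariance matrix of `X_centered` would give.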
Sample Questions
The following questions represent the style and difficulty of what you'll see on the midterm and final. Start thinking about them now.
- If A is an n×n matrix, what does det(A) = 0 imply about its invertibility?
- Explain geometrically what an eigenvector represents for a transformation matrix.
- Write the NumPy code to compute the SVD of a matrix X and reconstruct it from the top-k components.
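For the last question, one possible answer sketch (the matrix here is random, chosen only to make the snippet runnable):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(6, 4))

U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Rank-k reconstruction from the top-k singular triplets;
# this is the best rank-k approximation in the Frobenius norm
k = 2
X_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# Keeping all components recovers X exactly
X_full = U @ np.diag(S) @ Vt
print(np.allclose(X_full, X))  # True
```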