🎓 University of America — Course Portal
📊 Data Science · Week 1 of 14 · BSc Y1 S1 · ⏱ ~50 min

Week 1: Vectors, Matrices & Linear Transformations

Explore vectors, matrices, eigenvalues, and their applications — the language that every major ML algorithm is written in.

🎬 MATH102 — Lecture 1 · CC-licensed lecture video (MIT OpenCourseWare, CC BY-NC-SA)
🎯 Learning Objectives
  • Perform matrix multiplication and inversion
  • Compute eigenvalues and eigenvectors
  • Apply SVD to understand PCA
  • Interpret linear transformations geometrically
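The first two objectives can be previewed in a few lines of NumPy. This is an illustrative sketch with made-up matrices, not course code:

```python
import numpy as np

# Matrix multiplication and inversion (objective 1)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
product = A @ B            # matrix product (order matters: A @ B != B @ A in general)
A_inv = np.linalg.inv(A)   # inverse exists because det(A) = 5 != 0

# Eigenvalues and eigenvectors (objective 2)
eigvals, eigvecs = np.linalg.eig(A)

print(product)
print(A @ A_inv)   # numerically the 2x2 identity
print(eigvals)     # (5 ± sqrt(5)) / 2 for this A
```

Note that `A @ A_inv` recovering the identity is exactly what "inverse" means; the eigenvalue computation here foreshadows the PCA material later in the course.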
Topics Covered This Lecture
Vectors & Matrix Operations
Elimination & Row Reduction
Determinants
Eigenvalues & PCA
📖 Lecture Overview

This first lecture establishes the foundational framework for Linear Algebra. By the end of this session, you will have the conceptual grounding and practical starting point needed for the rest of the course.

Why this matters: linear algebra is the language that every major ML algorithm is written in. This lecture sets up everything that follows — make sure you understand the core concepts before proceeding to Week 2.

Key Concepts

The lecture introduces the four main pillars of this course: Vectors & Matrix Operations, Elimination & Row Reduction, Determinants, Eigenvalues & PCA. Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.

# Quick Start: verify your environment is ready for MATH102
import sys
print(f"Python {sys.version}")

# Check key libraries are installed
try:
    import numpy, pandas, matplotlib
    print("✅ Core libraries ready")
except ImportError as e:
    print(f"❌ Missing: {e} — run: pip install numpy pandas matplotlib")

This Week's Focus

Focus on mastering: Vectors & Matrix Operations and Elimination & Row Reduction. These are the prerequisites for everything in Week 2. The concepts build on each other — do not skip the practice exercises.
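To make elimination concrete, here is a sketch of forward elimination and back substitution on a standard 3×3 textbook system (the matrix and right-hand side are illustrative, not from the course materials):

```python
import numpy as np

# Illustrative system Ax = b with known solution x = [2, 3, -1]
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

# Forward elimination: reduce the augmented matrix [A | b] to upper-triangular form
M = np.hstack([A, b[:, None]])
n = len(b)
for col in range(n - 1):
    for row in range(col + 1, n):
        factor = M[row, col] / M[col, col]   # assumes a nonzero pivot (no row swaps here)
        M[row] -= factor * M[col]

# Back substitution on the upper-triangular system
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = (M[i, n] - M[i, i + 1:n] @ x[i + 1:n]) / M[i, i]

print(x)                      # -> [ 2.  3. -1.]
print(np.linalg.solve(A, b))  # NumPy's built-in solver agrees
```

In practice you would call `np.linalg.solve` directly (it also handles pivoting), but writing elimination out once makes the row-reduction lectures much easier to follow.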

📋 Project 1 of 3 · Projects count for 50% of the final grade

MATH102 Project 1: PCA on Image Data

Implement Principal Component Analysis from scratch using NumPy SVD on the MNIST digits dataset. Visualize explained variance and reconstruct images from different numbers of components.

  • NumPy SVD-based PCA implementation
  • Explained variance curve (scree plot)
  • Image reconstruction at 5, 20, 50, 100 components
  • Written explanation of the SVD-PCA connection
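The core of the project — centering, SVD, explained variance, and rank-k reconstruction — can be sketched as follows. A random matrix stands in for MNIST here, and all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for MNIST: 200 synthetic "images" flattened to 64 features each
X = rng.normal(size=(200, 64))

# PCA via SVD: center the data, decompose, read variance off the singular values
mean = X.mean(axis=0)
X_centered = X - mean
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
explained_variance = S**2 / (len(X) - 1)
explained_ratio = explained_variance / explained_variance.sum()  # scree-plot values

# Reconstruction from the top-k components (the project asks for k = 5, 20, 50, 100)
k = 5
X_k = (U[:, :k] * S[:k]) @ Vt[:k] + mean

print(explained_ratio[:k])
print(X_k.shape)  # same shape as X
```

The rows of `Vt` are the principal directions, and keeping only the first k of them is exactly the rank-k truncation the deliverables ask you to visualize.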
Grade Breakdown
  • 3 Projects — 50%
  • Midterm Exam — 20%
  • Final Exam — 30%
📝 Sample Exam Questions

These represent the style and difficulty of questions you'll see on the midterm and final. Start thinking about them now.

Conceptual Short Answer

If A is an n×n matrix, what does det(A)=0 imply about its invertibility?

Analysis Short Answer

Explain geometrically what an eigenvector represents for a transformation matrix.

Applied Code / Proof

Write the NumPy code to compute SVD of matrix X and reconstruct it with the top-k components.
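One possible answer sketch for the applied question (the test matrix and variable names are illustrative; an exam answer would only need the three SVD/reconstruction lines):

```python
import numpy as np

# Illustrative data matrix X
X = np.random.default_rng(1).normal(size=(6, 4))

# SVD of X; full_matrices=False gives the compact (economy) form
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Rank-k approximation from the top-k singular triples
k = 2
X_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k]

# Reconstruction error shrinks as k grows; at k = rank(X) it is numerically zero
print(np.linalg.norm(X - X_k))
```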