Week 1: Complexity Analysis, Arrays & Linked Lists
Master fundamental data structures — arrays, trees, graphs, hash tables — and algorithmic complexity analysis for writing efficient, scalable code.
- Analyze algorithmic complexity with Big-O notation
- Implement and use arrays, linked lists, stacks, and queues
- Design recursive algorithms with memoization
- Apply sorting algorithms and understand their trade-offs
This first lecture establishes the foundational framework for Data Structures & Algorithms. By the end of this session, you will have the conceptual grounding and practical starting point needed for the rest of the course.
Key Concepts
The lecture introduces the four main pillars of this course: Big-O Complexity Analysis; Arrays, Linked Lists, Stacks & Queues; Recursion & Dynamic Programming; and Sorting (Merge, Quick, Heap, Radix). Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.
This Week's Focus
Focus on mastering: Big-O Complexity Analysis and Arrays, Linked Lists, Stacks & Queues. These are the prerequisites for everything in Week 2. The concepts build on each other — do not skip the practice exercises.
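The core trade-off this week is access pattern: an array gives O(1) indexing, while a singly linked list must walk node by node. A minimal sketch (the `Node` class and helper names here are our own illustration, not course-provided code):

```python
class Node:
    """A singly linked list node."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def build_linked_list(values):
    """Build a singly linked list from an iterable; return the head node."""
    head = None
    for value in reversed(values):
        head = Node(value, head)
    return head

def linked_list_get(head, index):
    """O(n) access: walk `index` links from the head."""
    node = head
    for _ in range(index):
        node = node.next
    return node.value

data = [10, 20, 30, 40]
head = build_linked_list(data)

assert data[2] == 30                    # array (Python list): O(1) indexing
assert linked_list_get(head, 2) == 30   # linked list: O(n) traversal
```

The flip side of the trade-off: inserting at the front of a linked list is O(1) (rewire one pointer), while inserting at the front of an array is O(n) (shift every element).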
PROG102 Project 1: Algorithm Performance Benchmarking Suite
Implement 5 sorting algorithms from scratch. Build a benchmarking suite that measures empirical runtime vs theoretical complexity on datasets of varying size, shape, and distribution.
- 5 sorting algorithm implementations (merge, quick, heap, radix, timsort)
- Benchmarking harness with matplotlib plots
- Complexity analysis report with empirical validation
- Edge case testing: sorted, reverse, duplicates, random
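To make the project concrete, here is a minimal sketch of one sort plus a timing harness in the spirit of the deliverables above (function names and the best-of-trials design are our choices, not a spec; the matplotlib plotting step is omitted):

```python
import random
import time

def merge_sort(a):
    """Classic O(n log n) merge sort, returning a new sorted list."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

def timed_run(sort_fn, data):
    """Time one run; copy the input so every trial sees identical data."""
    start = time.perf_counter()
    out = sort_fn(list(data))
    elapsed = time.perf_counter() - start
    assert out == sorted(data)  # sanity-check correctness while benchmarking
    return elapsed

def benchmark(sort_fn, sizes, trials=3):
    """Return {size: best-of-`trials` runtime in seconds} on random data."""
    results = {}
    for n in sizes:
        data = [random.random() for _ in range(n)]
        results[n] = min(timed_run(sort_fn, data) for _ in range(trials))
    return results

times = benchmark(merge_sort, [1_000, 2_000, 4_000])
```

Taking the minimum of several trials reduces noise from the OS scheduler; plotting `times` on a log-log scale lets you compare the empirical slope against the theoretical growth rate. Remember the edge-case inputs listed above (sorted, reversed, duplicate-heavy) in addition to random data.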
Practice Questions
The questions below represent the style and difficulty of what you'll see on the midterm and final. Start thinking about them now.
Explain why quicksort has O(n²) worst-case complexity but O(n log n) average complexity.
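One way to build intuition for this question empirically (not a model answer): with a first-element pivot, already-sorted input makes every pivot the minimum, so each partition is maximally unbalanced and the comparison count grows quadratically. A counting sketch:

```python
import random

def quicksort(a, counter):
    """Quicksort with the FIRST element as pivot; counter[0] tallies comparisons."""
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    counter[0] += len(rest)  # each element of `rest` is compared to the pivot
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

random.seed(0)
n = 500
sorted_input = list(range(n))
random_input = random.sample(range(n), n)

c_sorted, c_random = [0], [0]
quicksort(sorted_input, c_sorted)  # every split is 0 vs n-1: O(n^2)
quicksort(random_input, c_random)  # splits are balanced on average: O(n log n)

assert c_sorted[0] == n * (n - 1) // 2  # exactly (n-1) + (n-2) + ... + 1
assert c_random[0] < c_sorted[0] // 5   # far fewer comparisons on random input
```

On sorted input the subproblem sizes are n, n-1, ..., 2, giving n(n-1)/2 comparisons total; on random input the pivot lands near the middle on average, giving O(n log n).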
What is the difference between dynamic programming and memoization?
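A useful frame for this question: memoization is top-down (recurse, cache each result the first time it is computed), while tabulation-style dynamic programming is bottom-up (fill a table in a fixed order, no recursion). Both cut Fibonacci from exponential to linear time; a minimal contrast:

```python
from functools import lru_cache

# Top-down: plain recursion plus a cache of already-computed results.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up DP: build the answer table from the base cases upward.
def fib_table(n):
    if n < 2:
        return n
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]

assert fib_memo(30) == fib_table(30) == 832040
```

Memoization only computes the subproblems the recursion actually reaches; tabulation computes every subproblem up to n but avoids recursion overhead and stack-depth limits.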
Write Python code implementing a min-heap with push() and pop() operations.
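A starting-point sketch for the heap question (one possible layout, not a model answer): store the heap in a Python list where the parent of index `i` is `(i - 1) // 2`, sift up after `push()`, and sift down after `pop()`.

```python
class MinHeap:
    """Array-backed binary min-heap; parent of index i lives at (i - 1) // 2."""

    def __init__(self):
        self._items = []

    def push(self, value):
        """Append, then sift up until the heap property holds: O(log n)."""
        items = self._items
        items.append(value)
        i = len(items) - 1
        while i > 0 and items[(i - 1) // 2] > items[i]:
            parent = (i - 1) // 2
            items[i], items[parent] = items[parent], items[i]
            i = parent

    def pop(self):
        """Remove and return the minimum: move the last item to the root,
        then sift it down: O(log n)."""
        items = self._items
        if not items:
            raise IndexError("pop from empty heap")
        smallest = items[0]
        last = items.pop()
        if items:
            items[0] = last
            i, n = 0, len(items)
            while True:
                child = 2 * i + 1
                if child >= n:
                    break
                # Compare against the smaller of the two children.
                if child + 1 < n and items[child + 1] < items[child]:
                    child += 1
                if items[i] <= items[child]:
                    break
                items[i], items[child] = items[child], items[i]
                i = child
        return smallest

h = MinHeap()
for v in [5, 1, 4, 2, 3]:
    h.push(v)
assert [h.pop() for _ in range(5)] == [1, 2, 3, 4, 5]
```

In practice Python's standard-library `heapq` module provides the same operations on a plain list; implementing the sifts by hand, as the question asks, is what builds the intuition.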