Week 1: ARIMA Models, Stationarity & Forecasting
Master ARIMA modeling, LSTM for sequences, anomaly detection algorithms, and forecasting for real business applications.
- Test for and achieve stationarity using differencing and transformations
- Fit ARIMA and SARIMA models with ACF/PACF analysis
- Build LSTM-based sequence models for time series
- Detect anomalies using statistical and ML approaches
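The first objective above, achieving stationarity via differencing, can be sketched in plain NumPy. This is an illustrative toy (the simulated random walk, seed, and the crude half-variance check are all assumptions for demonstration), not the ADF/KPSS procedure covered later in the week:

```python
import numpy as np

rng = np.random.default_rng(42)

# A random walk y_t = y_{t-1} + e_t is non-stationary: its level wanders,
# so statistics computed on different segments of the series disagree.
noise = rng.normal(size=1000)
walk = np.cumsum(noise)

# First differencing recovers the stationary innovations:
# diff(walk) equals the original white-noise increments.
diffed = np.diff(walk)

def half_variance_ratio(x):
    """Crude stability check: ratio of the variances of the two halves."""
    a, b = np.array_split(x, 2)
    return max(a.var(), b.var()) / min(a.var(), b.var())

# Typically much larger for the raw walk than for the differenced series.
print(half_variance_ratio(walk))
print(half_variance_ratio(diffed))
```

Formal tests (ADF, KPSS, available in `statsmodels.tsa.stattools`) replace this eyeball check in practice; the point here is only that differencing removes the wandering level.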
This first lecture establishes the foundational framework for Time Series Analysis. By the end of this session, you will have the conceptual grounding and practical starting point needed for the rest of the course.
Key Concepts
The lecture introduces the four main pillars of this course: (1) stationarity & differencing; (2) ARIMA and its AR, I, and MA components; (3) seasonal decomposition (STL); and (4) LSTMs for sequences & anomaly detection. Each will be explored in depth over the 14-week curriculum, with hands-on projects reinforcing theory at every stage.
This Week's Focus
Focus on mastering stationarity & differencing and the ARIMA components (AR, I, MA). These are the prerequisites for everything in Week 2. The concepts build on each other, so do not skip the practice exercises.
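A central tool for choosing ARIMA orders is the sample autocorrelation function. Below is a minimal hand-rolled ACF (the AR(1) simulation, coefficient 0.8, and seed are illustrative assumptions; in the course you would use the plotting helpers in `statsmodels`):

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation at lags 0..nlags, the quantity shown in ACF plots."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, nlags + 1)])

# Simulate an AR(1) process y_t = 0.8 * y_{t-1} + e_t.
rng = np.random.default_rng(0)
phi, n = 0.8, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

acf = sample_acf(y, 5)
# For AR(1) the theoretical ACF decays geometrically: acf[k] is about phi**k,
# while the PACF cuts off after lag 1 -- the signature used to pick p.
```

The diagnostic logic: a geometrically decaying ACF with a sharp PACF cutoff at lag p suggests AR(p); the mirror pattern (ACF cutoff, PACF decay) suggests MA(q).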
DS304 Project 1: Sales Forecasting System
Build a forecasting system for monthly retail sales data. Compare ARIMA, Prophet, and LSTM approaches. Evaluate with MAPE, RMSE, and directional accuracy.
- Stationarity tests (ADF, KPSS) and transformations
- ARIMA model with ACF/PACF justification
- LSTM sequence model implementation
- Forecast comparison dashboard with confidence intervals
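The three evaluation metrics named above can be sketched as small NumPy functions. The formulas for MAPE and RMSE are standard; the directional-accuracy definition used here (sign agreement of period-over-period changes) is one common convention and is an assumption, as are the toy series:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent (assumes no zeros in actual)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

def rmse(actual, forecast):
    """Root mean squared error, in the units of the series."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.sqrt(np.mean((actual - forecast) ** 2))

def directional_accuracy(actual, forecast):
    """Fraction of periods where the forecast's change has the same sign
    as the actual change -- did we at least call the direction right?"""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.sign(np.diff(actual)) == np.sign(np.diff(forecast)))

# Toy monthly sales and one model's forecasts (hypothetical numbers):
actual   = [100, 110, 105, 120]
forecast = [102, 108, 107, 118]
print(rmse(actual, forecast))                  # 2.0
print(directional_accuracy(actual, forecast))  # 1.0
```

Reporting all three matters because they disagree: a model can have low RMSE yet poor directional accuracy, which is often what the business actually cares about.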
The questions below represent the style and difficulty of what you'll see on the midterm and final. Start thinking about them now.
What does the Augmented Dickey-Fuller test check? What does it mean to fail the test?
In ARIMA(p,d,q), what do the three parameters represent?
Why might LSTM outperform ARIMA for a time series with non-linear seasonal patterns?