4. Classical Time Series Models
After mastering the data preparation techniques in previous chapters, we now turn to the core of time series forecasting. This chapter introduces the two dominant families of univariate forecasting models: Exponential Smoothing, which predicts future values from weighted averages of past observations, and ARIMA, which models the autocorrelation structure of the data.
In this chapter, you will learn about the following topics:
Time Series Smoothing: Before diving into complex models, it is essential to understand how to filter noise from data to reveal underlying signals. This section covers fundamental smoothing techniques, such as Simple Moving Averages (SMA), explaining how window size selection affects the balance between responsiveness to recent changes and stability against noise.
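As a small preview, the sketch below contrasts a short and a long moving-average window on a synthetic noisy series; the window sizes (5 and 21) are purely illustrative choices, not recommendations.

```python
# A minimal sketch of simple moving averages on a synthetic noisy series;
# the window sizes are illustrative assumptions, not prescriptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
idx = pd.date_range("2020-01-01", periods=120, freq="D")
signal = np.sin(np.linspace(0, 6 * np.pi, 120))            # underlying pattern
series = pd.Series(signal + rng.normal(0, 0.4, 120), index=idx)

# Smaller windows react faster to recent changes; larger windows suppress more noise.
sma_short = series.rolling(window=5).mean()
sma_long = series.rolling(window=21).mean()

print(pd.DataFrame({"raw": series, "SMA(5)": sma_short, "SMA(21)": sma_long}).tail())
```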
Foundations of Exponential Smoothing: This section introduces the family of methods that assign exponentially decreasing weights to past observations. You will explore Simple Exponential Smoothing (SES) for data with no clear trend and Holt’s Linear Trend Method for data with a consistent direction. The section emphasizes the role of smoothing parameters (\(\alpha\) and \(\beta\)) in controlling how the model adapts to new information.
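The following minimal sketch, using the SimpleExpSmoothing and Holt classes from statsmodels on synthetic data, shows where \(\alpha\) and \(\beta\) enter the fitting call; the parameter values and the series are assumptions made for illustration.

```python
# A hedged sketch of SES and Holt's linear trend method with statsmodels;
# the synthetic series and the fixed alpha/beta values are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing, Holt

idx = pd.date_range("2022-01-01", periods=36, freq="MS")
level_only = pd.Series(50 + np.random.default_rng(0).normal(0, 2, 36), index=idx)
trending = pd.Series(50 + 1.5 * np.arange(36) + np.random.default_rng(1).normal(0, 2, 36), index=idx)

# SES: a single smoothing parameter alpha controls how quickly the level adapts.
ses_fit = SimpleExpSmoothing(level_only, initialization_method="heuristic").fit(
    smoothing_level=0.3, optimized=False
)

# Holt's method: alpha smooths the level, beta smooths the trend component.
holt_fit = Holt(trending, initialization_method="heuristic").fit(
    smoothing_level=0.5, smoothing_trend=0.2, optimized=False
)

print(ses_fit.forecast(6))
print(holt_fit.forecast(6))
```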
Advanced Exponential Smoothing: Building on the foundations, this section addresses seasonality using the Holt-Winters Method (Triple Exponential Smoothing). You will learn to distinguish between additive and multiplicative seasonality and effectively use the ETS (Error, Trend, Seasonal) framework to generate prediction intervals and statistically select the best model structure.
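As a preview of that framework, the sketch below fits an ETS(A, A, M) specification with statsmodels' ETSModel on a synthetic monthly series and requests prediction intervals; both the data and the chosen specification are assumptions for illustration.

```python
# A minimal sketch of Holt-Winters via the ETS framework in statsmodels;
# the synthetic monthly series and the (A, A, M) specification are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.exponential_smoothing.ets import ETSModel

idx = pd.date_range("2018-01-01", periods=60, freq="MS")
trend = 100 + 2 * np.arange(60)
season = 1 + 0.2 * np.sin(2 * np.pi * np.arange(60) / 12)   # multiplicative cycle
y = pd.Series(trend * season, index=idx)

# Additive error, additive trend, multiplicative seasonality: ETS(A, A, M).
model = ETSModel(y, error="add", trend="add", seasonal="mul", seasonal_periods=12)
fit = model.fit(disp=False)

# Prediction intervals come directly from the fitted ETS state space model.
pred = fit.get_prediction(start=y.index[-1], end=y.index[-1] + pd.DateOffset(months=12))
print(pred.summary_frame(alpha=0.05).head())
print("AIC:", fit.aic)   # information criteria support model selection across ETS variants
```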
ARIMA Models: Moving to correlation-based modeling, this section breaks down the theoretical building blocks of ARIMA: AutoRegression (AR) for momentum, Moving Average (MA) for shock absorption, and Integration (I) for stability. You will learn the systematic workflow for identifying model orders (\(p, d, q\)) using diagnostic tools like ACF and PACF plots.
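The brief sketch below simulates an AR(2) process with statsmodels and prints the kind of ACF/PACF evidence that the identification workflow relies on; the coefficients are illustrative choices.

```python
# A hedged sketch of order identification on a simulated AR(2) process;
# the AR coefficients and the number of lags inspected are illustrative.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

# Simulate y_t = 0.6*y_{t-1} + 0.3*y_{t-2} + e_t (stationary, so d = 0).
ar = np.array([1, -0.6, -0.3])   # AR polynomial uses the opposite sign convention
ma = np.array([1])
np.random.seed(0)
y = ArmaProcess(ar, ma).generate_sample(nsample=500)

# The ACF tails off gradually while the PACF cuts off after lag 2,
# which points to an AR(2) structure, i.e. p = 2 and q = 0.
print("ACF :", np.round(acf(y, nlags=5), 2))
print("PACF:", np.round(pacf(y, nlags=5), 2))
```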
ARIMA Models in Practice: Theory meets reality in this section, where you will apply ARIMA models to real datasets such as Sunspot activity and the Consumer Price Index (CPI). It highlights the critical step of differencing to transform non-stationary data into stationary series, ensuring valid statistical inference and reliable forecasts.
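A minimal preview of that differencing workflow is sketched below on a synthetic, CPI-like random walk with drift (a stand-in, not the chapter's actual data); the ADF test and the \(d = 1\) choice are shown purely for illustration.

```python
# A hedged sketch of the stationarity check and differencing step;
# the synthetic random walk with drift and the d = 1 choice are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
idx = pd.date_range("2000-01-01", periods=240, freq="MS")
y = pd.Series(100 + np.cumsum(0.3 + rng.normal(0, 0.5, 240)), index=idx)  # non-stationary level

# The ADF null hypothesis is a unit root: a high p-value means "not stationary".
print("ADF p-value (levels):   ", adfuller(y)[1])
print("ADF p-value (1st diff): ", adfuller(y.diff().dropna())[1])

# Letting the I term handle the differencing: ARIMA(1, 1, 1) is fit on the raw levels.
fit = ARIMA(y, order=(1, 1, 1)).fit()
print("AIC:", fit.aic)
print(fit.forecast(steps=6))
```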
SARIMA: Many real-world series, from airline passengers to retail sales, exhibit complex recurring cycles. This section extends the ARIMA framework to Seasonal ARIMA (SARIMA), explaining how to use seasonal differencing (\(D\)) and seasonal lag polynomials to capture these periodic patterns explicitly.
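As a preview, the sketch below fits a seasonal model via statsmodels' SARIMAX on a synthetic monthly series; the \((1, 1, 1)\times(1, 1, 1)_{12}\) orders are illustrative assumptions, not a recommendation.

```python
# A minimal SARIMA sketch with statsmodels' SARIMAX; the synthetic monthly
# series and the chosen orders are assumptions made for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
trend = 0.5 * np.arange(96)
season = 10 * np.sin(2 * np.pi * np.arange(96) / 12)
y = pd.Series(100 + trend + season + rng.normal(0, 2, 96), index=idx)

# d = 1 handles the trend; D = 1 with m = 12 applies one round of seasonal
# differencing, and the seasonal AR/MA terms model the remaining lag-12 correlation.
model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
print("AIC:", fit.aic)
print(fit.forecast(steps=12))
```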
Auto-ARIMA: While manual model selection offers control, it can be time-consuming. This section introduces automated model selection algorithms that efficiently search the parameter space to minimize information criteria (like AIC). You will learn the advantages of automation for large-scale forecasting tasks and how to validate these automated models against manual baselines.
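For a quick preview, the sketch below runs a stepwise search with pmdarima's auto_arima on a synthetic seasonal series, assuming the pmdarima package is installed; the search settings are illustrative defaults rather than a prescription.

```python
# A hedged sketch of automated order selection with pmdarima's auto_arima;
# the synthetic series and the search settings are illustrative assumptions.
import numpy as np
import pandas as pd
import pmdarima as pm

rng = np.random.default_rng(11)
idx = pd.date_range("2016-01-01", periods=96, freq="MS")
y = pd.Series(100 + 0.4 * np.arange(96)
              + 8 * np.sin(2 * np.pi * np.arange(96) / 12)
              + rng.normal(0, 2, 96), index=idx)

# Stepwise search over (p, d, q)(P, D, Q, m), keeping the model with the lowest AIC.
auto_model = pm.auto_arima(
    y, seasonal=True, m=12,
    information_criterion="aic",
    stepwise=True, suppress_warnings=True, trace=False,
)
print("Selected orders:", auto_model.order, auto_model.seasonal_order)
print(auto_model.predict(n_periods=12))
```

The selected orders from an automated search like this one are a useful baseline, but they should still be validated against a manually specified model and out-of-sample checks, as discussed in the section itself.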