
Master Stationary Time Series & ARMA Models: Theory, Examples, Python Code

This article covers the fundamentals of weakly stationary time series; defines the mean, variance, autocovariance, and autocorrelation functions; introduces the AR, MA, ARMA, and ARIMA models; discusses model identification with the ACF/PACF, order-selection criteria such as AIC/SBC, and diagnostic testing; and provides Python statsmodels code examples.


Stationary Time Series: Basic Concepts and Theory

Stationarity here refers to weak (wide‑sense) stationarity, meaning the mean and covariance of the stochastic process do not change with time shifts.

Mean, Variance, and Standard Deviation Functions

Definition 1: For a random process X(t), the mean function μ(t)=E[X(t)] and the variance function σ²(t)=Var[X(t)] are defined as functions of time. The square root of the variance function is the standard‑deviation function, measuring deviation from the mean.

Autocovariance and Autocorrelation Functions

Definition 2: For a random process X(t) and a fixed lag τ, the autocovariance function γ(τ)=Cov[X(t),X(t+τ)] measures the linear dependence between values separated by τ. Normalising by the variance gives the autocorrelation function ρ(τ)=γ(τ)/γ(0).

Stationary Random Sequences

Definition 3: A random sequence {X_t} is stationary if (1) its mean is constant and (2) its autocovariance function depends only on the lag, not on t.
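
Both conditions can be checked empirically on a simulated sequence. The sketch below uses white noise as a stand-in and an illustrative helper, `sample_autocov`, to estimate the mean and the autocovariance at two lags:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0, 1, 1000)  # simulated stationary (white-noise) sequence

def sample_autocov(x, lag):
    """Sample autocovariance at a given lag: mean of (x_t - x̄)(x_{t+lag} - x̄)."""
    xc = x - x.mean()
    return np.mean(xc[: len(x) - lag] * xc[lag:])

mean_hat = x.mean()            # estimate of the (constant) mean
gamma0 = sample_autocov(x, 0)  # lag-0 autocovariance, i.e. the variance
gamma1 = sample_autocov(x, 1)  # near zero here, since the sequence is white noise
```

For a stationary sequence, repeating this on different sub-samples should give similar values at each lag.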

White‑Noise Stationary Sequence

Definition 4: A stationary sequence whose autocovariance is zero for any non‑zero lag is a white‑noise sequence; its variance is constant.

Random Linear Sequence

Definition 5: If ε_t is zero‑mean stationary white noise and {a_k} is a coefficient sequence, the series X_t = Σ a_k ε_{t‑k} is a random linear (or Green’s‑function) sequence and is stationary.
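
As a hedged sketch of such a sequence, the convolution below uses illustrative, geometrically decaying coefficients a_k = 0.5^k truncated at 20 terms (the coefficients and lengths are assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
eps = rng.normal(0, 1, 500)            # zero-mean stationary white noise
a = 0.5 ** np.arange(20)               # illustrative, absolutely summable coefficients a_k
x = np.convolve(eps, a, mode="valid")  # truncated linear sequence X_t = sum_k a_k * eps_{t-k}
# The resulting series is stationary, with variance approximately sum_k a_k^2.
```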

Partial Autocorrelation Function

Definition 6: For a zero‑mean stationary series, the partial autocorrelation function (PACF) measures the direct correlation between X_t and X_{t‑k} after removing the indirect effects of intermediate lags.

ARMA Model

ARMA time‑series models are of three types:

AR (Auto‑Regressive) model

MA (Moving‑Average) model

ARMA (Auto‑Regressive Moving‑Average) model

AR(p) Sequence

Let {X_t} be a zero‑mean stationary series satisfying X_t = Σ_{i=1}^p φ_i X_{t‑i} + ε_t, where ε_t is zero‑mean stationary white noise. φ = (φ_1,…,φ_p) is the AR‑parameter vector.
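
As a minimal sketch (the coefficient φ = 0.7 and the sample size are illustrative), an AR(1) series can be simulated by direct recursion:

```python
import numpy as np

rng = np.random.default_rng(2)
phi = 0.7                    # illustrative AR(1) coefficient, |phi| < 1 so the series is stationary
n = 1000
eps = rng.normal(0, 1, n)    # zero-mean white noise
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]   # X_t = phi * X_{t-1} + eps_t

# For AR(1), the theoretical lag-1 autocorrelation equals phi.
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]
```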

MA(q) Sequence

Let {X_t} be a zero‑mean stationary series satisfying X_t = Σ_{j=0}^q θ_j ε_{t‑j} with θ_0 = 1, where ε_t is zero‑mean stationary white noise and θ = (θ_1,…,θ_q) is the MA‑parameter vector.

ARMA(p,q) Sequence

Combining the AR(p) and MA(q) specifications yields X_t = Σ_{i=1}^p φ_i X_{t‑i} + Σ_{j=0}^q θ_j ε_{t‑j} (with θ_0 = 1). The model is stationary when the roots of the AR characteristic polynomial lie outside the unit circle, and invertible when the MA roots do.
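
The stationarity condition can be checked numerically. The sketch below uses illustrative AR(2) coefficients (φ_1 = 0.5, φ_2 = 0.3) and numpy's polynomial roots:

```python
import numpy as np

# Illustrative AR(2): X_t = 0.5 X_{t-1} + 0.3 X_{t-2} + eps_t
# Characteristic polynomial phi(z) = 1 - 0.5 z - 0.3 z^2, coefficients in ascending order:
roots = np.polynomial.polynomial.polyroots([1.0, -0.5, -0.3])

# Stationarity requires every root to lie outside the unit circle.
stationary = bool(np.all(np.abs(roots) > 1))
```

The same check applied to the MA polynomial 1 + θ_1 z + … + θ_q z^q determines invertibility.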

Model Identification and Order Selection

Identification uses the sample autocorrelation function (ACF) and partial autocorrelation function (PACF). For an AR(p) series, the PACF cuts off after lag p while the ACF tails off; for an MA(q) series, the ACF cuts off after lag q while the PACF tails off.

Typical identification rules, read from ACF/PACF plots:

ACF tails, PACF cuts off at lag 1 → AR(1)

ACF cuts off at lag 1, PACF tails → MA(1)

ACF tails, PACF cuts off at lag 2 → AR(2)

AIC and SBC Model Selection

When several candidate models fit, the Akaike Information Criterion (AIC) and Schwarz Bayesian Criterion (SBC) balance goodness‑of‑fit against model complexity: lower values indicate a preferred model.

Diagnostic Checking of Stationary Models

White‑Noise Test for Residuals

After fitting, residuals should behave as white noise. The Ljung‑Box Q‑statistic tests for remaining autocorrelation; failure to reject the null hypothesis indicates an adequate model.

Parameter Estimation for ARMA Models

Methods include the method of moments, least squares, conditional least squares, and maximum likelihood. The following Python snippet first generates sample data with numpy; later sections fit models to it with statsmodels:

<code>import numpy as np

np.random.seed(0)  # fix the seed so results are reproducible
data = np.random.normal(5, 10, 100)  # simulated sample: mean 5, std 10, n = 100
</code>

Plotting ACF and PACF:

<code>from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
import matplotlib.pyplot as plt

plot_acf(data)
plot_pacf(data)
plt.show()
</code>

Non‑Stationary Time Series

Non‑stationary series can be rendered stationary by differencing. First‑order differencing removes linear trends; higher‑order differencing handles curvature; seasonal differencing removes periodic components.
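
A minimal sketch of first-order and seasonal differencing with numpy (the trend slope and seasonal period below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(100)
y = 2.0 * t + rng.normal(0, 1, 100)  # linear trend + noise: non-stationary in the mean

d1 = np.diff(y)          # first difference: removes the linear trend, mean ~ slope (2.0)
d12 = y[12:] - y[:-12]   # seasonal difference at an illustrative period of 12
```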

ARIMA Model

After differencing d times, an ARMA model can be applied to the differenced series, yielding the ARIMA(p,d,q) framework. Special cases: d = 0 recovers ARMA(p,q), and p = 0 with d ≥ 1 gives an IMA(d,q) model.

Statsmodels Implementation of ARMA Models

Example using simulated data:

<code>import numpy as np

np.random.seed(0)  # reproducible simulated sample
data = np.random.normal(5, 10, 100)
</code>

Fit an AR(1) model:

<code>from statsmodels.tsa.arima.model import ARIMA

model = ARIMA(data, order=(1, 0, 0))  # AR(1): order = (p, d, q)
model_fit = model.fit()
yhat = model_fit.predict()  # in-sample one-step-ahead predictions
</code>

Similar code applies for MA(1), ARMA(2,1), and ARIMA models.

AIC‑Based Order Selection

Loop over candidate orders and select the one with minimum AIC:

<code>from statsmodels.tsa.arima.model import ARIMA

for p in range(1, 7):  # candidate AR orders p = 1..6
    model = ARIMA(data, order=(p, 0, 0))
    model_fit = model.fit()
    print(p, model_fit.aic)
</code>

In this simulated run, the smallest AIC (≈782.2) occurred at p = 5, suggesting a fifth‑order AR model; since the data are randomly generated, the selected order will vary from draw to draw.

These sections provide a comprehensive overview of stationary time‑series concepts, ARMA/ARIMA modeling, identification, estimation, and practical implementation with Python.

Tags: Python, forecasting, time series, statsmodels, ARMA, stationarity
Written by

Model Perspective

Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".
