7 AR and MA Models
Settling In
HW4 Setup
GitHub Setup
Go to https://bcheggeseth.github.io/452_fall_2025/homework/hw-4.html
We are going to work in GitHub for the rest of the homework assignments to facilitate collaboration on the mini projects.
Highlights from Day 6
Random Walk
A random walk is NOT stationary.
- Non-constant variance; covariance is not a function solely of the lag \(h\).
- A random walk is an autoregressive (AR) model, in which the current value is a linear function of past values: \(Y_t = \phi_1Y_{t-1} + W_t\) with \(\phi_1=1\).
If we have only one realization of a random process that is not stationary:
- It is hard to tell whether or not it is stationary (we do our best to deal with obvious trend and non-constant variance).
- If we assume stationarity, the estimated ACF from acf() will have high values at large lags (see the simulation sketch below).
- If we know how the data were generated, we could prove it mathematically or generate many realizations to show it isn’t stationary.
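As a quick illustration (not part of the original notes), here is a small R sketch that simulates one realization of a random walk and plots its sample ACF; the seed, series length, and noise scale are arbitrary choices. The slowly decaying, persistently high ACF is the pattern described above.

```r
# A minimal sketch: one realization of a random walk and its sample ACF.
# The seed, length n, and noise SD are arbitrary illustrative choices.
set.seed(452)
n <- 500
w <- rnorm(n)         # white noise W_t
y <- cumsum(w)        # random walk: Y_t = Y_{t-1} + W_t

plot(y, type = "l", main = "One realization of a random walk")
acf(y, lag.max = 50)  # sample ACF stays high even at large lags
```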
Model Components
- Model Trend
- Estimate & Remove Trend
- Goal: Detrended data is on average 0 (ignoring seasonality)
- After removing trend, Model Seasonality (regular repeating patterns)
- Estimate & Remove Seasonality
- Goal: After removing seasonality, data is on average 0 (with no cyclic patterns)
- Model Noise (what is left over)
- Typically correlated
- Need models that incorporate correlation between neighboring observations
- AR: Autoregressive Models
- MA: Moving Average Models
- ARMA: Autoregressive, Moving Average Models
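As a concrete (and entirely optional) illustration of this workflow, here is a small R sketch using the built-in co2 series; the linear trend and monthly-mean seasonality are assumed modeling choices for illustration, not prescriptions from the notes.

```r
# A minimal sketch of the trend -> seasonality -> noise workflow, using the built-in co2 series.
# The linear trend and monthly seasonal means are assumed, illustrative modeling choices.
time_index <- as.numeric(time(co2))
month <- factor(cycle(co2))

trend_fit <- lm(co2 ~ time_index)     # estimate the trend
detrended <- resid(trend_fit)         # remove it: detrended data averages ~0

season_fit <- lm(detrended ~ month)   # estimate seasonality (monthly means)
noise <- resid(season_fit)            # remove it: what's left over is the noise

acf(noise)                            # leftover noise is typically still correlated;
                                      # AR / MA / ARMA models describe this correlation
```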
Learning Goals
- Understand the derivations of variance and covariance for the AR(1) and MA(1) models.
- Understand the notation for AR(p) and MA(q) models and the general mathematical approaches for deriving variance, covariance, and correlation.
- Recognize general patterns of non-stationarity, AR(p) models, and MA(q) models in example ACF graphs.
Notes: AR and MA Models
Warm Up
We’ll walk through the derivations for the MA(1) model as a class to warm up.
If we can understand the theoretical patterns of this type of model, we’ll be better prepared to look at data and see whether or not the model might closely match reality.
MA(1)
The MA(1) Model is
\[Y_t = W_t + \theta_1W_{t-1}\quad\text{ where }W_{t}\stackrel{iid}{\sim} N(0,\sigma^2_w), |\theta_1|<1\]
Find the Expected value of \(Y_t\).
Find the Variance of \(Y_t\).
Find the Covariance of \(Y_t\) and \(Y_{t-h}\).
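If you want to connect this model to data (optional, not part of the warm up), one approach is to simulate an MA(1) series in R and look at its sample ACF; the \(\theta_1\) value and series length below are arbitrary choices.

```r
# A minimal sketch (assumed theta_1 and series length): simulate an MA(1) series
# and compare its sample ACF to the theoretical ACF.
set.seed(452)
theta1 <- 0.6
y <- arima.sim(model = list(ma = theta1), n = 1000)

acf(y, lag.max = 20)               # sample ACF: a clear spike at lag 1, near zero afterward
ARMAacf(ma = theta1, lag.max = 5)  # theoretical ACF for comparison
```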
Small Group Work
Group Theory
At the board, work through an alternative derivation for the AR(1) model.
Explicitly discuss with your partner how you’d like to collaborate today. It is your responsibility to make sure both you and your partner understand the derivation and that you are positively supporting each other in the process.
AR(1)
- Write down the model for an AR(1) Model.
\[Y_t = \]
- Rewrite the model by iteratively plugging in the value from the model for \(Y_{t-1}\) and then \(Y_{t-2}\). Keep going to write \(Y_{t}\) as \(\sum^{\infty}_{k=0} \phi_1^k W_{t-k}\), allowing the time indices to range from \(-\infty,...,-3,-2,-1,0,1,2,3,...,\infty\).
For the variance and covariance below, use the infinite geometric series, \(\sum^{\infty}_{k=0} r^k = (1-r)^{-1}\text{ if }|r|<1\).
Find the Expected value of \(Y_t\).
Find the Variance of \(Y_t\).
Find the Covariance of \(Y_t\) and \(Y_{t-h}\).
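Alongside the derivation, it can help to see what an AR(1) series looks like in practice; this small R sketch uses an arbitrary \(\phi_1\) value (an assumption for illustration, not part of the activity).

```r
# A minimal sketch (assumed phi_1 value): simulate an AR(1) series and look at its sample ACF.
set.seed(452)
phi1 <- 0.7
y <- arima.sim(model = list(ar = phi1), n = 1000)

plot(y, main = "Simulated AR(1) with phi_1 = 0.7")
acf(y, lag.max = 20)   # sample ACF decays roughly geometrically, like phi_1^h
```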
Generalization
AR(p) and MA(q)
AR(p) Model
\[Y_t = \phi_1Y_{t-1} + \cdots + \phi_{p}Y_{t-p} + W_t\quad\text{ where }W_{t}\stackrel{iid}{\sim} N(0,\sigma^2_w)\]
MA(q) Models
\[Y_t = W_t + \theta_1W_{t-1}+\cdots + \theta_qW_{t-q}\quad\text{ where }W_{t}\stackrel{iid}{\sim} N(0,\sigma^2_w)\]
These are stationary if the \(\phi\)’s and \(\theta\)’s take certain values (we’ll talk more about this later…)
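To preview the general patterns mentioned in the learning goals, here is a short R sketch (the coefficient values are arbitrary, illustrative choices) comparing theoretical and sample ACFs for an AR(2) and an MA(2) model.

```r
# A minimal sketch (assumed coefficients): theoretical and sample ACFs for an AR(2) and an MA(2).
ARMAacf(ar = c(0.5, 0.3), lag.max = 10)   # AR(2): autocorrelation decays gradually
ARMAacf(ma = c(0.6, 0.4), lag.max = 10)   # MA(2): zero beyond lag q = 2

set.seed(452)
acf(arima.sim(model = list(ar = c(0.5, 0.3)), n = 1000), main = "Simulated AR(2)")
acf(arima.sim(model = list(ma = c(0.6, 0.4)), n = 1000), main = "Simulated MA(2)")
```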
Solutions
Warm Up
MA(1)
- Find the Expected value of \(Y_t\).
Solution
\[E(Y_t) = E(W_t + \theta_1W_{t-1})\] \[= E(W_t) + \theta_1E(W_{t-1}) = 0\]
- Find the Variance of \(Y_t\).
Solution
\[Var(Y_t) = Var(W_t + \theta_1W_{t-1})\] \[= Var(W_t) + \theta_1^2Var(W_{t-1})\] \[= \sigma^2_w + \theta_1^2\sigma^2_w = \sigma^2_w(1+\theta^2_1)\]
- Find the Covariance of \(Y_t\) and \(Y_{t-h}\).
Solution
\[Cov(Y_t,Y_{t-h}) = Cov(W_t + \theta_1W_{t-1},W_{t-h} + \theta_1W_{t-h-1})\] \[= Cov(W_t,W_{t-h}) + Cov(\theta_1W_{t-1},W_{t-h}) + Cov(W_t,\theta_1W_{t-h-1})+ Cov(\theta_1W_{t-1},\theta_1W_{t-h-1})\]
If \(h = 1\), \[= 0 + Cov(\theta_1W_{t-1},W_{t-1}) + 0 + 0 = \theta_1\sigma^2_w\]
If \(h > 1\), \[= 0 + 0 + 0 + 0 = 0\]
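As an optional numeric check (not part of the activity; the \(\theta_1\) and \(\sigma_w\) values are arbitrary), a long simulation should roughly reproduce these variance and covariance results.

```r
# A minimal sketch (assumed theta_1 and sigma_w): check the MA(1) derivations by simulation.
set.seed(452)
theta1 <- 0.6
sigma_w <- 2
w <- rnorm(1e5, sd = sigma_w)
y <- w + theta1 * c(0, head(w, -1))   # Y_t = W_t + theta_1 * W_{t-1}

var(y)                      # ~ sigma_w^2 * (1 + theta1^2)
cov(y[-1], head(y, -1))     # ~ theta1 * sigma_w^2   (lag h = 1)
cov(y[-(1:2)], head(y, -2)) # ~ 0                    (lag h = 2)
```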
Group Theory
AR(1)
- Write down the model for an AR(1) Model.
Solution
\[Y_t = \phi_1 Y_{t-1} + W_t\text{ where } W_t\sim N(0,\sigma^2_w)\]
- Rewrite the model by iteratively plugging in to write \(Y_{t}\) as \(\sum^{\infty}_{k=0} \phi_1^k W_{t-k}\).
Solution
\[Y_t = \phi_1 (\phi_1 Y_{t-2} + W_{t-1}) + W_t\] \[= \phi_1^2 Y_{t-2} + \phi_1W_{t-1} + W_t\] \[= \phi_1^2 (\phi_1Y_{t-3} + W_{t-2}) + \phi_1 W_{t-1} + W_t\] \[= \phi_1^3 (\phi_1Y_{t-4} + W_{t-3}) + \phi_1^2 W_{t-2} + \phi_1 W_{t-1} + W_t\] Continuing the substitution indefinitely (the \(\phi_1^k Y_{t-k}\) term vanishes as \(k\to\infty\) when \(|\phi_1| < 1\)), \[Y_t = \sum^\infty_{k=0} \phi_1^k W_{t-k} \text{ where } W_t\sim N(0,\sigma^2_w)\]
- Find the Expected value of \(Y_t\).
Solution
\[E(Y_t) = E(\sum^\infty_{k=0} \phi_1^k W_{t-k})\] \[= \sum^\infty_{k=0} \phi_1^k E(W_{t-k}) = 0\]
- Find the Variance of \(Y_t\).
Solution
\[Var(Y_t) = Var(\sum^\infty_{k=0} \phi_1^k W_{t-k})\] \[= \sum^\infty_{k=0} \phi_1^{2k} Var(W_{t-k})\] \[= \sum^\infty_{k=0} \phi_1^{2k} \sigma^2_w\] \[= \sigma^2_w \sum^\infty_{k=0} (\phi_1^2)^{k}\] \[= \sigma^2_w (1-\phi_1^2)^{-1}\text{ if } \phi_1^2 < 1\]
- Find the Covariance of \(Y_t\) and \(Y_{t-h}\).
Solution
\[Cov(Y_t,Y_{t-h}) = Cov(\sum^\infty_{k=0} \phi_1^k W_{t-k},\sum^\infty_{j=0} \phi_1^j W_{t-h-j}) \]
\[= \sum^\infty_{k=0}\sum^\infty_{j=0}\phi_1^k \phi_1^j Cov( W_{t-k}, W_{t-h-j})\] But, \(Cov( W_{t-k}, W_{t-h-j}) = 0\) unless \(k = h+j\), so
\[= \sum^\infty_{j=0}\phi_1^{h+j} \phi_1^j Cov( W_{t-h-j}, W_{t-h-j})\] \[= \sum^\infty_{j=0}\phi_1^{h+2j} \sigma^2_w\]
\[= \sigma^2_w\phi_1^h\sum^\infty_{j=0}(\phi_1^2)^{j}\] \[= \sigma^2_w\phi_1^h(1-\phi_1^2)^{-1} \text{ if } \phi_1^2 < 1\]
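As with the MA(1) results, an optional simulation check (arbitrary \(\phi_1\) and \(\sigma_w\) values, not part of the activity) should roughly reproduce these formulas.

```r
# A minimal sketch (assumed phi_1 and sigma_w): check the AR(1) derivations by simulation.
set.seed(452)
phi1 <- 0.7
sigma_w <- 2
y <- arima.sim(model = list(ar = phi1), n = 1e5, sd = sigma_w)

var(y)                             # ~ sigma_w^2 / (1 - phi1^2)
cov(y[-1], head(y, -1))            # ~ sigma_w^2 * phi1 / (1 - phi1^2)   (lag h = 1)
acf(y, lag.max = 5, plot = FALSE)  # sample ACF is roughly phi_1^h
```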
Wrap-Up
Finishing the Activity
- If you didn’t finish the activity, no problem! Be sure to complete the activity outside of class, review the solutions in the online manual, and ask any questions on Slack or in office hours.
- Re-organize and review your notes to help deepen your understanding, solidify your learning, and make homework go more smoothly!
After Class
Before the next class, please do the following:
- Take a look at the Schedule page to see how to prepare for the next class.