4 Modeling Covariance

Learning Goals

  • Explain and illustrate how we can model covariance by constraining a covariance matrix.
  • Explain and illustrate how we can model covariance by constraining an autocovariance function.
  • Implement covariance model estimation by assuming stationarity (ACF and semivariogram).




Warm Up

The covariance matrix organizes all pairwise covariances among a collection of random variables into a symmetric matrix. The \(ij\)th element (\(i\)th row, \(j\)th column) of the matrix is \(Cov(Y_i,Y_j) = \sigma_{ij}\).

The correlation matrix organizes all pairwise correlations into a symmetric matrix. The \(ij\)th element (\(i\)th row, \(j\)th column) of the matrix is \(Cor(Y_i,Y_j) = \rho_{ij}\).
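
As a concrete illustration (a minimal sketch, not part of the warm-up; the data matrix X below is made up), the R functions cov() and cor() compute these two matrices from a data matrix, and cov2cor() rescales a covariance matrix into the corresponding correlation matrix:

set.seed(1)
X <- matrix(rnorm(50 * 3), nrow = 50, ncol = 3) # 50 observations of 3 variables

S <- cov(X) # 3x3 sample covariance matrix (symmetric)
R <- cor(X) # 3x3 sample correlation matrix (symmetric, 1s on the diagonal)

cov2cor(S) # rescaling S by its standard deviations reproduces R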

  • Come up with a list of properties that have to be true of each \(\sigma_{ij}\) and the covariance matrix \(\boldsymbol\Sigma\).

  • Come up with a list of properties that have to be true of each \(\rho_{ij}\) and the correlation matrix \(\mathbf{R}\).

Group Activity

Download a template RMarkdown file to start from here.

Theory

  1. As a group, come up with a list of additional properties that have to be true of each \(\sigma_{ij}\) and the covariance matrix if the random process is weakly stationary.

ANSWER:

  2. As a group, come up with a list of additional properties that have to be true of each \(\sigma_{ij}\) and the covariance matrix if the random process is isotropic.

ANSWER:

  3. Consider the semivariogram, defined as \(\gamma(i,j) = 0.5 Var(Y_i - Y_j)\). If the random process \(Y_t\) is stationary, what is the relationship between the semivariogram \(\gamma(i,j)\) and the autocovariance function \(\Sigma(i,j) = Cov(Y_i,Y_j)\)? Hint: Try to write \(\gamma(i,j)\) in terms of \(\Sigma(i,j)\).

ANSWER: On a piece of paper

Conceptual Practice

Imagine a random process of length 100 with constant variance 4 and correlation that decays geometrically with lag: 0.9 at lag 1, \(0.9^2\) at lag 2, \(0.9^3\) at lag 3, and in general \(0.9^h\) at lag \(h\).

  1. Sketch the covariance function, assuming stationarity. Label key features.

ANSWER: On a piece of paper

  2. Sketch the correlation function, assuming stationarity. Label key features.

ANSWER: On a piece of paper

  3. Artistically illustrate the structure of the covariance matrix. Label key features.

ANSWER: On a piece of paper

Now, we’ll generate one realization from this random process by building its covariance matrix and using a Cholesky decomposition.

times <- 1:100
D <- as.matrix(dist(times)) # 100x100 matrix of absolute lags |i - j| between all pairs of time points

COR <- 0.9^D # correlation matrix: correlation decays as 0.9^lag

COV <- COR*4 # covariance matrix: scale by the constant variance of 4

L <- t(chol(COV)) # lower-triangular Cholesky factor, so that L %*% t(L) equals COV

z <- rnorm(100) # 100 independent standard normal values
y <- L %*% z # transform them into 100 correlated values

plot(times, y, type='l') # plot of the realization of the process
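
Why does this produce the desired dependence? The Cholesky factor satisfies \(\mathbf{L}\mathbf{L}^\top = \boldsymbol\Sigma\), and if \(\mathbf{z}\) holds independent standard normal values, then \(\mathbf{y} = \mathbf{L}\mathbf{z}\) has \(Cov(\mathbf{y}) = \mathbf{L} Cov(\mathbf{z}) \mathbf{L}^\top = \mathbf{L}\mathbf{L}^\top = \boldsymbol\Sigma\). A quick sanity check (a minimal sketch reusing the objects created above, not part of the required activity):

max(abs(L %*% t(L) - COV)) # essentially 0, up to floating-point rounding
range(diag(COV)) # constant variance of 4 on the diagonal
COV[1, 2] # lag-1 covariance of 4 * 0.9 = 3.6, i.e., correlation 0.9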

What if, instead of specifying the correlation/covariance structure directly, we generated each value from the previous value plus random error?

m <- 100
times <- 1:m
e <- rnorm(m) # independent standard normal errors
y <- rep(0, m) # storage for the simulated values

y[1] <- 1 # initial value

for(i in times[-1]){
  y[i] <- 0.5*y[i-1] + e[i]  # current based on past value plus error
}

plot(times, y, type='l') # plot of the realization of the process
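
This recursion builds correlation indirectly: each value is 0.5 times the previous value plus independent noise, so nearby time points are related even though we never wrote down a covariance matrix. This is a first-order autoregressive (AR(1)) construction, and its stationary autocorrelation at lag \(h\) is \(0.5^h\), mirroring the \(0.9^h\) decay we specified directly above. As an aside, here is a minimal sketch using base R's arima.sim(), which simulates the same kind of process and handles the stationary start-up rather than fixing an initial value:

set.seed(7)
y_ar <- arima.sim(model = list(ar = 0.5), n = 100) # AR(1) with coefficient 0.5
plot(1:100, y_ar, type='l') # one realization of the AR(1) process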

  4. Without using the functions acf() or cor(), try to write R code to estimate the autocorrelation function for one of the realized random processes above. Start by breaking the computation down into smaller tasks and then write R code for each task.

ANSWER:

  5. Without using any variogram function, try to write R code to estimate the semivariogram for one of the realized random processes above. Start by breaking the computation down into smaller tasks and then write R code for each task.

ANSWER: