Cross-covariance matrix
In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the i, j position is the covariance between the i-th element of a random vector and the j-th element of another random vector. When the two random vectors are the same, the cross-covariance matrix is referred to as the covariance matrix. A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values. The potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.
The cross-covariance matrix of two random vectors $\mathbf{X}$ and $\mathbf{Y}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ or $\Sigma_{\mathbf{X}\mathbf{Y}}$.
Definition
For random vectors $\mathbf{X} = (X_1, X_2, \ldots, X_m)^{\mathrm T}$ and $\mathbf{Y} = (Y_1, Y_2, \ldots, Y_n)^{\mathrm T}$, each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: 336

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X}, \mathbf{Y}) \triangleq \operatorname{E}\!\left[(\mathbf{X} - \boldsymbol{\mu}_{\mathbf{X}})(\mathbf{Y} - \boldsymbol{\mu}_{\mathbf{Y}})^{\mathrm T}\right] \qquad \text{(Eq.1)}$$

where $\boldsymbol{\mu}_{\mathbf{X}} = \operatorname{E}[\mathbf{X}]$ and $\boldsymbol{\mu}_{\mathbf{Y}} = \operatorname{E}[\mathbf{Y}]$ are vectors containing the expected values of $\mathbf{X}$ and $\mathbf{Y}$. The vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.

The cross-covariance matrix is the $m \times n$ matrix whose $(i,j)$ entry is the covariance

$$\operatorname{cov}[X_i, Y_j] = \operatorname{E}\!\left[(X_i - \operatorname{E}[X_i])(Y_j - \operatorname{E}[Y_j])\right]$$

between the i-th element of $\mathbf{X}$ and the j-th element of $\mathbf{Y}$. This gives the following component-wise definition of the cross-covariance matrix:

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \begin{pmatrix} \operatorname{cov}[X_1, Y_1] & \operatorname{cov}[X_1, Y_2] & \cdots & \operatorname{cov}[X_1, Y_n] \\ \operatorname{cov}[X_2, Y_1] & \operatorname{cov}[X_2, Y_2] & \cdots & \operatorname{cov}[X_2, Y_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{cov}[X_m, Y_1] & \operatorname{cov}[X_m, Y_2] & \cdots & \operatorname{cov}[X_m, Y_n] \end{pmatrix}$$
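As an illustrative sketch (not part of the original article), the following Python/NumPy snippet estimates Eq.1 from simulated samples; the data, sample size, and variable names are arbitrary assumptions, and the sample means stand in for the exact expectations, so the result only approximates $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw N joint samples of a 3-dimensional X and a 2-dimensional Y.
# Y is built from X so that the two vectors are actually correlated.
N = 100_000
X = rng.normal(size=(N, 3))
Y = np.column_stack([X[:, 0] + 0.5 * X[:, 1],   # depends on X1 and X2
                     rng.normal(size=N)])       # independent noise

# Sample version of Eq.1: average the outer products of the centered vectors.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / N    # shape (3, 2), approximates cov(X, Y)

print(K_XY)
```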
Example
For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\mathrm T}$ and $\mathbf{Y} = (Y_1, Y_2)^{\mathrm T}$ are random vectors, then $\operatorname{cov}(\mathbf{X}, \mathbf{Y})$ is a $3 \times 2$ matrix whose $(i,j)$-th entry is $\operatorname{cov}(X_i, Y_j)$.
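The same $3 \times 2$ matrix can also be assembled entry by entry from the scalar covariances $\operatorname{cov}(X_i, Y_j)$. A minimal sketch, assuming NumPy and an arbitrary simulated construction of $\mathbf{X}$ and $\mathbf{Y}$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000
X = rng.normal(size=(N, 3))                        # samples of (X1, X2, X3)
Y = np.column_stack([X[:, 0], X[:, 1] - X[:, 2]])  # samples of (Y1, Y2)

# The (i, j) entry is cov(X_i, Y_j); np.cov of a pair returns its 2x2
# covariance matrix, whose off-diagonal element is the scalar cross-covariance.
K = np.empty((3, 2))
for i in range(3):
    for j in range(2):
        K[i, j] = np.cov(X[:, i], Y[:, j])[0, 1]

print(K)   # approximately [[1, 0], [0, 1], [0, -1]] for this construction
```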
Properties
For the cross-covariance matrix, the following basic properties apply:[2]
- $\operatorname{cov}[\mathbf{X}, \mathbf{Y}] = \operatorname{cov}[\mathbf{Y}, \mathbf{X}]^{\mathrm T}$
- $\operatorname{cov}[\mathbf{X}_1 + \mathbf{X}_2, \mathbf{Y}] = \operatorname{cov}[\mathbf{X}_1, \mathbf{Y}] + \operatorname{cov}[\mathbf{X}_2, \mathbf{Y}]$
- $\operatorname{cov}[A\mathbf{X} + \mathbf{a}, B^{\mathrm T}\mathbf{Y} + \mathbf{b}] = A\,\operatorname{cov}[\mathbf{X}, \mathbf{Y}]\,B$
- If $\mathbf{X}$ and $\mathbf{Y}$ are independent (or somewhat less restrictedly, if every random variable in $\mathbf{X}$ is uncorrelated with every random variable in $\mathbf{Y}$), then $\operatorname{cov}[\mathbf{X}, \mathbf{Y}] = 0_{m \times n}$

where $\mathbf{X}$, $\mathbf{X}_1$ and $\mathbf{X}_2$ are random $m \times 1$ vectors, $\mathbf{Y}$ is a random $n \times 1$ vector, $\mathbf{a}$ is a $q \times 1$ vector, $\mathbf{b}$ is a $p \times 1$ vector, $A$ and $B$ are $q \times m$ and $n \times p$ matrices of constants, and $0_{m \times n}$ is an $m \times n$ matrix of zeroes. A numerical check of these identities is sketched below.
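The sketch below is illustrative only; it uses NumPy, arbitrary simulated data, and a hypothetical helper `cross_cov` (defined in the snippet) to verify the transpose, additivity, affine-transformation, and independence properties up to sampling error.

```python
import numpy as np

def cross_cov(X, Y):
    """Sample cross-covariance of row-wise samples X (N x m) and Y (N x n)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    return Xc.T @ Yc / len(X)

rng = np.random.default_rng(2)
N, m, n, q, p = 200_000, 3, 2, 2, 2
X1 = rng.normal(size=(N, m))
X2 = rng.normal(size=(N, m))
Y = X1[:, :n] + rng.normal(size=(N, n))   # correlated with X1, independent of X2

A = rng.normal(size=(q, m))               # q x m constant matrix
B = rng.normal(size=(n, p))               # n x p constant matrix
a = rng.normal(size=q)                    # q x 1 constant vector
b = rng.normal(size=p)                    # p x 1 constant vector

# cov(X, Y) = cov(Y, X)^T
print(np.allclose(cross_cov(X1, Y), cross_cov(Y, X1).T))
# cov(X1 + X2, Y) = cov(X1, Y) + cov(X2, Y)
print(np.allclose(cross_cov(X1 + X2, Y), cross_cov(X1, Y) + cross_cov(X2, Y)))
# cov(A X + a, B^T Y + b) = A cov(X, Y) B   (row-wise samples: X @ A.T, Y @ B)
print(np.allclose(cross_cov(X1 @ A.T + a, Y @ B + b),
                  A @ cross_cov(X1, Y) @ B))
# independence (here: X2 and Y) gives a matrix of zeroes, up to sampling error
print(np.abs(cross_cov(X2, Y)).max() < 0.02)
```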