In [1, p. 60, exercise 3.1] a real random vector $\mathbf{x} = [x_1\;\; x_2]^T$ is given, which is distributed according to a multivariate Gaussian PDF with zero mean and covariance matrix:
\[ \mathbf{C}_x = \begin{bmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{bmatrix}. \]
We are asked to find $\alpha$ if $\mathbf{y} = \mathbf{L}^{-1}\mathbf{x}$, where $\mathbf{L}^{-1}$ is given by the relation:
\[ \mathbf{L}^{-1} = \begin{bmatrix} 1 & 0 \\ \alpha & 1 \end{bmatrix}, \]
so that $y_1$ and $y_2$ are uncorrelated and hence independent. We are also asked to find the Cholesky decomposition of $\mathbf{C}_x$, which expresses $\mathbf{C}_x$ as $\mathbf{L}\mathbf{D}\mathbf{L}^T$, where $\mathbf{L}$ is lower triangular with 1's on the principal diagonal and $\mathbf{D}$ is a diagonal matrix with positive diagonal elements.
 Solution: 
First we note that
\[ y_1 = x_1, \qquad y_2 = \alpha x_1 + x_2. \]
From the previous relation we find that
![\mathbf{L}^{-1} = \left[ \begin{array}{cc} 1 & 0 \\ \alpha & 1 \end{array} \right]](https://lysario.de/wp-content/cache/tex_ba8da6f3a962f34c66efc4dfe1f0c44f.png)
. Furthermore we note that $E[\mathbf{y}] = \mathbf{0}$, because the mean of $\mathbf{x}$ equals zero: $E[\mathbf{y}] = \mathbf{L}^{-1}E[\mathbf{x}] = \mathbf{0}$. The correlation matrix of $\mathbf{y}$ can be determined by:
\[ \mathbf{C}_y = E[\mathbf{y}\mathbf{y}^T] = \mathbf{L}^{-1}\mathbf{C}_x\mathbf{L}^{-T} = \begin{bmatrix} \sigma_1^2 & \alpha\sigma_1^2 + \rho\sigma_1\sigma_2 \\ \alpha\sigma_1^2 + \rho\sigma_1\sigma_2 & \alpha^2\sigma_1^2 + 2\alpha\rho\sigma_1\sigma_2 + \sigma_2^2 \end{bmatrix}. \]
We can now determine $\alpha$ in order for $y_1$ and $y_2$ to be uncorrelated. The two variables are uncorrelated when the off-diagonal elements are zero. Thus for
\[ \alpha\sigma_1^2 + \rho\sigma_1\sigma_2 = 0 \quad\Longrightarrow\quad \alpha = -\rho\frac{\sigma_2}{\sigma_1} \]
the variables $y_1, y_2$ are uncorrelated, which is only possible when $\sigma_1 \neq 0$. Because the random variables are real, the condition $|\rho| \leq 1$ is always fulfilled.
For $\alpha = -\rho\frac{\sigma_2}{\sigma_1}$ the matrix $\mathbf{C}_y$ can be rewritten as:
\[ \mathbf{C}_y = \begin{bmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2(1-\rho^2) \end{bmatrix}. \]
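As a numerical sanity check, the following sketch confirms that $\alpha = -\rho\sigma_2/\sigma_1$ makes the off-diagonal element of $\mathbf{C}_y$ vanish. The values of $\sigma_1$, $\sigma_2$, $\rho$ are illustrative choices, not taken from [1]:

```python
# Check that y = L^{-1} x with alpha = -rho*sigma2/sigma1 decorrelates
# the components: C_y = L^{-1} C_x L^{-T} becomes diagonal.
# sigma1, sigma2, rho below are illustrative values, not from [1].

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

sigma1, sigma2, rho = 2.0, 3.0, 0.5
Cx = [[sigma1**2, rho * sigma1 * sigma2],
      [rho * sigma1 * sigma2, sigma2**2]]

alpha = -rho * sigma2 / sigma1
Linv = [[1.0, 0.0], [alpha, 1.0]]

Cy = matmul(matmul(Linv, Cx), transpose(Linv))

print(Cy[0][1])            # off-diagonal element -> 0.0
print(Cy[0][0], Cy[1][1])  # sigma1^2 and sigma2^2*(1 - rho^2)
```

With these values the diagonal becomes $\sigma_1^2 = 4$ and $\sigma_2^2(1-\rho^2) = 6.75$, matching the expression for $\mathbf{C}_y$ above.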
We know that Gaussian random variables are independent if they are uncorrelated [2, pp. 154-155]. So far we have only found a condition for $\alpha$ in order to render $y_1, y_2$ uncorrelated. From [1, p. 42] we know that the random variables $y_1, y_2$ are also Gaussian, as they are obtained by a linear transform from the Gaussian random variables $x_1, x_2$. Thus we have found the condition for $\alpha$ in order for $y_1$ and $y_2$ to be uncorrelated and thus independent.
Now let us proceed to find the Cholesky decomposition of $\mathbf{C}_x$. We first assume that $\sigma_1, \sigma_2, \rho$ are such that $\mathbf{C}_x$ is positive definite, a condition which is necessary to decompose the correlation matrix $\mathbf{C}_x$ by the Cholesky decomposition:
\[ \mathbf{C}_x = \mathbf{L}\mathbf{D}\mathbf{L}^T, \]
where the elements of the matrix $\mathbf{L}$, assuming $c_{ij}$ are the elements of the matrix $\mathbf{C}_x$, are given by [1, p. 30, (2.55)] (see also [3, relations (10) and (11)]):
\[ l_{ij} = \frac{1}{d_j}\left( c_{ij} - \sum_{k=1}^{j-1} l_{ik}\, d_k\, l_{jk} \right), \qquad i > j, \]
and the elements of the matrix $\mathbf{D}$ are given by the relation:
\[ d_i = c_{ii} - \sum_{k=1}^{i-1} l_{ik}^2\, d_k, \qquad d_1 = c_{11}. \]
The corresponding elements of the matrices $\mathbf{L}$ and $\mathbf{D}$ are thus given by:
\[ d_1 = c_{11} = \sigma_1^2, \qquad l_{21} = \frac{c_{21}}{d_1} = \frac{\rho\sigma_1\sigma_2}{\sigma_1^2} = \rho\frac{\sigma_2}{\sigma_1}, \qquad d_2 = c_{22} - l_{21}^2 d_1 = \sigma_2^2(1-\rho^2). \]
Thus the matrices $\mathbf{L}, \mathbf{D}$ are given by:
\[ \mathbf{L} = \begin{bmatrix} 1 & 0 \\ \rho\frac{\sigma_2}{\sigma_1} & 1 \end{bmatrix}, \qquad \mathbf{D} = \begin{bmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2(1-\rho^2) \end{bmatrix}. \]
And the matrix $\mathbf{C}_x$ is obtained by the Cholesky decomposition as $\mathbf{C}_x = \mathbf{L}\mathbf{D}\mathbf{L}^T$.
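The factors derived above can be checked numerically; the following sketch (with the same kind of illustrative values for $\sigma_1$, $\sigma_2$, $\rho$, not taken from [1]) confirms that $\mathbf{L}\mathbf{D}\mathbf{L}^T$ reproduces $\mathbf{C}_x$:

```python
# Verify the LDL^T factors derived above:
# L = [[1, 0], [rho*sigma2/sigma1, 1]], D = diag(sigma1^2, sigma2^2*(1-rho^2)).
# sigma1, sigma2, rho are illustrative values, not from [1].

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sigma1, sigma2, rho = 2.0, 3.0, 0.5
Cx = [[sigma1**2, rho * sigma1 * sigma2],
      [rho * sigma1 * sigma2, sigma2**2]]

L = [[1.0, 0.0], [rho * sigma2 / sigma1, 1.0]]
D = [[sigma1**2, 0.0], [0.0, sigma2**2 * (1.0 - rho**2)]]
LT = [[L[j][i] for j in range(2)] for i in range(2)]

reconstructed = matmul(matmul(L, D), LT)
print(reconstructed)  # equals Cx
```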
Finally, considering [1, p. 23, (2.29)], for two matrices $\mathbf{A}, \mathbf{B}$ the matrix $(\mathbf{A}\mathbf{B})^{-1}$ is given by:
\[ (\mathbf{A}\mathbf{B})^{-1} = \mathbf{B}^{-1}\mathbf{A}^{-1}. \]
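The two parts of the exercise fit together: the decorrelating matrix with $\alpha = -\rho\sigma_2/\sigma_1$ is exactly the inverse of the Cholesky factor $\mathbf{L}$. A short sketch (illustrative values for $\sigma_1$, $\sigma_2$, $\rho$, not taken from [1]) verifies this:

```python
# With alpha = -rho*sigma2/sigma1, the matrix [[1, 0], [alpha, 1]] from the
# first part is the inverse of the Cholesky factor L from the second part.
# sigma1, sigma2, rho are illustrative values, not from [1].

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sigma1, sigma2, rho = 2.0, 3.0, 0.5
alpha = -rho * sigma2 / sigma1

L = [[1.0, 0.0], [rho * sigma2 / sigma1, 1.0]]
Linv = [[1.0, 0.0], [alpha, 1.0]]

print(matmul(L, Linv))  # identity matrix
print(matmul(Linv, L))  # identity matrix
```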
[1] Steven M. Kay: "Modern Spectral Estimation: Theory and Applications", Prentice Hall, ISBN 0-13-598582-X.
[2] Athanasios Papoulis: "Probability, Random Variables, and Stochastic Processes", McGraw-Hill.
[3] Chatzichrisafis: "Solution of exercise 2.12 from Kay's Modern Spectral Estimation: Theory and Applications".