Lysario – by Panagiotis Chatzichrisafis

"ούτω γάρ ειδέναι το σύνθετον υπολαμβάνομεν, όταν ειδώμεν εκ τίνων και πόσων εστίν …" ("for we suppose that we know a composite thing when we know from what elements, and how many, it is composed …")

Archive for the ‘Solved Problems’ Category

In [1, p. 60 exercise 3.1] a 2 \times 1 real random vector \mathbf{x} is given, which is distributed according to a multivariate Gaussian PDF with zero mean and covariance matrix:
\mathbf{C}_{xx}=
\left[
\begin{array}{cc} 
\sigma_{1}^2 & \sigma_{12} \\ 
\sigma_{21} & \sigma_{2}^2
\end{array}\right]

We are asked to find \alpha if \mathbf{y}=\mathbf{L}^{-1}\mathbf{x}, where \mathbf{L}^{-1} is given by the relations:

y_{1}=x_{1}
y_{2}=\alpha x_{1}+x_{2}

so that y_{1} and y_{2} are uncorrelated and hence, being jointly Gaussian, independent. We are also asked to find the Cholesky decomposition of \mathbf{C}_{xx}, which expresses \mathbf{C}_{xx} as \mathbf{L}\mathbf{D}\mathbf{L}^{T}, where \mathbf{L} is lower triangular with 1's on the principal diagonal and \mathbf{D} is a diagonal matrix with positive diagonal elements.
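A quick numerical sketch of the decorrelation idea: requiring E[y_{1}y_{2}] = \alpha\sigma_{1}^2 + \sigma_{12} = 0 gives \alpha = -\sigma_{12}/\sigma_{1}^2, and the resulting diagonal covariance of \mathbf{y} supplies \mathbf{D} in the \mathbf{L}\mathbf{D}\mathbf{L}^{T} factorization. The numerical values below (\sigma_{1}^2 = 2, \sigma_{2}^2 = 3, \sigma_{12} = \sigma_{21} = 1) are illustrative assumptions, not taken from the exercise.

```python
import numpy as np

# Illustrative values (not from the exercise): sigma1^2 = 2, sigma2^2 = 3, sigma12 = 1
s1sq, s2sq, s12 = 2.0, 3.0, 1.0
Cxx = np.array([[s1sq, s12],
                [s12, s2sq]])

# alpha = -sigma12 / sigma1^2 makes y1 and y2 uncorrelated
alpha = -s12 / s1sq
Linv = np.array([[1.0, 0.0],
                 [alpha, 1.0]])

# Covariance of y = L^{-1} x is L^{-1} Cxx L^{-T}; off-diagonal terms vanish
Cyy = Linv @ Cxx @ Linv.T
# Cyy is diag(sigma1^2, sigma2^2 - sigma12^2 / sigma1^2)

# The LDL^T factors follow: L = inv(L^{-1}), D = Cyy
L = np.linalg.inv(Linv)
D = np.diag(np.diag(Cyy))
assert np.allclose(L @ D @ L.T, Cxx)
```

Note that \mathbf{D}'s second diagonal entry, \sigma_{2}^2 - \sigma_{12}^2/\sigma_{1}^2, is exactly the variance of the innovation y_{2}.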
In [1, p. 35 exercise 2.18] we are asked to prove that the inverse of a complex matrix \mathbf{A}=\mathbf{A}_{R}+j\mathbf{A}_{I} may be found by first inverting
\mathbf{V}=\left(
\begin{array}{cc}
\mathbf{A}_{R}  &  -\mathbf{A}_{I}    \\
 \mathbf{A}_{I}   &   \mathbf{A}_{R}   \\
\end{array} 
\right) (1)

to yield
\mathbf{V}^{-1}=\left(
\begin{array}{cc}
\mathbf{B}_{R}  &  -\mathbf{B}_{I}    \\
 \mathbf{B}_{I}   &   \mathbf{B}_{R}   \\
\end{array} 
\right) (2)

and then letting \mathbf{A}^{-1}=\mathbf{B}_{R}+j\mathbf{B}_{I}.
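The claimed identity can be checked numerically: build \mathbf{V} from the real and imaginary parts of \mathbf{A} as in (1), invert it, read off \mathbf{B}_{R} and \mathbf{B}_{I} from the blocks as in (2), and compare with the direct complex inverse. The matrix entries below are arbitrary illustrative values.

```python
import numpy as np

# A hypothetical complex matrix (values chosen only for illustration)
A = np.array([[2.0 + 1.0j, 0.5 - 0.3j],
              [0.1 + 0.2j, 1.5 - 0.5j]])
AR, AI = A.real, A.imag

# Build the real 2n x 2n matrix V of equation (1)
V = np.block([[AR, -AI],
              [AI,  AR]])

# Invert V; by equation (2) its blocks give B_R and B_I
Vinv = np.linalg.inv(V)
n = A.shape[0]
BR = Vinv[:n, :n]
BI = Vinv[n:, :n]

# A^{-1} = B_R + j B_I matches the direct complex inverse
assert np.allclose(BR + 1j * BI, np.linalg.inv(A))
```

The check works because the map \mathbf{A} \mapsto \mathbf{V} is an isomorphism: it preserves products, so the image of \mathbf{A}^{-1} must be \mathbf{V}^{-1}.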
In [1, p. 35 exercise 2.17] we are asked to verify the alternative expression [1, p. 33 (2.77)] for a Hermitian function.
In [1, p. 35 exercise 2.16] we are asked to verify the formulas given for the complex gradient of a Hermitian and a linear form [1, p. 31 (2.70)]. To do so, we are instructed to decompose the matrices and vectors into their real and imaginary parts as \mathbf{x}=\mathbf{x}_{R}+j\mathbf{x}_{I}, \mathbf{A^{\prime}}= \mathbf{A}_{R}+j\mathbf{A}_{I}, \mathbf{b^{\prime}}=\mathbf{b}_{R}+j\mathbf{b}_{I}.
In [1, p. 35 exercise 2.15] we are asked to verify the formulas given for the gradient of a quadratic and linear form [1, p. 31 (2.61)]. The corresponding formulas are
\frac{\partial}{\partial \mathbf{x}}(\mathbf{x}^T\mathbf{A}^{\prime}\mathbf{x})=2\mathbf{A}^{\prime}\mathbf{x} (1)

and
\frac{\partial}{\partial \mathbf{x}}(\mathbf{b}^{\prime T}\mathbf{x})=\mathbf{b}^{\prime} (2)

where \mathbf{A}^{\prime} is a symmetric n \times n matrix with elements a_{ij}, \mathbf{b}^{\prime} is a real n \times 1 vector with elements b_{i}, and \frac{\partial}{\partial \mathbf{x}} denotes the gradient of a real function with respect to \mathbf{x}.
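Formulas (1) and (2) can be sanity-checked with a central-difference approximation of the gradient; the random symmetric \mathbf{A}^{\prime} and vector \mathbf{b}^{\prime} below are illustrative assumptions, not data from the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Hypothetical symmetric A' and real vector b' (illustrative values only)
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetrize, as formula (1) requires
b = rng.standard_normal(n)
x = rng.standard_normal(n)

def num_grad(f, x, h=1e-6):
    # Central-difference approximation of the gradient of a scalar function
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# (1): d/dx (x^T A' x) = 2 A' x for symmetric A'
assert np.allclose(num_grad(lambda x: x @ A @ x, x), 2 * A @ x, atol=1e-4)
# (2): d/dx (b'^T x) = b'
assert np.allclose(num_grad(lambda x: b @ x, x), b, atol=1e-4)
```

If \mathbf{A}^{\prime} were not symmetric, the gradient in (1) would instead be (\mathbf{A}^{\prime}+\mathbf{A}^{\prime T})\mathbf{x}, which is why the symmetrization step matters.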
