Lysario – by Panagiotis Chatzichrisafis

"ούτω γάρ ειδέναι το σύνθετον υπολαμβάνομεν, όταν ειδώμεν εκ τίνων και πόσων εστίν …" ("for thus we suppose we know a composite: when we know of what and of how many things it consists …")

In [1, p. 35 exercise 2.18] we are asked to prove that the inverse of a complex matrix \mathbf{A}=\mathbf{A}_{R}+j\mathbf{A}_{I} may be found by first inverting
\mathbf{V}=\left(\begin{array}{cc} \mathbf{A}_{R} & -\mathbf{A}_{I} \\ \mathbf{A}_{I} & \mathbf{A}_{R} \end{array}\right) (1)

to yield
\mathbf{V}^{-1}=\left(\begin{array}{cc} \mathbf{B}_{R} & -\mathbf{B}_{I} \\ \mathbf{B}_{I} & \mathbf{B}_{R} \end{array}\right) (2)

and then letting \mathbf{A}^{-1}=\mathbf{B}_{R}+j\mathbf{B}_{I}.
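As a numerical sketch of this equivalence (using NumPy, with an arbitrary 2 \times 2 example matrix chosen for illustration), one can invert the real block matrix \mathbf{V} and read \mathbf{B}_{R}, \mathbf{B}_{I} off its blocks:

```python
import numpy as np

# Arbitrary illustrative complex matrix A = A_R + j*A_I.
rng = np.random.default_rng(0)
A_R = rng.standard_normal((2, 2))
A_I = rng.standard_normal((2, 2))
A = A_R + 1j * A_I

# Build the real 2n x 2n matrix V of equation (1) and invert it.
V = np.block([[A_R, -A_I],
              [A_I,  A_R]])
V_inv = np.linalg.inv(V)

# Read off B_R and B_I from the block structure of equation (2).
n = A_R.shape[0]
B_R = V_inv[:n, :n]
B_I = V_inv[n:, :n]
A_inv = B_R + 1j * B_I

# A_inv should indeed invert A.
assert np.allclose(A_inv @ A, np.eye(n))
```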
In [1, p. 35 exercise 2.17] we are asked to verify the alternative expression [1, p. 33 (2.77)] for a Hermitian function.
In [1, p. 35 exercise 2.16] we are asked to verify the formulas given for the complex gradient of a Hermitian and a linear form [1, p. 31 (2.70)]. To do so, we are instructed to decompose the matrices and vectors into their real and imaginary parts as \mathbf{x}=\mathbf{x}_{R}+j\mathbf{x}_{I}, \mathbf{A}^{\prime}=\mathbf{A}_{R}+j\mathbf{A}_{I}, \mathbf{b}^{\prime}=\mathbf{b}_{R}+j\mathbf{b}_{I}.
In [1, p. 35 exercise 2.15] we are asked to verify the formulas given for the gradient of a quadratic and linear form [1, p. 31 (2.61)]. The corresponding formulas are
\frac{\partial}{\partial \mathbf{x}}(\mathbf{x}^T\mathbf{A}^{\prime}\mathbf{x})=2\mathbf{A}^{\prime}\mathbf{x} (1)

and
\frac{\partial}{\partial \mathbf{x}}(\mathbf{b}^{\prime T}\mathbf{x})=\mathbf{b}^{\prime} (2)

where \mathbf{A}^{\prime} is a symmetric n \times n matrix with elements a_{ij}, \mathbf{b}^{\prime} is a real n \times 1 vector with elements b_{i}, and \frac{\partial}{\partial \mathbf{x}} denotes the gradient of a real function with respect to \mathbf{x}.
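The gradient formulas (1) and (2) can be spot-checked numerically with central finite differences; the symmetric matrix, vector, and evaluation point below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = M + M.T                      # symmetric, as the formula requires
b = rng.standard_normal(n)
x = rng.standard_normal(n)

def num_grad(f, x, h=1e-6):
    """Central finite-difference gradient of a scalar function f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Equation (1): grad of x^T A x is 2 A x for symmetric A.
g_quad = num_grad(lambda v: v @ A @ v, x)
assert np.allclose(g_quad, 2 * A @ x, atol=1e-4)

# Equation (2): grad of b^T x is b.
g_lin = num_grad(lambda v: b @ v, x)
assert np.allclose(g_lin, b, atol=1e-5)
```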
In [1, p. 35 exercise 2.14] we are asked to consider the solution to the set of linear equations
\mathbf{R}\mathbf{x}=\mathbf{-r} (1)

where \mathbf{R} is an n \times n Hermitian matrix given by:
\mathbf{R}=P_1\mathbf{e_1}\mathbf{e_1}^H+P_2\mathbf{e_2}\mathbf{e_2}^H (2)

and \mathbf{r} is a complex n \times 1 vector given by
\mathbf{r}=P_1 e^{j2\pi f_1}\mathbf{e_1}+P_2 e^{j2\pi f_2}\mathbf{e_2} (3)

The complex vectors are defined in [1, p. 22, (2.27)]. Furthermore, we are asked to assume that f_1=k/n and f_2=l/n, where k and l are distinct integers in the range [-n/2,n/2-1] for n even and [-(n-1)/2,(n-1)/2] for n odd. \mathbf{x} is defined to be an n \times 1 vector. We are asked to show that \mathbf{R} is a singular matrix (assuming n>2) and that there are infinitely many solutions. A further task is to find the general solution as well as the minimum norm solution of the set of linear equations. The hint provided by the exercise is to note that \mathbf{e}_1/\sqrt{n} and \mathbf{e}_2/\sqrt{n} are eigenvectors of \mathbf{R} with nonzero eigenvalues and then to assume a solution of the form
\mathbf{x}=\xi_1 \mathbf{e}_1 + \xi_2 \mathbf{e}_2 + \sum\limits_{i=3}^{n}\xi_i \mathbf{e}_i (4)

where \mathbf{e}_i^H\mathbf{e}_1=0 and \mathbf{e}_i^H\mathbf{e}_2=0 for i=3,4,\ldots,n, and to solve for \xi_1 and \xi_2.
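The structure of this exercise can be illustrated numerically. In the sketch below the vectors \mathbf{e}_i are taken to be complex sinusoids \mathbf{e}_i=[1, e^{j2\pi f_i}, \ldots, e^{j2\pi f_i(n-1)}]^T, which is my reading of [1, p. 22, (2.27)]; the values of n, k, l, P_1, P_2 are arbitrary illustrative choices. Since e_1 and e_2 are orthogonal on the DFT grid, \mathbf{R} has rank 2 and the minimum norm solution lives entirely in their span, with \xi_1=-e^{j2\pi f_1}/n and \xi_2=-e^{j2\pi f_2}/n:

```python
import numpy as np

n = 8
k, l = 1, 3                      # distinct integers -> f1, f2 on the DFT grid
f1, f2 = k / n, l / n
P1, P2 = 2.0, 5.0

def e(f, n):
    """Complex sinusoid vector [1, exp(j2*pi*f), ..., exp(j2*pi*f*(n-1))]."""
    return np.exp(2j * np.pi * f * np.arange(n))

e1, e2 = e(f1, n), e(f2, n)
R = P1 * np.outer(e1, e1.conj()) + P2 * np.outer(e2, e2.conj())
r = P1 * np.exp(2j * np.pi * f1) * e1 + P2 * np.exp(2j * np.pi * f2) * e2

# R is the sum of two rank-1 terms with orthogonal e1, e2, so rank(R) = 2 < n:
# R is singular for n > 2.
assert np.linalg.matrix_rank(R) == 2

# Minimum norm solution: only the e1, e2 components of x are constrained,
# which forces xi_1 = -exp(j2*pi*f1)/n and xi_2 = -exp(j2*pi*f2)/n.
x_min = -(np.exp(2j * np.pi * f1) / n) * e1 - (np.exp(2j * np.pi * f2) / n) * e2
assert np.allclose(R @ x_min, -r)

# The pseudoinverse returns the same minimum norm solution.
assert np.allclose(x_min, np.linalg.pinv(R) @ -r)
```

Any multiple of the orthogonal vectors \mathbf{e}_3,\ldots,\mathbf{e}_n can be added to x_min without changing R @ x, which is why the solution set is infinite.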