In [1, p. 34, exercise 2.9] we are asked to prove that the rank of the complex matrix $A = \sum_{i=1}^{p} \alpha_i \mathbf{v}_i \mathbf{v}_i^H$, where the $\mathbf{v}_i$ are linearly independent complex $n \times 1$ vectors and the $\alpha_i$'s are real and positive, equals $p$ if $p \le n$. Furthermore we are asked what the rank equals if $p > n$.
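Before the proof, here is a quick numerical sanity check of the claim (a minimal NumPy sketch, not part of the solution; the dimensions $n$, $p$ and the random choices of $\mathbf{v}_i$ and $\alpha_i$ are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_of_sum(n, p):
    """Numerical rank of A = sum_i alpha_i * v_i v_i^H for random
    complex v_i and positive real alpha_i."""
    # A random complex n x p matrix has linearly independent columns
    # with probability 1 whenever p <= n.
    V = rng.standard_normal((n, p)) + 1j * rng.standard_normal((n, p))
    alpha = rng.uniform(0.5, 2.0, size=p)  # real and positive weights
    A = sum(a * np.outer(v, v.conj()) for a, v in zip(alpha, V.T))
    return np.linalg.matrix_rank(A)

print(rank_of_sum(n=6, p=4))  # expected 4: rank = p when p <= n
print(rank_of_sum(n=4, p=6))  # expected 4: rank capped at n when p > n
```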
Solution:
Let $A = V D V^H$, where $V = [\mathbf{v}_1 \; \mathbf{v}_2 \; \cdots \; \mathbf{v}_p]$ is the $n \times p$ matrix whose columns are the $\mathbf{v}_i$ and $D = \mathrm{diag}(\alpha_1, \ldots, \alpha_p)$. If $T_A$ is the linear transformation associated with $A$, then $T_A$ can be written as the composite of the linear transformations $T_V$, $T_D$ and $T_{V^H}$ [2, proposition 6.3, p. 41], which are respectively associated with the matrices $V$, $D$ and $V^H$: $T_A = T_V \circ T_D \circ T_{V^H}$.
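Written out explicitly, this factorization is nothing more than the definitions above:

$$
A \;=\; \sum_{i=1}^{p} \alpha_i \mathbf{v}_i \mathbf{v}_i^{H}
  \;=\;
  \underbrace{\begin{bmatrix} \mathbf{v}_1 & \cdots & \mathbf{v}_p \end{bmatrix}}_{V \,\in\, \mathbb{C}^{n \times p}}
  \underbrace{\begin{bmatrix} \alpha_1 & & \\ & \ddots & \\ & & \alpha_p \end{bmatrix}}_{D}
  \underbrace{\begin{bmatrix} \mathbf{v}_1^{H} \\ \vdots \\ \mathbf{v}_p^{H} \end{bmatrix}}_{V^{H}},
  \qquad T_A = T_V \circ T_D \circ T_{V^H}.
$$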
For $p \le n$ the $p$ linearly independent vectors $\mathbf{v}_1, \ldots, \mathbf{v}_p$ span a $p$-dimensional subspace of $\mathbb{C}^n$ [2, proposition 4.3, p. 112], thus the column rank of the matrix $V$ is $p$. The matrix $V^H$ has the same rank as the matrix $V$. The proof of this statement follows from the observation that, given $p$ linearly independent vectors, the complex conjugates of those vectors are also linearly independent. Thus the columns of $\overline{V}$, and hence the rows of $V^H = \overline{V}^T$, are $p$ linearly independent vectors. Furthermore, because the column rank of a matrix equals its row rank [2, Theorem 4.4, p. 218], we have $\mathrm{rank}(V^H) = p$.
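In symbols, the chain of implications used here is:

$$
\operatorname{rank}(V) = p
  \;\Longrightarrow\; \operatorname{rank}\bigl(\overline{V}\bigr) = p
  \;\Longrightarrow\; \text{row rank of } V^{H} = \overline{V}^{\,T} \text{ is } p
  \;\Longrightarrow\; \operatorname{rank}\bigl(V^{H}\bigr) = p .
$$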
The linear transformation $T_{V^H} : \mathbb{C}^n \to \mathbb{C}^p$ is surjective, as its rank equals $p = \dim \mathbb{C}^p$. $T_D : \mathbb{C}^p \to \mathbb{C}^p$ is bijective, as the diagonal matrix $D$ associated with it is invertible [2, proposition 2.3, p. 209], its diagonal entries $\alpha_i$ all being positive. The composite $T_D \circ T_{V^H}$ thus maps $\mathbb{C}^n$ onto all of $\mathbb{C}^p$, and therefore the dimension of the image of $T_A = T_V \circ T_D \circ T_{V^H}$ (by definition the rank of the matrix $A$) is the same as the dimension of the image that $T_V$ produces from all of $\mathbb{C}^p$, i.e. from linear combinations of a basis of $\mathbb{C}^p$. Thus $\mathrm{rank}(A) = \mathrm{rank}(V) = p$. QED.
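The dimension count in this last step, spelled out:

$$
\operatorname{rank}(A)
  \;=\; \dim\!\bigl( T_V\bigl( T_D\bigl( T_{V^H}(\mathbb{C}^{n}) \bigr) \bigr) \bigr)
  \;=\; \dim\!\bigl( T_V\bigl( T_D(\mathbb{C}^{p}) \bigr) \bigr)
  \;=\; \dim\!\bigl( T_V(\mathbb{C}^{p}) \bigr)
  \;=\; \operatorname{rank}(V) \;=\; p .
$$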
For $p > n$ there can be no $p$ linearly independent $n \times 1$ vectors; the maximum number of linearly independent $n \times 1$ vectors is simply $n$. This assertion can be proven by the equality of the row and column rank of a matrix: in this case the column rank of $V$ would be $p$, while the row rank of $V$ cannot be larger than $n < p$, a contradiction. The rank of the matrix $A$ therefore cannot be larger than $n$ in this case.
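In the same notation as above, since the image of the composite is contained in the image of $T_V$, the corresponding bound reads:

$$
\operatorname{rank}(A)
  \;=\; \dim\!\bigl( T_V\bigl( T_D\bigl( T_{V^H}(\mathbb{C}^{n}) \bigr) \bigr) \bigr)
  \;\le\; \operatorname{rank}(V)
  \;\le\; \min(n, p) \;=\; n
  \qquad (p > n).
$$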
[1] Steven M. Kay, “Modern Spectral Estimation: Theory and Application”, Prentice Hall, ISBN 0-13-598582-X.
[2] Lawrence J. Corwin and Robert H. Szczarba, “Calculus in Vector Spaces”, 2nd edition, Marcel Dekker, ISBN 0-8247-9279-3.