In [1, p. 60, exercise 3.2] we are asked to prove, using the method of characteristic functions, that the sum of squares of $N$ independent and identically distributed $N(0,1)$ random variables has a $\chi^{2}_{N}$ distribution.
Solution:
The $\chi^{2}_{N}$ distribution is by definition [2, p. 302] the distribution of the sum of squares of $N$ independent and identically distributed $N(0,1)$ random variables, and even [1, p. 43] does not seem to give a different definition. Using this definition, the solution of this exercise would indeed be very short. In the solution provided in the following text we will instead assume that the $\chi^{2}_{N}$ distribution is defined by its probability density function [1, p. 43, (3.7)], and we will show by means of characteristic functions that this distribution is also obtained as that of the sum of squares of $N$ independent and identically distributed $N(0,1)$ random variables.
First let us reproduce the probability density function of a variable $y$ that is distributed according to a $\chi^{2}_{N}$ distribution from [1, p. 43, (3.7)]:

$$f_{y}(y) = \frac{1}{2^{N/2}\,\Gamma(N/2)}\, y^{N/2-1}\, e^{-y/2}, \quad y \ge 0. \qquad (1)$$
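As a quick numerical sanity check of (1) (a NumPy sketch, not part of the original derivation; the helper name `chi2_pdf` is ours), the density can be evaluated on a grid and shown to integrate to one:

```python
import numpy as np
from math import gamma

def chi2_pdf(y, N):
    """Eq. (1): chi^2 density with N degrees of freedom (y >= 0)."""
    return y ** (N / 2 - 1) * np.exp(-y / 2) / (2 ** (N / 2) * gamma(N / 2))

# Trapezoidal rule on a fine grid: the density should integrate to ~1.
y = np.linspace(1e-9, 120.0, 400_000)
f = chi2_pdf(y, 4)
area_N4 = float(np.sum((f[1:] + f[:-1]) * np.diff(y)) / 2)
print(area_N4)  # close to 1
```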
We note the close relation to the pdf of a random variable $u$ with gamma distribution [3, p. 79, (4.38)]:

$$f_{u}(u) = \frac{c^{b}\, u^{b-1}\, e^{-cu}}{\Gamma(b)}, \quad u \ge 0,$$

which is equal to the $\chi^{2}$ distribution for $c=\frac{1}{2}$ and $b=N/2$.
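The stated equivalence is easy to verify numerically (a sketch; `gamma_pdf` and `chi2_pdf` are our own helper names, not from the references): with $b=N/2$ and $c=\frac{1}{2}$ the two densities agree pointwise.

```python
import numpy as np
from math import gamma

def gamma_pdf(u, b, c):
    """Gamma density in the parametrization of [3, (4.38)] (u >= 0)."""
    return c ** b * u ** (b - 1) * np.exp(-c * u) / gamma(b)

def chi2_pdf(y, N):
    """Eq. (1): chi^2 density with N degrees of freedom (y >= 0)."""
    return y ** (N / 2 - 1) * np.exp(-y / 2) / (2 ** (N / 2) * gamma(N / 2))

# With b = N/2 and c = 1/2 the gamma density equals the chi^2_N density.
u = np.linspace(0.1, 60.0, 1000)
N = 6
max_diff = float(np.max(np.abs(gamma_pdf(u, N / 2, 0.5) - chi2_pdf(u, N))))
print(max_diff)  # zero up to floating-point rounding
```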
The characteristic function of a distribution with probability density function (pdf) $f(x)$ is given by [3]:

$$\Phi_{x}(j\omega) = E\{e^{j\omega x}\} = \int_{-\infty}^{\infty} f(x)\, e^{j\omega x}\, dx,$$

and can be derived from the moment generating function $\Phi_{x}(s)$.
The moment generating function of the $\chi^{2}$ distribution is given by [3, p. 117, (5.71)]:

$$\Phi_{x}(s) = (1-2s)^{-N/2}. \qquad (2)$$
By replacing $s$ with $j\omega$ we obtain from (2) the characteristic function of the pdf of the chi-square distribution:

$$\Phi_{x}(j\omega) = (1-2j\omega)^{-N/2}. \qquad (3)$$
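Equation (3) can be cross-checked numerically (a NumPy sketch with our own variable names): the Fourier integral of the density (1) should reproduce the closed form $(1-2j\omega)^{-N/2}$.

```python
import numpy as np
from math import gamma

def chi2_pdf(y, N):
    """Eq. (1): chi^2 density with N degrees of freedom (y >= 0)."""
    return y ** (N / 2 - 1) * np.exp(-y / 2) / (2 ** (N / 2) * gamma(N / 2))

# Compare the numerical Fourier integral of pdf (1) with closed form (3).
N = 4
y = np.linspace(1e-9, 200.0, 400_000)
f = chi2_pdf(y, N)
dy = np.diff(y)
errs = []
for w in (0.1, 0.5, 1.0):
    g = f * np.exp(1j * w * y)
    numeric = np.sum((g[1:] + g[:-1]) * dy) / 2   # trapezoidal rule
    closed = (1 - 2j * w) ** (-N / 2)
    errs.append(abs(numeric - closed))
max_err3 = float(max(errs))
print(max_err3)  # small numerical-quadrature error
```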
Now let us proceed to obtain the characteristic function of the random variable $y$, which we define to be the sum of squares of $N$ independent and identically distributed $N(0,1)$ Gaussian random variables. The variable $y$ is given by:

$$y = \mathbf{x}^{T}\mathbf{x} = \sum_{i=0}^{N-1} x_{i}^{2}, \qquad (4)$$

with $\mathbf{x} = \left[\, x_{0}, \ldots, x_{N-1} \,\right]^{T}$.
In order to compute the characteristic function of $y_{i}=x^{2}_{i}$ we will need the following theorem from [3, (5.32), p. 106]: for a random variable $y_{i} = g(x_{i})$ with pdf $f_{i}(x_{i})$,

$$\Phi_{y_{i}}(j\omega) = E\{e^{j\omega g(x_{i})}\} = \int_{-\infty}^{\infty} e^{j\omega g(x_{i})}\, f_{i}(x_{i})\, dx_{i}. \qquad (5)$$
The characteristic function $\Phi_{y_{i}}(j\omega)=E\{e^{j\omega y_{i}}\}$ can thus be computed by (5):

$$\Phi_{y_{i}}(j\omega) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{j\omega x_{i}^{2}}\, e^{-x_{i}^{2}/2}\, dx_{i} = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{(j\omega - \frac{1}{2})\, x_{i}^{2}}\, dx_{i} = (1-2j\omega)^{-1/2}. \qquad (6)$$

The integral $\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{(j\omega - \frac{1}{2})\, x_{i}^{2}}\, dx_{i}$ was evaluated using the integral tables of [4, p. 992]. We note that, because the $N$ random variables are independent, the joint probability density function $f(\mathbf{x})$ of $\mathbf{x}$ is given by the product of the individual pdfs of the normally distributed random variables $x_{i}$, with $f_{i}(x_{i})=\frac{1}{\sqrt{2\pi}}e^{-x_{i}^{2}/2}$:

$$f(\mathbf{x}) = \prod_{i=0}^{N-1} f_{i}(x_{i}) = \frac{1}{(2\pi)^{N/2}}\, e^{-\frac{1}{2}\mathbf{x}^{T}\mathbf{x}}.$$
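Result (6) can also be checked by simulation (a hedged sketch; the seed and sample size are arbitrary choices of ours): averaging $e^{j\omega x^{2}}$ over draws $x \sim N(0,1)$ should approach $(1-2j\omega)^{-1/2}$.

```python
import numpy as np

# Monte Carlo check of (6): for x ~ N(0,1), E[exp(j*w*x^2)] = (1 - 2jw)^(-1/2).
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
w = 0.7
empirical = np.mean(np.exp(1j * w * x ** 2))
closed = (1 - 2j * w) ** (-0.5)
err6 = float(abs(empirical - closed))
print(err6)  # Monte Carlo error on the order of 1e-3
```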
Thus the characteristic function of $y=\sum_{i=0}^{N-1}x_{i}^{2}$ is given by [3, p. 158]:

$$\Phi_{y}(j\omega) = \prod_{i=0}^{N-1} \Phi_{y_{i}}(j\omega), \qquad (7)$$

which by using (6) is equal to:

$$\Phi_{y}(j\omega) = (1-2j\omega)^{-N/2}. \qquad (8)$$
By comparing equation (3) with (8) we see that the same characteristic function is obtained either from the probability density function given by (1) or by computing the characteristic function of the sum of squares of $N(0,1)$ Gaussian random variables in conjunction with theorem (5). By virtue of the uniqueness theorem for Fourier transform pairs (the characteristic function of a pdf is a Fourier transform), we conclude that the probability density function of the sum of squares of $N$ independent $N(0,1)$ Gaussian random variables is equal to (1).
QED.
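As a final empirical illustration (a simulation sketch, not required by the proof; sample size and seed are our own choices), the sum of squares of $N$ i.i.d. $N(0,1)$ samples should exhibit the known $\chi^{2}_{N}$ moments, mean $N$ and variance $2N$:

```python
import numpy as np

# Simulate y = sum of squares of N i.i.d. N(0,1) variables and compare the
# empirical mean/variance with the chi^2_N values N and 2N.
rng = np.random.default_rng(1)
N, trials = 5, 1_000_000
y = np.sum(rng.standard_normal((trials, N)) ** 2, axis=1)
mean_err = float(abs(y.mean() - N))      # chi^2_N mean is N
var_err = float(abs(y.var() - 2 * N))    # chi^2_N variance is 2N
print(mean_err, var_err)  # both small for large sample sizes
```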
[1] Steven M. Kay: “Modern Spectral Estimation – Theory and Applications”, Prentice Hall, ISBN 0-13-598582-X.
[2] Fahrmeir, Künstler, Pigeot, Tutz: “Statistik”, Springer.
[3] Papoulis, Athanasios: “Probability, Random Variables, and Stochastic Processes”, McGraw-Hill.
[4] Bronstein, Semendjajew, Musiol, Mühlig: “Taschenbuch der Mathematik”, Verlag Harri Deutsch, Thun und Frankfurt am Main, ISBN 3-8171-2003-6.
2 Responses for "Steven M. Kay: “Modern Spectral Estimation – Theory and Applications”,p. 60 exercise 3.2"
Do you have the complete solution manual of “Modern Spectral Estimation”? And may I get it?
Hi Morry, I am currently working towards providing a complete solution manual for Kay’s “Modern Spectral Estimation”. I am posting the solutions of problems as soon as they are transferred into electronic format. I am also planning to provide a PDF document once all solutions are transferred into electronic format.