In [1, p. 60, exercise 3.2] we are asked to prove, using the method of characteristic functions, that the sum of squares of $N$ independent and identically distributed $N(0,1)$ random variables has a $\chi^{2}_{N}$ distribution.
Solution: The $\chi^{2}_{N}$ distribution is by definition [2, p. 302] the distribution of the sum of squares of $N$ independent and identically distributed $N(0,1)$ random variables, and [1, p. 43] does not appear to give a different definition. Under that definition the solution of this exercise would indeed be very short. In the solution below we instead take the $\chi^{2}_{N}$ distribution to be defined by its density [1, p. 43, (3.7)], and we show by means of characteristic functions that the same distribution is obtained from the sum of squares of $N$ independent and identically distributed $N(0,1)$ random variables. First let us reproduce the probability density function of a variable $y$ that is $\chi^{2}_{N}$-distributed, from [1, p. 43, (3.7)]:
 $p(y)=\left\{ \begin{array}{cl} \frac{y^{N/2-1}\, e^{-y/2}}{2^{N/2}\,\Gamma(N/2)} & \text{for } y\geq 0 \\ 0 & \text{for } y< 0. \end{array}\right.$ (1)

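As a quick numerical sanity check (not part of the original exercise), the density (1) can be coded directly; for $N=2$ it reduces to the exponential density $\frac{1}{2}e^{-y/2}$, and it integrates to one. A minimal Python sketch using only the standard library, with function name and grid parameters chosen for illustration:

```python
import math

def chi2_pdf(y, N):
    """Density (1) of a chi-square variable with N degrees of freedom."""
    if y < 0:
        return 0.0
    return y ** (N / 2 - 1) * math.exp(-y / 2) / (2 ** (N / 2) * math.gamma(N / 2))

# For N = 2, (1) reduces to the exponential density (1/2) * exp(-y/2).
for y in (0.5, 1.0, 4.0):
    assert abs(chi2_pdf(y, 2) - 0.5 * math.exp(-y / 2)) < 1e-12

# The density integrates to 1 (trapezoidal rule on [0, 100], shown for N = 4).
h = 0.01
grid = [k * h for k in range(10001)]
area = sum(h * (chi2_pdf(a, 4) + chi2_pdf(b, 4)) / 2 for a, b in zip(grid, grid[1:]))
assert abs(area - 1.0) < 1e-4
```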
We note the close relation to the pdf of a random variable $u$ with gamma distribution [3, p.79, (4.38)]:
 $f(u)=\left\{ \begin{array}{cl} \gamma u^{b-1} e^{-cu}, \; \gamma =\frac{c^{b}}{\Gamma(b)} & \text{for } u\geq 0 \\ 0 & \text{for } u< 0, \end{array}\right.$

which reduces to the $\chi^{2}$ density (1) for $c=\frac{1}{2}$ and $b=N/2$. The characteristic function of a distribution with probability density function (pdf) $f(x)$ is given by [3]:
 $\Phi_{x}(\omega) = E\{e^{j\omega x}\} = \int_{-\infty}^{\infty} f(x)e^{j\omega x}\, dx$

and can be obtained from the moment generating function $\Phi_{x}(s)$:
 $\Phi_{x}(s) = E\{e^{sx}\} = \int_{-\infty}^{\infty} f(x)e^{sx}\, dx.$

The moment generating function of the $\chi^{2}$ distribution is given by [3, p. 117, (5.71)]:
 $\Phi_{u}(s) = \frac{1}{(1-2s)^{N/2}}$ (2)

By replacing $s$ with $j\omega$ we obtain from (2) the characteristic function of the chi-square distribution:
 $\Phi_{\chi^{2}}(\omega) = \frac{1}{(1-2j\omega)^{N/2}}$ (3)

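That (3) is indeed the Fourier transform of the density (1) can also be checked numerically. The sketch below approximates $E\{e^{j\omega y}\}$ by a trapezoidal sum on a truncated interval and compares it with the closed form; the helper names and the choices $N=4$, $\omega=0.3$ are illustrative assumptions, standard library only:

```python
import cmath
import math

def chi2_pdf(y, N):
    # density (1)
    if y < 0:
        return 0.0
    return y ** (N / 2 - 1) * math.exp(-y / 2) / (2 ** (N / 2) * math.gamma(N / 2))

def cf_numeric(omega, N, upper=200.0, steps=200000):
    """Trapezoidal approximation of E{exp(j*omega*y)} for the density (1)."""
    h = upper / steps
    prev = chi2_pdf(0.0, N)  # integrand at y = 0; the phase factor equals 1 there
    total = 0j
    for k in range(1, steps + 1):
        y = k * h
        cur = chi2_pdf(y, N) * cmath.exp(1j * omega * y)
        total += h * (prev + cur) / 2
        prev = cur
    return total

N, omega = 4, 0.3
closed_form = (1 - 2j * omega) ** (-N / 2)  # equation (3)
assert abs(cf_numeric(omega, N) - closed_form) < 1e-4
```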
Now let us proceed to obtain the characteristic function of the random variable $y$, defined as the sum of squares of $N$ independent and identically distributed $N(0,1)$ Gaussian random variables. The variable $y$ is given by:
 $y(\mathbf{x})= \sum\limits_{i=0}^{N-1}x_{i}^{2} = \sum\limits_{i=0}^{N-1}y_{i}$ (4)

with $\mathbf{x} = \left[ x_{0}, \ldots, x_{N-1} \right]^{T}$. In order to compute the characteristic function of $y_{i}=x^{2}_{i}$ we will need the following theorem from [3, (5.32), p. 106]:
 $E\{g(x)\} = \int_{-\infty}^{\infty}y\,f_{y}(y)\,dy = \int_{-\infty}^{\infty}g(x)\,f_{x}(x)\,dx, \qquad y=g(x)$ (5)

The characteristic function $\Phi_{y_{i}}(\omega)=E\{e^{j \omega y_{i}}\}$ can thus be computed by (5):
 $\Phi_{y_{i}}(\omega) = E\{e^{j\omega y_{i}}\} = E\{e^{j\omega x^{2}_{i}}\} = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2} x_{i}^{2}}e^{j\omega x^{2}_{i}}\, dx_{i} = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}e^{ (j\omega -\frac{1}{2}) x_{i}^{2} }\, dx_{i} = \frac{1}{\sqrt{2\pi}} \frac{\sqrt{\pi}}{\sqrt{\frac{1}{2}-j\omega}} = \frac{1}{\sqrt{1-2j\omega}} = \frac{1}{\left(1-2j\omega\right)^{\frac{1}{2}}}$ (6)

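For readers without the tables of [4] at hand, the Gaussian integral in (6) can also be verified numerically. A sketch under the assumption that trapezoidal integration on a truncated interval $[-12, 12]$ is accurate enough (the integrand decays like $e^{-x^2/2}$, so the truncated tail is negligible); function name and step count are illustrative:

```python
import cmath
import math

def gauss_cf_of_square(omega, limit=12.0, steps=120000):
    """Trapezoidal estimate of the integral in (6):
    integral of (1/sqrt(2*pi)) * exp((j*omega - 1/2) * x**2) over [-limit, limit]."""
    def g(x):
        return cmath.exp((1j * omega - 0.5) * x * x) / math.sqrt(2 * math.pi)

    h = 2 * limit / steps
    total = 0j
    prev = g(-limit)
    for k in range(1, steps + 1):
        cur = g(-limit + k * h)
        total += h * (prev + cur) / 2
        prev = cur
    return total

omega = 0.3
closed_form = 1 / cmath.sqrt(1 - 2j * omega)  # right-hand side of (6)
assert abs(gauss_cf_of_square(omega) - closed_form) < 1e-6
```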
The integral $\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}e^{ (j\omega -\frac{1}{2}) x_{i}^{2} }\, dx_{i}$ was evaluated using the integral tables of [4, p. 992]. We note that, because the $N$ random variables are independent, the joint probability density function $f(\mathbf{x})$ of $\mathbf{x}$ is given by the product of the individual pdfs of the normally distributed random variables $x_{i}$, with $f_{i}(x_{i})=\frac{1}{\sqrt{2\pi}}e^{-x_{i}^{2}/2}$:
 $f(\mathbf{x}) = \prod_{i=0}^{N-1}f_{i}(x_{i})$

Thus the characteristic function of $y=\sum_{i=0}^{N-1}x_{i}^{2}$ is given by [3, p. 158]:
 $\Phi_{y}(\omega) = \prod_{i=0}^{N-1}\Phi_{x^{2}_{i}}(\omega)$ (7)

which by using (6) is equal to:
 $\Phi_{y}(\omega) = \prod_{i=0}^{N-1} \frac{1}{\left(1-2j\omega\right)^{\frac{1}{2}}} = \frac{1}{\left(1-2j\omega\right)^{\frac{N}{2}}}$ (8)

By comparing equation (3) with (8) we see that the same characteristic function is obtained either from the probability density function (1) or by computing the characteristic function of the sum of squares of $N$ $N(0,1)$ Gaussian random variables with the help of theorem (5). By the uniqueness theorem for Fourier transform pairs (the characteristic function of a pdf is a Fourier transform of that pdf), the probability density function of the sum of squares of $N$ $N(0,1)$ Gaussian random variables is therefore equal to (1). QED.
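Finally, the statement can be checked end to end by Monte Carlo: draw $N(0,1)$ samples, form the sum of squares, and compare the empirical characteristic function with (8). A sketch with illustrative parameter choices (seed, sample count, test frequency), standard library only:

```python
import cmath
import random

random.seed(12345)  # fixed seed for reproducibility; any seed gives the same conclusion

N = 3          # degrees of freedom (illustrative choice)
M = 100000     # number of Monte Carlo samples
omega = 0.3    # frequency at which the characteristic function is compared

# Empirical characteristic function of y = sum of squares of N iid N(0,1) draws.
acc = 0j
for _ in range(M):
    y = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(N))
    acc += cmath.exp(1j * omega * y)
empirical = acc / M

closed_form = (1 - 2j * omega) ** (-N / 2)  # equation (8)
assert abs(empirical - closed_form) < 0.03  # within Monte Carlo error, ~1/sqrt(M)
```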

[1] Steven M. Kay: “Modern Spectral Estimation – Theory and Applications”, Prentice Hall, ISBN: 0-13-598582-X.
[2] Fahrmeir, Künstler, Pigeot, and Tutz: “Statistik”, Springer.
[3] Papoulis, Athanasios: “Probability, Random Variables, and Stochastic Processes”, McGraw-Hill.
[4] Bronstein, Semendjajew, Musiol, and Mühlig: “Taschenbuch der Mathematik”, Verlag Harri Deutsch, Thun und Frankfurt am Main, ISBN: 3-8171-2003-6.