# Lysario – by Panagiotis Chatzichrisafis

"ούτω γάρ ειδέναι το σύνθετον υπολαμβάνομεν, όταν ειδώμεν εκ τίνων και πόσων εστίν …" ("for we suppose that we know a composite thing when we know from what elements, and how many, it is composed …")

## Archive for the ‘Kay: Modern Spectral Estimation, Theory and Application’ Category

In [1, p. 61 exercise 3.6] we are asked to assume that the variance is to be estimated as well as the mean, under the conditions of [1, p. 60 exercise 3.4] (see also [2, solution of exercise 3.4]). We are asked to prove that, for the vector parameter $\mathbf{\theta}=\left[\mu_x \; \sigma^2_x\right]^T$, the Fisher information matrix is
 $\mathbf{I}_{\theta}=\left[\begin{array}{cc} \frac{N}{\sigma^2_x} & 0 \\ 0 & \frac{N}{2\sigma^4_x} \end{array}\right]$
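This matrix can be sanity-checked numerically: the Fisher information equals the covariance of the score (the gradient of the log-likelihood) evaluated at the true parameters. Below is a minimal NumPy Monte Carlo sketch; the values of $\mu_x$, $\sigma^2_x$, $N$ and the trial count are arbitrary example choices, not taken from the exercise.

```python
import numpy as np

# Monte Carlo check of the Fisher information matrix: the Fisher
# information equals the covariance of the score (the gradient of the
# log-likelihood) at the true parameters.  mu_x, sigma2, N and the
# trial count below are arbitrary example choices.
rng = np.random.default_rng(0)
mu_x, sigma2, N = 1.5, 2.0, 50
trials = 100_000

x = rng.normal(mu_x, np.sqrt(sigma2), size=(trials, N))

# Score components for N i.i.d. N(mu_x, sigma2) samples:
#   d/d mu      ln p = sum_n (x[n] - mu) / sigma^2
#   d/d sigma^2 ln p = -N/(2 sigma^2) + sum_n (x[n] - mu)^2 / (2 sigma^4)
s_mu = (x - mu_x).sum(axis=1) / sigma2
s_s2 = -N / (2 * sigma2) + ((x - mu_x) ** 2).sum(axis=1) / (2 * sigma2**2)

I_hat = np.cov(np.stack([s_mu, s_s2]))   # empirical covariance of the score
I_true = np.diag([N / sigma2, N / (2 * sigma2**2)])
print(np.round(I_hat, 2))                # close to diag(25, 6.25)
print(I_true)
```

The diagonal entries approach $N/\sigma^2_x$ and $N/(2\sigma^4_x)$, and the off-diagonal entries vanish, matching the matrix above.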

Furthermore we are asked to find the CR bound and to determine if the sample mean $\hat{\mu}_x$ is efficient. If additionally the variance is estimated as
 $\hat{\sigma}^2_x=\frac{1}{N-1}\sum\limits_{n=0}^{N-1}(x[n]-\hat{\mu}_x)^2$

then we are asked to determine if this estimator is unbiased and efficient. Hint: We are instructed to use the result that
 $\frac{(N-1)\hat{\sigma}^2_x}{\sigma^2_x} \sim \chi^2_{N-1}$
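Both claims about $\hat{\sigma}^2_x$ can be illustrated numerically. From the hint, $E[\hat{\sigma}^2_x]=\sigma^2_x$ (unbiased) but $var(\hat{\sigma}^2_x)=2\sigma^4_x/(N-1)$, which is strictly larger than the CR bound $2\sigma^4_x/N$, so the estimator is not efficient. A minimal NumPy Monte Carlo sketch; $\mu_x$, $\sigma^2_x$, $N$ and the trial count are arbitrary example choices.

```python
import numpy as np

# Monte Carlo sketch: sigma2_hat = 1/(N-1) * sum (x[n] - mu_hat)^2 should be
# unbiased, but its variance 2*sigma^4/(N-1) exceeds the CR bound 2*sigma^4/N,
# so the estimator is not efficient.  mu_x, sigma2, N and the trial count are
# arbitrary example choices.
rng = np.random.default_rng(1)
mu_x, sigma2, N = 0.7, 3.0, 20
trials = 200_000

x = rng.normal(mu_x, np.sqrt(sigma2), size=(trials, N))
sigma2_hat = x.var(axis=1, ddof=1)   # ddof=1 gives the 1/(N-1) estimator

print(sigma2_hat.mean())             # ~ sigma2 = 3.0 (unbiased)
print(sigma2_hat.var())              # ~ 2*sigma2**2/(N-1), about 0.947
print(2 * sigma2**2 / N)             # CR bound = 0.9
```

The empirical variance of $\hat{\sigma}^2_x$ sits visibly above the CR bound, which is what "unbiased but not efficient" means in practice.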

In [1, p. 60 exercise 3.4] we are asked to prove that the CR bound
 $var(\hat{\mu}_x) \geq \frac{\sigma_x^2}{N}$
holds if
 $\hat{\mu}_x = \frac{1}{N}\sum\limits^{N-1}_{n=0}x[n]$ (1)
is an unbiased estimator, given that $\left\{x[0],x[1],...,x[N-1]\right\}$ are independent and identically distributed according to a $N(\mu_x,\sigma^{2}_x)$ distribution. Furthermore we are asked to also find the variance of the estimator. read the conclusion >
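Both properties of the sample mean can be illustrated with a minimal NumPy Monte Carlo sketch ($\mu_x$, $\sigma^2_x$, $N$ and the trial count are arbitrary example choices): it should come out unbiased with variance $\sigma^2_x/N$, which attains the CR bound.

```python
import numpy as np

# Monte Carlo sketch: the sample mean of N i.i.d. N(mu_x, sigma2) samples
# should be unbiased with variance sigma2/N, i.e. it attains the CR bound.
# mu_x, sigma2, N and the trial count are arbitrary example choices.
rng = np.random.default_rng(2)
mu_x, sigma2, N = -1.0, 4.0, 25
trials = 200_000

x = rng.normal(mu_x, np.sqrt(sigma2), size=(trials, N))
mu_hat = x.mean(axis=1)

print(mu_hat.mean())    # ~ mu_x = -1.0 (unbiased)
print(mu_hat.var())     # ~ sigma2/N = 0.16 (attains the CR bound)
```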
In [1, p. 60 exercise 3.2] we are asked to prove, using the method of characteristic functions, that the sum of squares of N independent and identically distributed N(0,1) random variables has a $\chi^{2}_{N}$ distribution. read the conclusion >
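The characteristic-function argument can also be checked numerically: for $Q = z_1^2 + \cdots + z_N^2$ with $z_k \sim N(0,1)$ i.i.d., the empirical characteristic function of $Q$ should match $\varphi(t) = (1-2it)^{-N/2}$, the characteristic function of a $\chi^2_N$ variable. A minimal NumPy sketch; $N$, the $t$ grid and the trial count are arbitrary example choices.

```python
import numpy as np

# Numerical sketch of the characteristic-function argument: for
# Q = z_1^2 + ... + z_N^2 with z_k i.i.d. N(0,1), the empirical
# characteristic function of Q should match (1 - 2it)^(-N/2), the
# characteristic function of a chi-square_N variable.  N, the t grid
# and the trial count are arbitrary example choices.
rng = np.random.default_rng(3)
N, trials = 5, 300_000

z = rng.standard_normal((trials, N))
Q = (z**2).sum(axis=1)

t = np.array([0.05, 0.1, 0.2])
phi_emp = np.exp(1j * t[:, None] * Q[None, :]).mean(axis=1)   # E[exp(itQ)]
phi_chi2 = (1 - 2j * t) ** (-N / 2)

print(np.abs(phi_emp - phi_chi2))   # all entries close to zero
```

Agreement of the two characteristic functions at every $t$ is exactly what the uniqueness theorem for characteristic functions exploits in the analytic proof.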