"ούτω γάρ ειδέναι το σύνθετον υπολαμβάνομεν, όταν ειδώμεν εκ τίνων και πόσων εστίν …" ("for we suppose that we know a composite when we know from what and how many elements it consists …")

19 Apr

In [1, p. 61, exercise 3.6] we are asked to assume that the variance is to be estimated as well as the mean for the conditions of [1, p. 60, exercise 3.4] (see also [2, solution of exercise 3.4]). We are asked to prove, for the vector parameter $\boldsymbol{\theta} = [\mu\ \sigma^2]^T$, that the Fisher information matrix is

$$\mathbf{I}(\boldsymbol{\theta}) = \begin{bmatrix} \dfrac{N}{\sigma^2} & 0 \\ 0 & \dfrac{N}{2\sigma^4} \end{bmatrix}.$$

Furthermore we are asked to find the Cramér-Rao bound and to determine if the sample mean is efficient. If additionally the variance is to be estimated as

$$\hat{\sigma}^2 = \frac{1}{N-1}\sum_{n=0}^{N-1}\left(x[n]-\bar{x}\right)^2,$$

then we are asked to determine if this estimator is unbiased and efficient. Hint: we are instructed to use the result that

$$\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\bar{x}\right)^2 \sim \chi^2_{N-1}.$$

**Solution:**
We have already obtained the joint pdf of the $N$ independent samples with normal distribution, and the natural logarithm of the joint pdf is given by [3, relation (3)]:

$$\ln p(\mathbf{x};\boldsymbol{\theta}) = -\frac{N}{2}\ln\left(2\pi\sigma^2\right) - \frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\mu\right)^2.$$

From this relation we can find the gradient with respect to the vector parameter $\boldsymbol{\theta} = [\mu\ \sigma^2]^T$:

$$\frac{\partial \ln p}{\partial \mu} = \frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\mu\right), \qquad \frac{\partial \ln p}{\partial \sigma^2} = -\frac{N}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{n=0}^{N-1}\left(x[n]-\mu\right)^2.$$

Thus Fisher's information matrix is given by [1, p. 47, (3.22)]:

$$\left[\mathbf{I}(\boldsymbol{\theta})\right]_{ij} = E\left[\frac{\partial \ln p}{\partial \theta_i}\,\frac{\partial \ln p}{\partial \theta_j}\right].$$

Considering that the samples are independent, the individual elements of the matrix are given by:

$$\left[\mathbf{I}(\boldsymbol{\theta})\right]_{11} = E\left[\left(\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\mu\right)\right)^2\right] = \frac{1}{\sigma^4}\sum_{n=0}^{N-1}E\left[\left(x[n]-\mu\right)^2\right] = \frac{N}{\sigma^2}, \tag{1}$$

$$\left[\mathbf{I}(\boldsymbol{\theta})\right]_{12} = \left[\mathbf{I}(\boldsymbol{\theta})\right]_{21} = E\left[\frac{\partial \ln p}{\partial \mu}\,\frac{\partial \ln p}{\partial \sigma^2}\right] = \frac{1}{2\sigma^6}\sum_{n=0}^{N-1}E\left[\left(x[n]-\mu\right)^3\right]. \tag{2}$$

We note that $\left(x-\mu\right)^3$ is an odd function about $\mu$ (that is, it changes sign under the reflection $x-\mu \to -\left(x-\mu\right)$), while the Gaussian density is an even function about $\mu$. Thus the mean of this function, an integral of an odd function against a density symmetric about $\mu$ which extends from $-\infty$ to $+\infty$, will be equal to zero. This fact can be shown by the following approach:

$$E\left[\left(x-\mu\right)^3\right] = \int_{-\infty}^{+\infty}\left(x-\mu\right)^3\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{\left(x-\mu\right)^2}{2\sigma^2}}\,dx.$$

Let $u = x-\mu$ in the last formula:

$$E\left[\left(x-\mu\right)^3\right] = \int_{-\infty}^{0}u^3\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{u^2}{2\sigma^2}}\,du + \int_{0}^{+\infty}u^3\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{u^2}{2\sigma^2}}\,du.$$

If we set $u = -v$ in the second integral, we obtain the following formulas:

$$\int_{0}^{+\infty}u^3\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{u^2}{2\sigma^2}}\,du = -\int_{-\infty}^{0}v^3\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{v^2}{2\sigma^2}}\,dv, \qquad \text{hence} \qquad E\left[\left(x-\mu\right)^3\right] = 0. \tag{3}$$

Using (2) in conjunction with (3) we obtain

$$\left[\mathbf{I}(\boldsymbol{\theta})\right]_{12} = \left[\mathbf{I}(\boldsymbol{\theta})\right]_{21} = 0. \tag{4}$$
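The symmetry argument for the vanishing third central moment can also be checked numerically; a minimal sketch (illustrative only, using the Python standard library, with arbitrary parameter values):

```python
import random

# Monte Carlo estimate of E[(x - mu)^3] for x ~ N(mu, sigma^2).
# By the odd/even symmetry argument this third central moment is zero,
# so the sample average should be close to 0 (up to Monte Carlo error).
random.seed(3)
mu, sigma, trials = 1.5, 2.0, 500_000
m3 = sum((random.gauss(mu, sigma) - mu) ** 3 for _ in range(trials)) / trials
print(m3)  # close to 0
```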

Finally it remains to obtain the value for $\left[\mathbf{I}(\boldsymbol{\theta})\right]_{22}$:

$$\left[\mathbf{I}(\boldsymbol{\theta})\right]_{22} = E\left[\left(-\frac{N}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{n=0}^{N-1}\left(x[n]-\mu\right)^2\right)^2\right] = \frac{1}{4\sigma^4}E\left[\left(\sum_{n=0}^{N-1}\left(\frac{x[n]-\mu}{\sigma}\right)^2 - N\right)^2\right]. \tag{5}$$

We note that $\frac{x[n]-\mu}{\sigma}$ in (5) is a normalized Gaussian random variable and that the sum of the squares of $N$ such variables has a chi-square distribution [4, p. 682] with $N$ degrees of freedom, with mean $N$ and variance $2N$. Thus

$$\left[\mathbf{I}(\boldsymbol{\theta})\right]_{22} = \frac{1}{4\sigma^4}\,\mathrm{var}\!\left(\chi^2_N\right) = \frac{2N}{4\sigma^4} = \frac{N}{2\sigma^4}. \tag{6}$$
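The chi-square moments used here can be verified with a short Monte Carlo sketch (illustrative only; $N$ and the number of trials are arbitrary choices):

```python
import random
import statistics

# The sum of the squares of N independent standard normal variables has a
# chi-square distribution with N degrees of freedom, whose mean is N and
# whose variance is 2N; the sample statistics should come out close to that.
random.seed(0)
N, trials = 10, 100_000
sums = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(N)) for _ in range(trials)]
m, v = statistics.fmean(sums), statistics.pvariance(sums)
print(m, v)  # roughly N = 10 and 2N = 20
```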

From (1), (4) and (6) we finally obtain that Fisher's information matrix is equal to:

$$\mathbf{I}(\boldsymbol{\theta}) = \begin{bmatrix} \dfrac{N}{\sigma^2} & 0 \\ 0 & \dfrac{N}{2\sigma^4} \end{bmatrix}. \tag{7}$$
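As a sanity check of this matrix, a Monte Carlo sketch (illustrative only; $\mu$, $\sigma^2$ and $N$ are arbitrary choices) estimates the expectation of the outer product of the gradient of the log-pdf:

```python
import random

# Estimate E[(d ln p / d theta_i)(d ln p / d theta_j)] by averaging over many
# realizations of N i.i.d. samples from N(mu, var); the result should
# approximate diag(N / var, N / (2 * var**2)) with a zero off-diagonal term.
random.seed(1)
mu, var, N, trials = 2.0, 3.0, 5, 100_000
I11 = I12 = I22 = 0.0
for _ in range(trials):
    x = [random.gauss(mu, var ** 0.5) for _ in range(N)]
    g_mu = sum(xi - mu for xi in x) / var  # d ln p / d mu
    g_s2 = -N / (2 * var) + sum((xi - mu) ** 2 for xi in x) / (2 * var ** 2)  # d ln p / d sigma^2
    I11 += g_mu * g_mu / trials
    I12 += g_mu * g_s2 / trials
    I22 += g_s2 * g_s2 / trials
print(I11, I12, I22)  # roughly N/var = 5/3, 0 and N/(2*var**2) = 5/18
```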

We have shown in [2, solution of exercise 3.4] that the mean and the variance [2, solution of exercise 3.4, relation (5)] of the estimator $\bar{x} = \frac{1}{N}\sum_{n=0}^{N-1}x[n]$ are given by:

$$E\left[\bar{x}\right] = \mu, \qquad \mathrm{var}\left(\bar{x}\right) = \frac{\sigma^2}{N}.$$

Since the Cramér-Rao bound for $\mu$ is $\left[\mathbf{I}^{-1}(\boldsymbol{\theta})\right]_{11} = \frac{\sigma^2}{N}$, the sample mean attains the bound and is therefore an efficient estimator.

The mean of the estimator $\hat{\sigma}^2$ is given by (always considering the independence of the random variables $x[n]$, and using the hint that $\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\bar{x}\right)^2 \sim \chi^2_{N-1}$, whose mean is $N-1$):

$$E\left[\hat{\sigma}^2\right] = \frac{1}{N-1}E\left[\sum_{n=0}^{N-1}\left(x[n]-\bar{x}\right)^2\right] = \frac{\sigma^2}{N-1}E\left[\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\bar{x}\right)^2\right] = \frac{\sigma^2}{N-1}\left(N-1\right).$$

From the previous relation we obtain finally:

$$E\left[\hat{\sigma}^2\right] = \sigma^2. \tag{8}$$

Considering that the variance of a $\chi^2_{N-1}$ random variable is $2\left(N-1\right)$, we can also obtain the variance of the estimator $\hat{\sigma}^2$:

$$\mathrm{var}\left(\hat{\sigma}^2\right) = \frac{\sigma^4}{\left(N-1\right)^2}\,\mathrm{var}\!\left(\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\bar{x}\right)^2\right) = \frac{\sigma^4}{\left(N-1\right)^2}\,2\left(N-1\right) = \frac{2\sigma^4}{N-1}. \tag{9}$$

Because the Cramér-Rao bound for the estimator $\hat{\sigma}^2$ is given by the inverse of the corresponding diagonal element (6) of Fisher's information matrix (the matrix being diagonal), we obtain, using (9):

$$\mathrm{var}\left(\hat{\sigma}^2\right) = \frac{2\sigma^4}{N-1} > \frac{2\sigma^4}{N} = \left[\mathbf{I}^{-1}(\boldsymbol{\theta})\right]_{22}; \tag{10}$$

thus the estimator $\hat{\sigma}^2$ is unbiased because of (8), but not efficient because it does not attain the Cramér-Rao bound given by (10). QED.
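This conclusion can be illustrated numerically; the sketch below (illustrative only; the parameter values are arbitrary) simulates the estimator and compares its sample mean and variance with the true variance and the bound $2\sigma^4/N$:

```python
import random
import statistics

# Simulate sigma_hat^2 = sum((x - xbar)^2) / (N - 1) over many realizations:
# its sample mean should be close to the true variance (unbiased), while its
# sample variance should be close to 2*var**2/(N-1), which exceeds the
# Cramer-Rao bound 2*var**2/N (hence not efficient).
random.seed(2)
mu, var, N, trials = 0.0, 2.0, 8, 100_000
ests = []
for _ in range(trials):
    x = [random.gauss(mu, var ** 0.5) for _ in range(N)]
    xbar = sum(x) / N
    ests.append(sum((xi - xbar) ** 2 for xi in x) / (N - 1))
est_mean = statistics.fmean(ests)
est_var = statistics.pvariance(ests)
cr_bound = 2 * var ** 2 / N
print(est_mean, est_var, cr_bound)  # roughly 2.0, 8/7 ≈ 1.14 and 1.0
```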

[1] Steven M. Kay: “Modern Spectral Estimation – Theory and Applications”, Prentice Hall, ISBN: 0-13-598582-X.

[2] Chatzichrisafis: “Solution of exercise 3.4 from Kay’s Modern Spectral Estimation - Theory and Applications”.

[3] Chatzichrisafis: “Solution of exercise 3.5 from Kay’s Modern Spectral Estimation - Theory and Applications”.

[4] Granino A. Korn and Theresa M. Korn: “Mathematical Handbook for Scientists and Engineers”, Dover, ISBN: 978-0-486-41147-7.


## One Response for "Steven M. Kay: “Modern Spectral Estimation – Theory and Applications”,p. 61 exercise 3.6"

[...] The mean of the maximum likelihood estimator of the variance can be easily obtained using [4, relation (8)] and noting that the maximum likelihood estimator of the variance is $\frac{N-1}{N}$ times the variance estimator [...]
