"ούτω γάρ ειδέναι το σύνθετον υπολαμβάνομεν, όταν ειδώμεν εκ τίνων και πόσων εστίν …" ("For we suppose that we know a composite thing when we know of what and of how many parts it consists …")

25 Apr

In [1, p. 61, exercise 3.7] we are asked to find the MLE of the mean $\mu$ and the variance $\sigma^2$ under the conditions of [1, p. 60, exercise 3.4] (see also [2, solution of exercise 3.4]).
We are also asked whether the MLEs of the parameters are asymptotically unbiased, efficient, and Gaussian distributed.

**Solution:**

The p.d.f. of the observations is given by

$$p(\mathbf{x}; \mu, \sigma^2) = \frac{1}{(2\pi)^{N/2}\,\det(\mathbf{C})^{1/2}} \exp\!\left(-\frac{1}{2}(\mathbf{x}-\mu\mathbf{1})^T \mathbf{C}^{-1} (\mathbf{x}-\mu\mathbf{1})\right),$$

with $\mathbf{C} = \sigma^2 \mathbf{I}$ and $\mathbf{1} = [1, 1, \ldots, 1]^T$. Thus the determinant is given by $\det(\mathbf{C}) = \sigma^{2N}$. Furthermore we can simplify

$$p(\mathbf{x}; \mu, \sigma^2) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\big(x[n]-\mu\big)^2\right).$$
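As a quick numerical sanity check (my own sketch, not part of the original solution; it assumes NumPy is available), the full multivariate Gaussian form with covariance $\mathbf{C} = \sigma^2\mathbf{I}$ and the simplified product form give the same density value:

```python
import numpy as np

# Check numerically that the full multivariate Gaussian p.d.f. with
# C = sigma^2 * I reduces to the simplified form used in the derivation.
rng = np.random.default_rng(0)
N = 5
mu, sigma2 = 1.5, 2.0
x = rng.normal(mu, np.sqrt(sigma2), size=N)

# Full form: (2*pi)^(-N/2) * det(C)^(-1/2) * exp(-0.5 (x-mu)^T C^{-1} (x-mu))
C = sigma2 * np.eye(N)
diff = x - mu
p_full = (2 * np.pi) ** (-N / 2) * np.linalg.det(C) ** (-0.5) \
    * np.exp(-0.5 * diff @ np.linalg.inv(C) @ diff)

# Simplified form: (2*pi*sigma^2)^(-N/2) * exp(-sum (x[n]-mu)^2 / (2 sigma^2))
p_simple = (2 * np.pi * sigma2) ** (-N / 2) \
    * np.exp(-np.sum(diff ** 2) / (2 * sigma2))

print(p_full, p_simple)  # the two evaluations agree
```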

For a given measurement $\mathbf{x}$ we obtain the likelihood function:

$$L(\mu, \sigma^2) = \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\big(x[n]-\mu\big)^2\right).$$

The previous expression is positive, so the estimator of the mean that maximizes the probability of the observation is found at the points where the derivative with respect to $\mu$ vanishes. Because the function is positive we can equivalently work with the natural logarithm of the likelihood function (the log-likelihood function) to obtain the maximum likelihood estimate of $\mu$. This was already derived in [3, relation (3)]:

$$\ln L(\mu, \sigma^2) = -\frac{N}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{n=0}^{N-1}\big(x[n]-\mu\big)^2.$$

From the previous relation we derive the maximum likelihood estimator of $\mu$ by setting $\partial \ln L / \partial \mu = 0$:

$$\frac{1}{\sigma^2}\sum_{n=0}^{N-1}\big(x[n]-\hat{\mu}\big) = 0 \quad\Rightarrow\quad \hat{\mu} = \frac{1}{N}\sum_{n=0}^{N-1} x[n].$$

With the same reasoning we may obtain the maximum likelihood estimator of the variance $\sigma^2$ by setting the derivative of the log-likelihood with respect to $\sigma^2$ to zero:

$$\frac{\partial \ln L}{\partial \sigma^2} = -\frac{N}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{n=0}^{N-1}\big(x[n]-\hat{\mu}\big)^2 = 0.$$

Solving for $\sigma^2$ gives the maximum likelihood estimator of the variance of the random process:

$$\hat{\sigma}^2 = \frac{1}{N}\sum_{n=0}^{N-1}\big(x[n]-\hat{\mu}\big)^2.$$
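The two estimators derived above can be computed directly; this is a minimal sketch (mine, not from the source, assuming NumPy): the sample mean and the biased $1/N$ sample variance.

```python
import numpy as np

def gaussian_mle(x):
    """Return the MLEs (mu_hat, var_hat) for i.i.d. Gaussian samples."""
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()                      # (1/N) * sum x[n]
    var_hat = np.mean((x - mu_hat) ** 2)   # (1/N) * sum (x[n] - mu_hat)^2
    return mu_hat, var_hat

mu_hat, var_hat = gaussian_mle([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(mu_hat, var_hat)  # 5.0 4.0
```

Note that `var_hat` divides by $N$, not $N-1$; the MLE is the biased variance estimator, which is the point of the bias computation below.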

The mean of the maximum likelihood estimator of the variance can easily be obtained using [4, relation (8)] and noting that the maximum likelihood estimator of the variance is $\frac{N-1}{N}$ times the (unbiased) variance estimator that was used in [1, p. 61, exercise 3.6] (see also [4, solution of exercise 3.6]):

$$E\big[\hat{\sigma}^2\big] = \frac{N-1}{N}\,\sigma^2.$$

The variance of the maximum likelihood estimator of the variance can be obtained by analogy to [4] by noting that

$$\operatorname{var}\big(\hat{\sigma}^2\big) = \left(\frac{N-1}{N}\right)^2 \operatorname{var}\big(s^2\big), \qquad \operatorname{var}\big(s^2\big) = \frac{2\sigma^4}{N-1},$$

where $s^2$ denotes the unbiased variance estimator of [4]. From the previous relation we obtain the variance of the maximum likelihood variance estimator:

$$\operatorname{var}\big(\hat{\sigma}^2\big) = \frac{2(N-1)}{N^2}\,\sigma^4.$$
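These two moments are easy to confirm by simulation; the following Monte Carlo check is my own sketch (assuming NumPy) of the claims $E[\hat{\sigma}^2] = \frac{N-1}{N}\sigma^2$ and $\operatorname{var}(\hat{\sigma}^2) = \frac{2(N-1)}{N^2}\sigma^4$:

```python
import numpy as np

# Monte Carlo check of the mean and variance of the variance MLE
# for N = 10 and sigma^2 = 3: expect mean (9/10)*3 = 2.7 and
# variance 2*9/100 * 9 = 1.62.
rng = np.random.default_rng(1)
N, sigma2, trials = 10, 3.0, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(trials, N))
mu_hat = x.mean(axis=1, keepdims=True)
var_hat = np.mean((x - mu_hat) ** 2, axis=1)  # variance MLE per trial

print(var_hat.mean())  # close to (N-1)/N * sigma^2 = 2.7
print(var_hat.var())   # close to 2(N-1)/N^2 * sigma^4 = 1.62
```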

By using the results of this exercise and of [2], [4] we can summarize the properties of the MLE estimators in the following table:

| Estimator | Mean | Variance |
|---|---|---|
| $\hat{\mu}$ | $\mu$ | $\sigma^2 / N$ |
| $\hat{\sigma}^2$ | $\frac{N-1}{N}\sigma^2$ | $\frac{2(N-1)}{N^2}\sigma^4$ |

From the previous table it is evident that for large $N$ the mean values of the estimators match the true parameter values. The same is true for the variances, which asymptotically tend to zero, the same limit the Cramér–Rao bound attains for $N \to \infty$. The MLE of the mean is Gaussian distributed even for small $N$, while by the central limit theorem [5, p. 622] the MLE of the variance ($\hat{\sigma}^2$) is also asymptotically Gaussian distributed. So the answer to the question whether the parameters are asymptotically unbiased, efficient, and Gaussian distributed is yes. QED.
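The asymptotic Gaussianity of $\hat{\sigma}^2$ can also be seen numerically. This sketch (mine, assuming NumPy) measures the skewness of the sampling distribution of the variance MLE, which shrinks toward zero (the Gaussian value) as $N$ grows:

```python
import numpy as np

rng = np.random.default_rng(2)

def skewness_of_var_mle(N, trials=20_000):
    # Draw `trials` independent records of length N and compute the
    # variance MLE for each record, then the sample skewness of the MLEs.
    x = rng.normal(0.0, 1.0, size=(trials, N))
    v = np.mean((x - x.mean(axis=1, keepdims=True)) ** 2, axis=1)
    z = (v - v.mean()) / v.std()
    return float(np.mean(z ** 3))

print(skewness_of_var_mle(5))    # markedly skewed for small N
print(skewness_of_var_mle(200))  # much closer to 0 for large N
```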

[1] Steven M. Kay: “Modern Spectral Estimation – Theory and Applications”, Prentice Hall, ISBN: 0-13-598582-X.

[2] Chatzichrisafis: “Solution of exercise 3.4 from Kay’s Modern Spectral Estimation - Theory and Applications”.

[3] Chatzichrisafis: “Solution of exercise 3.5 from Kay’s Modern Spectral Estimation - Theory and Applications”.

[4] Chatzichrisafis: “Solution of exercise 3.6 from Kay’s Modern Spectral Estimation - Theory and Applications”.

[5] Granino A. Korn and Theresa M. Korn: “Mathematical Handbook for Scientists and Engineers”, Dover, ISBN: 978-0-486-41147-7.
