Author: Panagiotis
23 Jan
The problem of predicting a complex WSS random process from a past sample by using a linear predictor is posed in
[1, p. 61 exercise 3.10].
We are asked to choose the predictor coefficient so as to minimize the MSE, i.e. the prediction error power.
We are then asked to find the optimal prediction parameter
and the minimum prediction error power by using the orthogonality principle.
read the conclusion >
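Under the common textbook setup, a one-step linear predictor chosen by the orthogonality principle has coefficient r(1)/r(0). A minimal numerical sketch of this, where the complex AR(1) test process and the one-step predictor form are my assumptions, not necessarily the exercise's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed test signal: a complex AR(1) process x[n] = 0.8 x[n-1] + w[n],
# driven by unit-power circular complex white Gaussian noise.
N = 100_000
w = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
x = np.zeros(N, dtype=complex)
for n in range(1, N):
    x[n] = 0.8 * x[n - 1] + w[n]

# Sample autocorrelation r(k) = E[x[n] x*[n-k]] at lags 0 and 1.
r0 = np.mean(np.abs(x) ** 2)
r1 = np.mean(x[1:] * np.conj(x[:-1]))

# Orthogonality: E[(x[n] - a x[n-1]) x*[n-1]] = 0  =>  a = r(1)/r(0).
a = r1 / r0
# Minimum prediction error power: r(0) - |r(1)|^2 / r(0).
pmin = r0 - np.abs(r1) ** 2 / r0

print(a, pmin)  # a close to 0.8, pmin close to the driving-noise power 1.0
```

For this AR(1) process the optimal coefficient equals the pole, and the minimum error power equals the driving-noise power, which makes the sketch easy to sanity-check.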
In
[1, p. 61 exercise 3.9] we are asked to consider a real linear model
and to find the MLEs of the slope
and the intercept,
assuming that the noise
is real white Gaussian noise with zero mean and known variance.
Furthermore, we are asked to find the corresponding MLE
when one of the parameters of the linear model is fixed to a given value.
read the conclusion >
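Because the noise is white and Gaussian, maximizing the likelihood of a linear model reduces to least squares. A small sketch under an assumed model y[n] = A + Bn + w[n] (the exercise's exact model is not reproduced here, so the parameter names and values are placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed linear model: y[n] = A + B*n + w[n], w[n] ~ N(0, 1) white Gaussian.
N = 1000
n = np.arange(N)
A_true, B_true = 2.0, 0.5
y = A_true + B_true * n + rng.standard_normal(N)

# With white Gaussian noise, the MLE of (A, B) minimizes sum (y[n] - A - B*n)^2,
# i.e. it is the least-squares solution of H theta = y with H = [1, n].
H = np.column_stack([np.ones(N), n])
A_hat, B_hat = np.linalg.lstsq(H, y, rcond=None)[0]

print(A_hat, B_hat)  # estimates near the true intercept and slope
```

The same normal-equations solution applies whatever the regressor columns are, which is why the slope/intercept case is the standard warm-up.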
Author: Panagiotis
22 Sep
In
[1, p. 61 exercise 3.8] we are asked to prove that the sample mean is a sufficient statistic for the mean under the conditions of
[1, p. 61 exercise 3.4],
assuming that the variance
is known. We are then asked to find the MLE of the mean by maximizing the likelihood function.
read the conclusion >
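A quick sketch of the second part: with the variance known, the Gaussian log-likelihood depends on the candidate mean only through a quadratic term, and maximizing it numerically recovers the sample mean. The iid Gaussian setup here is my assumption about the conditions of exercise 3.4:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed conditions: N iid samples x[n] ~ N(mu, sigma2) with sigma2 known.
N, mu_true, sigma2 = 500, 1.3, 4.0
x = rng.normal(mu_true, np.sqrt(sigma2), N)

def log_likelihood(mu):
    # Gaussian log-likelihood of the whole record for a candidate mean mu.
    return (-0.5 * N * np.log(2 * np.pi * sigma2)
            - np.sum((x - mu) ** 2) / (2 * sigma2))

# Brute-force maximization over a fine grid of candidate means.
grid = np.linspace(0.0, 3.0, 30001)
mu_hat = grid[np.argmax([log_likelihood(m) for m in grid])]

print(mu_hat, x.mean())  # the grid maximizer matches the sample mean
```

Setting the derivative of the log-likelihood to zero gives the same answer in closed form, which is the analytical route the exercise expects.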
Author: Panagiotis
25 Apr
In
[1, p. 61 exercise 3.7] we are asked to find the MLEs of the unknown parameters
for the conditions of Problem
[1, p. 60 exercise 3.4] (see also
[2, solution of exercise 3.4]).
We are asked whether the MLEs of the parameters are asymptotically unbiased, efficient, and Gaussianly distributed.
read the conclusion >
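As an illustration of the asymptotic-unbiasedness question, here is a Monte Carlo sketch of the Gaussian variance MLE, (1/N) Σ (x[n] − x̄)², whose expectation is (N−1)/N · σ²: biased for finite N but asymptotically unbiased. The iid Gaussian conditions are my assumption about exercise 3.4:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed conditions: N iid samples x[n] ~ N(mu, sigma2).
mu, sigma2, N, trials = 0.0, 2.0, 10, 200_000
x = rng.normal(mu, np.sqrt(sigma2), size=(trials, N))

# Variance MLE per trial: (1/N) * sum (x[n] - xbar)^2 (note the 1/N, not 1/(N-1)).
var_mle = np.mean((x - x.mean(axis=1, keepdims=True)) ** 2, axis=1)

# E[var_mle] = (N-1)/N * sigma2, here 0.9 * 2 = 1.8: biased at N = 10,
# with the bias sigma2/N vanishing as N grows.
print(var_mle.mean())
```

Repeating the experiment with larger N shows the average drifting toward σ², which is exactly the asymptotic-unbiasedness claim the exercise probes.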
Author: Panagiotis
19 Apr
In
[1, p. 61 exercise 3.6] we are asked to assume that the variance is to be estimated as well as the mean for the conditions of
[1, p. 60 exercise 3.4] (see also
[2, solution of exercise 3.4]). We are asked to prove, for the vector parameter
consisting of the mean and the variance, that the Fisher information matrix has the stated form.
Furthermore we are asked to find the CR bound and to determine if the sample mean
is efficient.
If additionally the variance is to be estimated by the given sample-based estimator,
we are asked to determine if this estimator is unbiased and efficient. Hint: we are instructed to use a given auxiliary moment result.
read the conclusion >
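Assuming the conditions of exercise 3.4 are N iid Gaussian samples with unknown mean μ and variance σ² (my assumption, since the exercise statement is not reproduced here), the standard Fisher information matrix for θ = [μ, σ²]ᵀ is diag(N/σ², N/(2σ⁴)), so the CR bound on any unbiased estimator of μ is σ²/N. A Monte Carlo check that the sample mean is unbiased and attains this bound:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumption: N iid samples x[n] ~ N(mu, sigma2), theta = [mu, sigma2].
# Standard result: I(theta) = diag(N / sigma2, N / (2 * sigma2**2)),
# so the CR bound for an unbiased estimator of mu is sigma2 / N.
mu, sigma2, N, trials = 1.0, 3.0, 200_000, None  # trials set below
trials = 200_000
x = rng.normal(mu, np.sqrt(sigma2), size=(trials, 25))
N = 25

xbar = x.mean(axis=1)
crlb_mu = sigma2 / N

# Unbiased, and its empirical variance matches the CR bound (efficiency).
print(xbar.mean(), xbar.var() / crlb_mu)  # near 1.0 and near 1.0
```

The variance estimators are a different story: the efficiency comparison against N/(2σ⁴) is where the hinted moment result comes in.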