# Lysario – by Panagiotis Chatzichrisafis

"ούτω γάρ ειδέναι το σύνθετον υπολαμβάνομεν, όταν ειδώμεν εκ τίνων και πόσων εστίν …" ("for we suppose that we know a composite thing when we know of what and how many elements it consists …")

## Archive for January, 2012

In [1, p. 61, exercise 3.11] we are asked to repeat problem [1, p. 61, exercise 3.10] (see also the solution [2]) for the general case in which the predictor is given as
 $\hat{x}[n]=-\sum\limits_{k=1}^{p}\alpha_{k}x[n-k].$ (1)

Furthermore we are asked to show that the optimal prediction coefficients $\{\alpha_{1},\alpha_{2},\ldots,\alpha_{p}\}$ are found by solving [1, p. 157, eq. 6.4] and that the minimum prediction error power is given by [1, p. 157, eq. 6.5].
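The normal equations for the general predictor can also be checked numerically. The following Python sketch is an illustration under stated assumptions, not code from [1]: it restricts to a real-valued process (so $r_{xx}[-m]=r_{xx}[m]$) and uses an invented autocorrelation sequence $r_{xx}[m]=0.9^{m}$.

```python
import numpy as np

def predictor_coefficients(r):
    """Given autocorrelation values r[0], ..., r[p] of a real WSS
    process, return the optimal coefficients {alpha_k} of the
    predictor xhat[n] = -sum_k alpha_k x[n-k] and the minimum
    prediction error power."""
    p = len(r) - 1
    # Toeplitz autocorrelation matrix with entries r_xx[i-k];
    # symmetry r_xx[-m] = r_xx[m] holds for a real process.
    R = np.array([[r[abs(i - k)] for k in range(p)] for i in range(p)])
    rhs = -np.array(r[1:p + 1])
    alpha = np.linalg.solve(R, rhs)
    # Minimum MSE: r_xx[0] + sum_k alpha_k r_xx[k]
    mse = r[0] + np.dot(alpha, r[1:p + 1])
    return alpha, mse

# Assumed AR(1)-like autocorrelation r[m] = 0.9**m with p = 2:
alpha, mse = predictor_coefficients([0.9 ** m for m in range(3)])
# alpha -> [-0.9, 0.0], mse -> 0.19: only one past sample is needed
# for this autocorrelation, as expected for an AR(1)-type process.
```

For larger $p$ the Toeplitz structure of the system can be exploited (e.g. `scipy.linalg.solve_toeplitz`), though a direct solve suffices for this sketch.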
The desire to predict a complex WSS random process from the sample $x[n-1]$ by means of the linear predictor
 $\hat{x}[n]=-\alpha_{1}x[n-1]$ (1)

is expressed in [1, p. 61, exercise 3.10]. We are asked to choose $\alpha_{1}$ so as to minimize the MSE, or prediction error power,
 $MSE = \mathcal{E}\left\{\left| x[n] -\hat{x}[n] \right|^{2} \right\}.$ (2)

We are asked to find the optimal prediction parameter $\alpha_{1}$ and the minimum prediction error power by using the orthogonality principle.
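For this $p=1$ case the orthogonality principle states that the optimal prediction error must be orthogonal to the observed sample $x[n-1]$. Writing $r_{xx}[k]=\mathcal{E}\left\{x[n]x^{*}[n-k]\right\}$ (a notational assumption), a sketch of the resulting solution is
 $\mathcal{E}\left\{\left(x[n]+\alpha_{1}x[n-1]\right)x^{*}[n-1]\right\}=r_{xx}[1]+\alpha_{1}r_{xx}[0]=0 \;\Rightarrow\; \alpha_{1}=-\frac{r_{xx}[1]}{r_{xx}[0]},$

with minimum prediction error power
 $MSE_{min}=\mathcal{E}\left\{\left(x[n]+\alpha_{1}x[n-1]\right)x^{*}[n]\right\}=r_{xx}[0]-\frac{\left|r_{xx}[1]\right|^{2}}{r_{xx}[0]}.$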
We are asked to consider the linear model
 $x[n]=\alpha + \beta n + z[n], \quad n=0,1,\ldots,N-1$ (1)
and find the MLE of the slope $\beta$ and the intercept $\alpha$, assuming that $z[n]$ is real white Gaussian noise with zero mean and variance $\sigma_{z}^{2}$. Furthermore, we are asked to find the MLE of $\alpha$ when $\beta=0$ in the linear model.
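Under the white Gaussian noise assumption, maximizing the likelihood is equivalent to minimizing the sum of squared residuals, so the MLE coincides with least squares. A minimal Python sketch (the parameter values and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
n = np.arange(N)
alpha_true, beta_true = 2.0, 0.5
# Synthetic data from the assumed model x[n] = alpha + beta*n + z[n]:
x = alpha_true + beta_true * n + rng.normal(0.0, 0.1, N)

# MLE = least squares on the design matrix H = [1  n]:
H = np.column_stack([np.ones(N), n])
alpha_hat, beta_hat = np.linalg.lstsq(H, x, rcond=None)[0]

# With beta = 0 the model reduces to x[n] = alpha + z[n], and the
# MLE of alpha is simply the sample mean of the data:
alpha_hat0 = x.mean()
```

With $\beta=0$ the design matrix collapses to a single column of ones, which is why the MLE of $\alpha$ reduces to the sample mean.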