# Maximum Likelihood Estimation

In reality, a communication channel can be quite complex, and a model becomes necessary to simplify calculations at the decoder side. The model should closely approximate the actual communication channel. A myriad of standard statistical models can be employed for this task: Gaussian, binomial, exponential, geometric, Poisson, etc. A standard communication model is chosen based on empirical data.

Each of the models mentioned above has unique parameters that characterize it. These parameters must be determined so that the chosen model closely matches the communication channel at hand.

Suppose a binomial model is chosen (based on observed data) for the error events over a particular channel; it is then essential to determine the probability parameter (p) of the binomial model.

If a Gaussian model (the normal distribution) is chosen for a particular channel, then the mean $latex \mu$ and the variance $latex \sigma^{2}$ must be estimated, so that they can be applied when computing the conditional probability p(y received | x sent).
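As a side note, for a Gaussian model the ML estimates from $latex N$ observed samples $latex x_1, \dots, x_N$ reduce to the familiar sample statistics (a standard result, quoted here without derivation):

$latex \hat{\mu} = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad \hat{\sigma}^{2} = \frac{1}{N}\sum_{i=1}^{N} \left(x_i - \hat{\mu}\right)^{2}$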

Similarly, estimating $latex \lambda$ is necessary for a Poisson distribution model.

Maximum likelihood estimation is a method to determine these unknown parameters associated with the corresponding chosen models of the communication channel.
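In general terms (a sketch of the standard formulation): given observations $latex x_1, \dots, x_N$ assumed to follow a model with parameter $latex \theta$, the ML estimate picks the parameter value that makes the observed data most probable:

$latex \hat{\theta}_{ML} = \underset{\theta}{\arg\max} \; p(x_1, \dots, x_N; \theta)$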

### An example of MLE:

The following data are based on observations over a Binary Symmetric Channel (BSC) (p=0.5), modeled with a binomial distribution.

90 codewords (each 10 bits wide) are transmitted over the BSC, of which 410 bits in total are received erroneously. Let's estimate the actual probability of success for this model.

The BSC is assumed to have p=0.5, meaning that the probability of error is the same for '0' and '1', and the probability of success is also 0.5.

### Matlab code:

```matlab
% Maximum likelihood estimation
d = 410;          % number of bits in error
n = 90*10;        % total number of bits sent
k = n - d;        % number of bits NOT in error
q = 0:0.002:1;    % range of success probabilities to test
y = binopdf(k, n, q);  % likelihood, assuming a binomial distribution
plot(q, y);
xlabel('Probability q');
ylabel('Likelihood');
title('Maximum Likelihood Estimation');
[maxY, maxIndex] = max(y);  % find the maximum and its index
fprintf('MLE of q is %f\n', q(maxIndex));  % probability corresponding to max(y)
```
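For cross-checking, the same grid search can be sketched in plain Python using only the standard library (the variable names mirror the Matlab script above and are otherwise my own choice):

```python
from math import comb

d = 410          # number of bits in error
n = 90 * 10      # total number of bits sent
k = n - d        # number of bits NOT in error

# Grid of candidate success probabilities, step 0.002 as in the Matlab script
qs = [i * 0.002 for i in range(501)]

def likelihood(q):
    # Binomial pmf: P(K = k) = C(n, k) * q^k * (1 - q)^(n - k)
    return comb(n, k) * q**k * (1 - q)**(n - k)

# The ML estimate is the grid point with the highest likelihood
q_mle = max(qs, key=likelihood)
print(f"MLE of q is {q_mle:.3f}")
```

The maximum lands on the grid point nearest the closed-form binomial estimate $latex \hat{q} = k/n$.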

### Output :

MLE of q is 0.544000

Therefore, the probability of success for this BSC is q = 0.544, whereas the probability of error is p = 1 - q = 0.456. (The closed-form ML estimate for a binomial model is $latex \hat{q} = k/n = 490/900 \approx 0.5444$; the 0.002-wide search grid rounds this to 0.544.)

### See also:

[1] An Introduction to Estimation Theory

[2] Maximum Likelihood Decoding

[3] Probability and Random Process
