Theoretical derivation of Maximum Likelihood Estimator for Poisson PDF:


Suppose X=(x_1,x_2,\ldots, x_N) are the samples taken from a random distribution whose PDF is parameterized by the parameter \theta . If the PDF of the underlying distribution satisfies some regularity conditions (the log of the PDF is differentiable with respect to \theta ), then the likelihood function is given by

L(\theta;X) = f_N(X;\theta)

Here f_N(X;\theta) is the joint PDF of the N samples. For independent and identically distributed (i.i.d.) samples it factors into the product of the individual PDFs,

f_N(X;\theta) = \prod_{i=1}^{N} f(x_i;\theta)

Hereafter we will denote L(\theta;X) as L(\theta) .

The maximum likelihood estimate of the unknown parameter \theta is found by selecting the value, say \theta^{*} , for which the likelihood function attains its maximum. We usually work with the log of the likelihood function, which turns the product into a sum and simplifies the differentiation. Restated, the maximum likelihood estimate of \theta is the value \theta^{*} for which the log likelihood function attains its maximum.
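
Stated compactly,

\theta^{*} = \arg \max_{\theta} L(\theta) = \arg \max_{\theta} \ln L(\theta)

The two maximizers coincide because the logarithm is a monotonically increasing function.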

In calculus, the maximum of a differentiable function f(x) is located by taking its first derivative and equating it to zero. Similarly, the maximum likelihood estimate \theta^{*} is found by partially differentiating the likelihood function L(\theta) or the log likelihood function \ln L(\theta) with respect to \theta and equating the result to zero.
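
That is, \theta^{*} satisfies

\frac{\partial \ln L(\theta)}{\partial \theta} \bigg|_{\theta = \theta^{*}} = 0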

The first partial derivative of the log likelihood function with respect to \theta is called the score. The variance of the score is known as the Fisher information; under the regularity conditions above, it equals the negative expected value of the partial derivative of the score with respect to \theta .
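
In symbols, denoting the score by V(\theta) ,

V(\theta) = \frac{\partial \ln L(\theta)}{\partial \theta} , \qquad I(\theta) = \mathrm{Var}\left[ V(\theta) \right] = -E\left[ \frac{\partial V(\theta)}{\partial \theta} \right]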

Calculating MLE for Poisson distribution:

Let X=(x_1,x_2,\ldots, x_N) be the samples taken from a Poisson distribution given by

f(x_i;\theta) = \frac{\theta^{x_i} e^{-\theta}}{x_i !} , \quad x_i = 0,1,2,\ldots

Calculating the Likelihood

Since the samples are i.i.d., the likelihood is the product of the individual PDFs,

L(\theta) = \prod_{i=1}^{N} \frac{\theta^{x_i} e^{-\theta}}{x_i !} = \frac{\theta^{\sum_{i=1}^{N} x_i} \, e^{-N\theta}}{\prod_{i=1}^{N} x_i !}

The log likelihood is given by,

\ln L(\theta) = \left( \sum_{i=1}^{N} x_i \right) \ln \theta - N\theta - \sum_{i=1}^{N} \ln(x_i !)

Differentiating with respect to \theta and equating to zero to find the maximum (in other words, equating the score to zero),

\frac{\partial \ln L(\theta)}{\partial \theta} = \frac{1}{\theta} \sum_{i=1}^{N} x_i - N = 0 \quad \Rightarrow \quad \theta^{*} = \frac{1}{N} \sum_{i=1}^{N} x_i

Thus the mean of the samples gives the MLE of the parameter \theta .
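
As a quick numerical check, the closed-form result can be verified in Python. The sketch below (assuming numpy and scipy are available, with an arbitrarily chosen true parameter and sample size) maximizes the Poisson log likelihood numerically and compares the maximizer against the sample mean.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

# Draw N samples from a Poisson distribution with an arbitrarily chosen parameter
rng = np.random.default_rng(seed=1)
true_theta = 4.2
samples = rng.poisson(lam=true_theta, size=5000)

# Negative log likelihood of the observed samples as a function of theta
def neg_log_likelihood(theta):
    return -np.sum(poisson.logpmf(samples, mu=theta))

# Numerically locate the maximizer of the log likelihood
res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50.0), method='bounded')

print("Closed-form MLE (sample mean):", samples.mean())
print("Numerical maximizer of log likelihood:", res.x)
# The two values should agree closely, confirming theta* = (1/N) * sum(x_i)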


For the derivation of the MLE for other PDFs, see the following links:
Theoretical derivation of Maximum Likelihood Estimator for Exponential PDF
Theoretical derivation of Maximum Likelihood Estimator for Gaussian PDF

