Applying Cramer Rao Lower Bound (CRLB) to find a Minimum Variance Unbiased Estimator (MVUE)

It was mentioned in one of the earlier articles that the CRLB may provide a way to find an MVUE (Minimum Variance Unbiased Estimator).

Theorem:

There exists an unbiased estimator that attains CRLB if and only if,

$$\frac{\partial \; ln \; L(\mathbf{x};\theta)}{\partial \theta} = I(\theta)\left(g(\mathbf{x}) - \theta \right)$$

Here \( ln \; L(\mathbf{x};\theta) \) is the log likelihood function of \(\mathbf{x}\) parameterized by \(\theta\) – the parameter to be estimated, \( I(\theta)\) is the Fisher Information and \( g(\mathbf{x})\) is some function of the observed data.

Then, the estimator that attains the CRLB is given by

$$\hat{\theta} = g(\mathbf{x}), \;\;\; var(\hat{\theta}) = \frac{1}{I(\theta)}$$

Steps to find an MVUE using the CRLB:

If the partial derivative of the log likelihood function can be written in the form given above, in terms of the Fisher Information \( I(\theta)\) and some function \( g(\mathbf{x})\), then \(g(\mathbf{x})\) is a Minimum Variance Unbiased Estimator.
1) Given a signal model for \( \mathbf{x} \), compute \(\frac{\partial\;ln\;L(\mathbf{x};\theta) }{\partial \theta }\)
2) Check whether the result can be put in the form given in the theorem above
3) If so, \(g(\mathbf{x})\) gives an MVUE
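The three steps can be carried out symbolically. A minimal sketch using sympy, assuming three i.i.d. Gaussian observations with unknown mean \(A\) and known variance \(\sigma^2\) (the DC-in-noise model treated next):

```python
import sympy as sp

# Unknown parameter A (real) and known noise standard deviation sigma (positive)
A = sp.Symbol('A', real=True)
sigma = sp.Symbol('sigma', positive=True)
x = sp.symbols('x0 x1 x2', real=True)
N = len(x)

# Step 1: log likelihood of i.i.d. Gaussian samples and its derivative w.r.t. A
pdf = lambda xi: sp.exp(-(xi - A)**2 / (2 * sigma**2)) / sp.sqrt(2 * sp.pi * sigma**2)
logL = sum(sp.log(pdf(xi)) for xi in x)
score = sp.diff(logL, A)

# Step 2: check that the score factors as I(A) * (g(x) - A)
I_A = N / sigma**2            # candidate Fisher Information
g_x = sum(x) / N              # candidate estimator: the sample mean
assert sp.simplify(score - I_A * (g_x - A)) == 0

# Step 3: the factorization holds, so the sample mean g(x) is an MVUE for A
print(sp.simplify(score))
```

The assertion in step 2 is the whole check: if the difference does not simplify to zero for a given model, no estimator attains the CRLB in this way.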

Let’s look at how the CRLB can be used to find an MVUE for a signal that has a DC component embedded in AWGN.

Finding an MVUE to estimate a DC component embedded in noise:

Consider the signal model where a DC component \(A\) is embedded in AWGN with zero mean and variance \(\sigma^2 \).
Our goal is to find an MVUE that can estimate the DC component from the observed samples \(x[n]\).

$$x[n] = A + w[n], \;\;\; n=0,1,2,\cdots,N-1 $$
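A quick way to simulate this signal model (a sketch; the values A = 5, σ = 2 and N = 1000 are arbitrary choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(42)

A = 5.0        # true DC component (arbitrary demo value)
sigma = 2.0    # known noise standard deviation
N = 1000       # number of observed samples

w = rng.normal(loc=0.0, scale=sigma, size=N)  # AWGN with zero mean, variance sigma^2
x = A + w                                     # observed samples x[n] = A + w[n]

print(x[:5])
```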

We calculate the CRLB and see if it can help us find an MVUE.

From the previous derivation, the partial derivative of the log likelihood function is

$$\frac{\partial \; ln \; L(\mathbf{x};A)}{\partial A} = \frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n] - A\right)$$

Rearranging, the above equation can be put in the form required by the theorem, from which \( I(A)\) and \(g(\mathbf{x})\) can be readily identified:

$$\frac{\partial \; ln \; L(\mathbf{x};A)}{\partial A} = \frac{N}{\sigma^2}\left(\frac{1}{N}\sum_{n=0}^{N-1}x[n] - A\right) = I(A)\left(g(\mathbf{x}) - A\right)$$

Thus, the Fisher Information \(I(A)\) and the MVUE \(g(\mathbf{x})\) are given by

$$I(A) = \frac{N}{\sigma^2}, \;\;\; \hat{A} = g(\mathbf{x}) = \frac{1}{N}\sum_{n=0}^{N-1}x[n]$$

Thus, for a signal model with a DC component in AWGN, the sample mean of the observed samples \(x[n]\) is a Minimum Variance Unbiased Estimator of the DC component, and its variance attains the CRLB \( \sigma^2/N \).
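This can be checked numerically: across many realizations, the variance of the sample mean should approach the CRLB \( \sigma^2/N \). A minimal Monte Carlo sketch (A, σ, N and the trial count are arbitrary demo values):

```python
import numpy as np

rng = np.random.default_rng(0)

A, sigma, N = 5.0, 2.0, 100   # arbitrary demo values
trials = 20000                # number of independent realizations

# Each row is one realization of x[n] = A + w[n]; estimate A by the sample mean
x = A + rng.normal(0.0, sigma, size=(trials, N))
A_hat = x.mean(axis=1)

crlb = sigma**2 / N           # CRLB for this model
print(f"mean of estimates : {A_hat.mean():.4f} (true A = {A})")
print(f"var of estimates  : {A_hat.var():.5f}")
print(f"CRLB sigma^2/N    : {crlb:.5f}")
```

The estimator mean lands on the true A (unbiasedness) and its empirical variance matches the CRLB, which is exactly what the theorem predicts for an efficient estimator.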
