Introduction to Cramér-Rao Lower Bound (CRLB)
The criteria for the existence of a Minimum Variance Unbiased Estimator (MVUE) were discussed in a previous article. An MVUE requires estimates that are unbiased and that have the minimum variance among all unbiased estimators. These two conditions are expressed by the following equations
$$ E\left\{\hat{f}_0 \right\} = f_0 $$
$$ \sigma^{2}_{\hat{f}_0}=E\left\{(\hat{f}_0 - E[\hat{f}_0])^2 \right\} \;\; \text{should be minimum} $$
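As a quick sanity check of these two criteria, the following sketch verifies them by Monte Carlo simulation. It assumes NumPy and a classic textbook setup not taken from this article: estimating a DC level \(A\) in white Gaussian noise, with the sample mean as the (assumed) estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

A_true = 1.5        # true parameter (assumed example: DC level in Gaussian noise)
sigma = 0.5         # noise standard deviation (assumed known)
N = 100             # samples per trial
trials = 10_000     # Monte Carlo trials

# Each trial: observe x[n] = A + w[n], n = 0..N-1, and estimate A by the sample mean
estimates = (A_true + sigma * rng.standard_normal((trials, N))).mean(axis=1)

bias = estimates.mean() - A_true   # criterion 1: should be ~0 (unbiased)
variance = estimates.var()         # criterion 2: is this the minimum possible?

print(f"empirical bias     = {bias:+.5f}")
print(f"empirical variance = {variance:.5f}")
```

The simulation confirms unbiasedness and reports a variance, but on its own it cannot tell us whether that variance is the minimum achievable — which is exactly the gap the CRLB fills.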
For an MVUE, the first criterion (unbiasedness) is easy to verify using the first equation, but verifying the second criterion (minimum variance) is tricky. We can calculate the variance of a given estimator, but how can we be sure that it is "the minimum"? There may exist numerous other unbiased estimators (which we may not know of) that achieve a lower variance. In other words, how do we make sure that our estimator is the best possible one? The Cramér-Rao Lower Bound (CRLB) comes to our rescue.
Cramér-Rao Lower Bound (CRLB):
Harald Cramér and C. R. Rao derived a lower bound on the variance of unbiased estimators of deterministic parameters. This bound is called the Cramér-Rao Lower Bound (CRLB).
If \(\hat{\theta} \) is an unbiased estimate of a deterministic parameter \(\theta \), then the relationship between the variance of the estimate \( \sigma^{2}_{\hat{\theta}} \) and the CRLB can be expressed as
$$ \sigma^{2}_{\hat{\theta}} \left( \theta \right ) \geq CRLB \left( \theta \right) \Rightarrow \sigma_{\hat{\theta}} \left( \theta \right ) \geq \sqrt{CRLB \left( \theta \right)} $$
The CRLB tells us the lowest variance that any unbiased estimator can possibly achieve.
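The bound itself comes from the Fisher information: the CRLB is the reciprocal of the variance of the score (the derivative of the log-likelihood with respect to the parameter). The sketch below, again using the assumed DC-level-in-Gaussian-noise example (not from this article), estimates the Fisher information by Monte Carlo and compares the resulting CRLB with the known closed form \( \sigma^2/N \):

```python
import numpy as np

rng = np.random.default_rng(1)

A, sigma, N, trials = 1.0, 1.0, 50, 20_000

# Data: x[n] = A + w[n], with w ~ N(0, sigma^2)
x = A + sigma * rng.standard_normal((trials, N))

# Score = d/dA of the log-likelihood = sum_n (x[n] - A) / sigma^2
score = (x - A).sum(axis=1) / sigma**2

# Fisher information I(A) = variance of the score; CRLB = 1 / I(A)
fisher_mc = score.var()
crlb_mc = 1.0 / fisher_mc
crlb_exact = sigma**2 / N   # closed form for this model: I(A) = N / sigma^2

print(f"CRLB (Monte Carlo) = {crlb_mc:.5f}")
print(f"CRLB (exact)       = {crlb_exact:.5f}")
```

The two numbers agree closely, illustrating that the bound can be obtained either analytically or numerically from the log-likelihood.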
Applications of CRLB include :
1) Making judgments on proposed estimators. An estimator whose variance is not close to the CRLB is considered inferior.
2) Feasibility studies: determining whether a particular estimator/system can meet given specifications. The CRLB also rules out impossible designs – no estimator can beat the CRLB.
3) Benchmark for comparing unbiased estimators
4) It may sometimes lead directly to the MVUE: if an unbiased estimator achieves the CRLB, it is the MVUE (such an estimator is called efficient).
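Points 1) and 4) above can be illustrated by computing the efficiency (CRLB divided by the estimator's variance) of two unbiased estimators. This sketch again assumes the Gaussian location model used earlier (an assumption, not from this article), and compares the sample mean against the sample median — both unbiased, but only one efficient:

```python
import numpy as np

rng = np.random.default_rng(2)

A, sigma, N, trials = 0.0, 1.0, 101, 20_000
crlb = sigma**2 / N   # CRLB for a Gaussian location parameter with known sigma

x = A + sigma * rng.standard_normal((trials, N))

var_mean = x.mean(axis=1).var()          # sample mean
var_median = np.median(x, axis=1).var()  # sample median (unbiased by symmetry)

print(f"efficiency of mean   = {crlb / var_mean:.3f}")    # ~1: attains CRLB -> MVUE
print(f"efficiency of median = {crlb / var_median:.3f}")  # ~0.64: inferior estimator
```

The sample mean's efficiency is close to 1, so it attains the CRLB and is the MVUE for this model; the median's efficiency of roughly 2/π marks it as inferior, exactly the kind of judgment described in application 1).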
Feasibility Studies :
Derivations of the CRLB for particular estimation scenarios or proposed algorithms are often found in research texts. The derived theoretical CRLB for a system/algorithm is compared with the actual variance of the implemented system, and conclusions are drawn. For example, in the paper titled “A Novel Frequency Synchronization Algorithm and its Cramer Rao Bound in Practical UWB Environment for MBOFDM Systems”^{[1]}, a frequency offset estimation algorithm was proposed for multiband orthogonal frequency division multiplexing (MB-OFDM) systems. The performance of the algorithm was studied by BER analysis (Eb/N0 vs. BER curves). Additionally, the estimator's performance was validated by comparing the simulated estimator variance with the derived theoretical CRLB for four UWB channel models.
Reference:
[1] “A Novel Frequency Synchronization Algorithm and its Cramer Rao Bound in Practical UWB Environment for MBOFDM Systems”
See also:
[1] An Introduction to Estimation Theory
[2] Bias of an Estimator
[3] Minimum Variance Unbiased Estimators (MVUE)
[4] Maximum Likelihood Estimation
[5] Maximum Likelihood Decoding
[6] Probability and Random Process
[7] Likelihood Function and Maximum Likelihood Estimation (MLE)
[8] Score, Fisher Information and Estimator Sensitivity


Mathuranathan (http://www.gaussianwaves.com/)
