Normalized CRLB – an alternate form of CRLB

Key focus: The normalized CRLB (Cramér-Rao Lower Bound) is an alternate form of the CRLB. Let’s explore how the normalized CRLB is related to estimator sensitivity.

The variance of an estimate is always greater than or equal to the Cramér-Rao Lower Bound of the estimate. The CRLB, in turn, is given by the inverse of the Fisher Information. The following equations summarize this concisely.

$$ var(\hat{\theta}) \geq CRLB $$

$$ CRLB = \frac{1}{I(\theta)} $$
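
For instance, in the standard problem of estimating a DC level $A$ from $N$ samples $x[n] = A + w[n]$, where $w[n]$ is white Gaussian noise of variance $\sigma^2$ (a textbook example used here purely for illustration), the Fisher Information works out to $I(A) = N/\sigma^2$, so

$$ var(\hat{A}) \geq CRLB = \frac{1}{I(A)} = \frac{\sigma^2}{N} $$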

The Fisher Information can be re-written as
$$ I(\theta) = E\left[ \left( \frac{\partial \ln p(\mathbf{x},\theta)}{\partial \theta} \right)^2 \right] $$

Thus, the bound on the variance of the estimate can be written as

$$ var(\hat{\theta}) \geq \frac{1}{E\left[ \left( \frac{\partial \ln p(\mathbf{x},\theta)}{\partial \theta} \right)^2 \right]} $$
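
The following is a minimal Monte Carlo sketch of this bound, assuming the same DC-level-in-white-Gaussian-noise example (the model, parameter values, and variable names are illustrative choices, not from the article). For that model the score is $\partial \ln p(\mathbf{x},A)/\partial A = \frac{1}{\sigma^2}\sum_{n}(x[n]-A)$, so the average of its square should approach $N/\sigma^2$, and the variance of the sample-mean estimator should sit at its inverse.

```python
# Monte Carlo sketch (illustrative): estimating a DC level A from N samples
# x[n] = A + w[n], with w[n] ~ N(0, sigma^2). The sample mean attains the CRLB
# for this model, so its variance should match 1 / E[(d ln p / dA)^2].
import numpy as np

rng = np.random.default_rng(0)
A, sigma, N, trials = 2.0, 1.5, 50, 100_000

# Each row of x is one independent realization of the N observations
x = A + sigma * rng.standard_normal((trials, N))

A_hat = x.mean(axis=1)                    # sample-mean estimator of A
score = (x - A).sum(axis=1) / sigma**2    # d/dA ln p(x,A) evaluated at the true A

var_A_hat = A_hat.var()                   # Monte Carlo variance of the estimator
fisher_info = np.mean(score**2)           # E[(score)^2], approximately N / sigma^2

print(f"var(A_hat)            : {var_A_hat:.6f}")
print(f"1 / E[score^2] (CRLB) : {1.0 / fisher_info:.6f}")
print(f"sigma^2 / N           : {sigma**2 / N:.6f}")
```

With these values, all three printed numbers should agree to within Monte Carlo error (about 0.045).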

Consider an incremental change in $\theta$, that is, $\theta \rightarrow \theta + \Delta \theta$. This causes the PDF to change from $p(\mathbf{x},\theta)$ to $p(\mathbf{x},\theta + \Delta \theta)$. We wish to answer the following question: how sensitive is $p(\mathbf{x},\theta)$ to that change? The sensitivity, denoted by $\tilde{S}_{\theta}^{p}(\mathbf{x})$, is given by the ratio of the fractional change in $p(\mathbf{x},\theta)$ to the fractional change in $\theta$.

$$ \tilde{S}_{\theta}^{p}(\mathbf{x}) = \frac{\Delta p(\mathbf{x},\theta) / p(\mathbf{x},\theta)}{\Delta \theta / \theta} $$

Letting $\Delta \theta \rightarrow 0$,

$$ \tilde{S}_{\theta}^{p}(\mathbf{x}) = \frac{\theta}{p(\mathbf{x},\theta)} \frac{\partial p(\mathbf{x},\theta)}{\partial \theta} $$

From Calculus,

$$ \frac{\partial \ln p(\mathbf{x},\theta)}{\partial \theta} = \frac{1}{p(\mathbf{x},\theta)} \frac{\partial p(\mathbf{x},\theta)}{\partial \theta} $$

Thus the sensitivity is given by,
$$ \tilde{S}_{\theta}^{p}(\mathbf{x}) = \theta \, \frac{\partial \ln p(\mathbf{x},\theta)}{\partial \theta} $$

The variance of the estimate can now be put in the following form.
$$ \frac{var(\hat{\theta})}{\theta^2} \geq \frac{1}{E\left[ \left( \tilde{S}_{\theta}^{p}(\mathbf{x}) \right)^2 \right]} $$

The expression above is the normalized version of the CRLB. It can be interpreted as follows: the normalized CRLB is equal to the inverse of the mean square sensitivity.
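
Continuing the illustrative DC-level example introduced earlier, the sensitivity and the normalized bound become

$$ \tilde{S}_{A}^{p}(\mathbf{x}) = A \, \frac{\partial \ln p(\mathbf{x},A)}{\partial A} = \frac{A}{\sigma^2} \sum_{n=0}^{N-1} \left( x[n] - A \right), \qquad E\left[ \left( \tilde{S}_{A}^{p}(\mathbf{x}) \right)^2 \right] = \frac{A^2 N}{\sigma^2} $$

$$ \frac{var(\hat{A})}{A^2} \geq \frac{\sigma^2}{N A^2} = \frac{1}{E\left[ \left( \tilde{S}_{A}^{p}(\mathbf{x}) \right)^2 \right]} $$

so the normalized bound $\sigma^2/(N A^2)$ is exactly the reciprocal of the mean square sensitivity, as stated above.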


