Estimation Theory

Cramer Rao Lower Bound (CRLB) for Vector Parameter Estimation

CRLB for scalar parameter estimation was discussed in previous posts. The same concept is extended to vector parameter estimation. Consider a set of deterministic parameters \(\boldsymbol{\theta} = \left[ \theta_1, \theta_2, \ldots, \theta_p \right]^{T} \) that we wish to estimate. The estimate is denoted in vector form as \(\hat{\boldsymbol{\theta}} = \left[ \hat{\theta}_1, \hat{\theta}_2, \ldots \right.\)...
Continue Reading »
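As a small numerical sketch of the vector case (my own example, not taken from the post): for \(N\) i.i.d. Gaussian samples with \(\boldsymbol{\theta} = [\mu, \sigma^2]^T\), the Fisher information matrix is diagonal, and the per-parameter CRLBs are the diagonal elements of its inverse.

```python
import numpy as np

# Illustrative vector CRLB for theta = [mu, sigma^2] of N i.i.d.
# Gaussian samples. The Fisher information matrix is
#   I(theta) = [[N/sigma^2, 0], [0, N/(2*sigma^4)]]
# and the CRLB for each parameter is the corresponding diagonal
# element of inv(I(theta)).
N = 100
sigma2 = 4.0

fim = np.array([[N / sigma2, 0.0],
                [0.0, N / (2.0 * sigma2**2)]])
crlb = np.linalg.inv(fim)

print(crlb[0, 0])  # CRLB for mu: sigma^2/N = 0.04
print(crlb[1, 1])  # CRLB for sigma^2: 2*sigma^4/N = 0.32
```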
Estimation Theory

Normalized CRLB – an alternate form of CRLB and its relation to estimator sensitivity

The variance of an estimate is always greater than or equal to the Cramer Rao Lower Bound of the estimate. The CRLB is in turn given by the inverse of the Fisher Information. The following equation concisely summarizes the above point: \(var(\hat{\theta}) \geq \frac{1}{I(\theta)}\). The Fisher Information can be re-written as … Thus the variance of the...
Continue Reading »
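A quick numerical sketch of the Fisher information (my own example, assuming the familiar Gaussian-mean problem): computing \(I(\theta)\) as the negative expected second derivative of the log-likelihood matches the analytic value \(N/\sigma^2\), and its inverse is the CRLB.

```python
import numpy as np

rng = np.random.default_rng(0)
N, theta, sigma2 = 50, 1.5, 2.0

# Log-likelihood of N i.i.d. N(theta, sigma2) samples, up to a constant.
def log_like(x, th):
    return -np.sum((x - th) ** 2) / (2.0 * sigma2)

# Estimate Fisher information as the negative expected second derivative
# of the log-likelihood, via a central finite difference in theta.
h = 1e-4
trials = 200
second = np.empty(trials)
for k in range(trials):
    x = rng.normal(theta, np.sqrt(sigma2), N)
    second[k] = (log_like(x, theta + h) - 2 * log_like(x, theta)
                 + log_like(x, theta - h)) / h**2
fisher = -second.mean()

print(fisher)        # ~ N / sigma2 = 25
print(1.0 / fisher)  # CRLB ~ sigma2 / N = 0.04
```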
Estimation Theory

Cramer Rao Lower Bound for Phase Estimation

Consider the DSB carrier frequency estimation problem given in the introductory chapter to estimation theory. A message is sent across a channel modulated by a sinusoidal carrier with carrier frequency = fc and amplitude = ‘A’. The transmitted signal gets affected by zero-mean AWGN when it travels across the...
Continue Reading »
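As a hedged numerical sketch (the signal model below is my own simplification of the excerpt's setup): for \(x[n] = A\cos(2\pi f_0 n + \phi) + w[n]\) with AWGN of variance \(\sigma^2\), the Fisher information for the phase is \(I(\phi) = \frac{A^2}{\sigma^2}\sum_n \sin^2(2\pi f_0 n + \phi)\), which for large \(N\) approaches \(\frac{N A^2}{2\sigma^2}\), giving \(CRLB(\phi) \approx \frac{2\sigma^2}{N A^2}\).

```python
import numpy as np

# Hypothetical parameters for a sinusoid in AWGN; phi is the unknown.
# The derivative of the noiseless signal w.r.t. phi is
# -A*sin(2*pi*f0*n + phi), so the Fisher information is
#   I(phi) = (A^2 / sigma2) * sum(sin^2(2*pi*f0*n + phi))
A, f0, phi, sigma2, N = 1.0, 0.05, 0.3, 0.5, 1000
n = np.arange(N)

fisher = (A**2 / sigma2) * np.sum(np.sin(2 * np.pi * f0 * n + phi) ** 2)
crlb_exact = 1.0 / fisher
crlb_approx = 2.0 * sigma2 / (N * A**2)  # large-N approximation

print(crlb_exact, crlb_approx)  # the two agree closely for large N
```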
Estimation Theory

Applying Cramer Rao Lower Bound (CRLB) to find a Minimum Variance Unbiased Estimator (MVUE)

It was mentioned in one of the earlier articles that the CRLB may provide a way to find an MVUE (Minimum Variance Unbiased Estimator). Theorem: There exists an unbiased estimator that attains the CRLB if and only if \(\frac{\partial \ln L(\mathbf{x};\theta)}{\partial \theta} = I(\theta)\left( g(\mathbf{x}) - \theta \right)\). Here \(\ln L(\mathbf{x};\theta)\) is the log likelihood function of \(\mathbf{x}\) parameterized by \(\theta\), the parameter...
Continue Reading »
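To make the factorization concrete, here is an illustrative check of my own for the Gaussian-mean problem: the score \(\frac{\partial \ln L(\mathbf{x};\theta)}{\partial \theta} = \frac{N}{\sigma^2}(\bar{x} - \theta)\) factors exactly as \(I(\theta)(g(\mathbf{x}) - \theta)\) with \(g(\mathbf{x})\) the sample mean, so the sample mean attains the CRLB.

```python
import numpy as np

rng = np.random.default_rng(1)
N, theta, sigma2 = 20, 0.7, 1.3
x = rng.normal(theta, np.sqrt(sigma2), N)

score = np.sum(x - theta) / sigma2  # d/dtheta of the log-likelihood
fisher = N / sigma2                 # Fisher information I(theta)
g = x.mean()                        # candidate estimator g(x)

# The theorem's condition: score == I(theta) * (g(x) - theta)
print(np.isclose(score, fisher * (g - theta)))  # True
```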
Estimation Theory

Cramer Rao Lower Bound for Scalar Parameter Estimation

Consider a set of observed data samples \(\mathbf{x}\), and let \(\theta\) be the scalar parameter that is to be estimated from the observed samples. The accuracy of the estimate depends on how well the observed data is influenced by the parameter \(\theta\). The observed data is considered as random data whose...
Continue Reading »
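A quick Monte Carlo sketch (my own example, not from the post): when estimating the mean of Gaussian samples with the sample-mean estimator, the empirical variance of the estimate sits at the scalar CRLB \(\sigma^2/N\).

```python
import numpy as np

rng = np.random.default_rng(2)
N, theta, sigma2, trials = 25, 3.0, 1.0, 20000

# Variance of the sample-mean estimator over many independent trials.
estimates = rng.normal(theta, np.sqrt(sigma2), (trials, N)).mean(axis=1)
emp_var = estimates.var()

crlb = sigma2 / N  # CRLB for the mean of i.i.d. Gaussian samples
print(emp_var, crlb)  # empirical variance is close to the bound
```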

Introduction to Cramer Rao Lower Bound (CRLB)

The criteria for the existence of a Minimum Variance Unbiased Estimator (MVUE) were discussed in a previous article. To have an MVUE, it is necessary to have estimates that are unbiased (centered on the true parameter value) and that give minimum variance. This is given by the following two...
Continue Reading »
Estimation Theory

Minimum Variance Unbiased Estimators (MVUE)

As discussed in the introduction to estimation theory, the goal of an estimation algorithm is to give an estimate of random variable(s) that is unbiased and has minimum variance. These criteria are reproduced here for reference $$ E\left\{\hat{f}_0 \right\} = f_0 $$ $$ \sigma^{2}_{\hat{f}_0}=E\left\{ \left( \hat{f}_0 - E\left\{\hat{f}_0\right\} \right)^2 \right\} $$ In...
Continue Reading »
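To illustrate the two criteria numerically (a sketch of mine, not from the post): for Gaussian data both the sample mean and the sample median are unbiased estimators of \(f_0\), but the mean has the lower variance, so of the two it is the better candidate for an MVUE.

```python
import numpy as np

rng = np.random.default_rng(3)
N, f0, trials = 51, 2.0, 20000
x = rng.normal(f0, 1.0, (trials, N))

means = x.mean(axis=1)
medians = np.median(x, axis=1)

# Both estimators are (approximately) unbiased ...
print(means.mean(), medians.mean())  # both near f0 = 2.0
# ... but the sample mean has smaller variance (~1/N vs ~pi/(2N))
print(means.var(), medians.var())
```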