The Mean Square Error – Why do we use it for estimation problems

“Mean Square Error”, abbreviated as MSE, is a ubiquitous term in texts on estimation theory. Have you ever wondered what this term actually means and why it is used so often in estimation theory? Any communication system has a transmitter, a channel or medium, and a receiver. Given the channel … Read more
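As a quick, self-contained illustration (not taken from the article itself, and with arbitrary assumed values), the MSE of an estimator can be measured empirically. Here the sample mean estimates a DC level buried in additive white Gaussian noise:

```python
import numpy as np

# Hypothetical setup: estimate a DC level A from N noisy samples, repeated over
# many independent trials, and measure the empirical mean square error.
rng = np.random.default_rng(0)
A = 1.5          # true DC level (assumed value)
sigma = 0.5      # noise standard deviation (assumed value)
N = 100          # samples per trial
trials = 10_000

x = A + sigma * rng.standard_normal((trials, N))  # noisy observations
A_hat = x.mean(axis=1)                            # sample-mean estimate per trial

mse = np.mean((A_hat - A) ** 2)                   # empirical mean square error
print(mse)  # close to sigma^2 / N = 0.25 / 100 = 0.0025
```

For this unbiased estimator the MSE equals the estimator variance, which is why the two quantities are often used interchangeably in this setting.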

Normalized CRLB – an alternate form of CRLB

Key focus: The normalized CRLB (Cramér-Rao lower bound) is an alternate form of the CRLB. Let’s explore how the normalized CRLB is related to estimator sensitivity. The variance of an estimate is always greater than or equal to the Cramér-Rao lower bound of that estimate, and the CRLB is, in turn, given by the inverse of the Fisher information. The following equation … Read more
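The chain of relationships the teaser describes can be written compactly (using the scalar-parameter notation common to these articles):

```latex
\mathrm{var}(\hat{\theta}) \;\geq\; \mathrm{CRLB}(\theta) \;=\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; -E\!\left[ \frac{\partial^2 \ln L(\mathbf{x};\theta)}{\partial \theta^2} \right]
```

where \(I(\theta)\) is the Fisher information and \(L(\mathbf{x};\theta)\) is the likelihood of the observations \(\mathbf{x}\) parameterized by \(\theta\).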

Cramér-Rao Lower Bound for Phase Estimation

Key focus: Derive the Cramér-Rao lower bound for phase estimation applied to DSB transmission, and find out whether an efficient estimator actually exists for phase estimation. Problem formulation: Consider the DSB carrier frequency estimation problem given in the introductory chapter on estimation theory. A message is sent across a channel, modulated by a sinusoidal carrier with … Read more

Efficient Estimators by applying CRLB

It has been reiterated that not all estimators are efficient; not even all Minimum Variance Unbiased Estimators (MVUE) are efficient. How, then, do we determine whether the estimator we have designed is efficient? An efficient estimator is defined as one that is: Unbiased (mean of the estimate = true value of … Read more

Applying the Cramér-Rao Lower Bound (CRLB) to Find a Minimum Variance Unbiased Estimator (MVUE)

It was mentioned in one of the earlier articles that the CRLB may provide a way to find an MVUE (Minimum Variance Unbiased Estimator). Theorem: An unbiased estimator that attains the CRLB exists if and only if \( \frac{\partial \ln L(\mathbf{x};\theta)}{\partial \theta} = I(\theta)\left(g(\mathbf{x}) - \theta\right) \). Here \( \ln L(\mathbf{x};\theta) \) is the log-likelihood function of x parameterized by \(\theta\) – the parameter … Read more
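As a worked instance of this theorem (for the standard DC-level-in-AWGN model used elsewhere in these articles), the score factors exactly into the required form:

```latex
\frac{\partial \ln L(\mathbf{x};\theta)}{\partial \theta}
= \frac{1}{\sigma^2}\sum_{n=0}^{N-1}\left(x[n]-\theta\right)
= \frac{N}{\sigma^2}\left(\bar{x}-\theta\right)
= I(\theta)\left(g(\mathbf{x})-\theta\right)
```

with \(I(\theta) = N/\sigma^2\) and \(g(\mathbf{x}) = \bar{x}\), the sample mean. Since the factorization holds, the sample mean is an efficient estimator (and hence the MVUE) for this model.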

Cramér-Rao Lower Bound (CRLB)-Scalar Parameter Estimation

Key focus: Discuss scalar parameter estimation using the CRLB. Estimate a DC component from data observed in the presence of AWGN. Consider a set of observed data samples \( \mathbf{x} = \{x[0], x[1], \ldots, x[N-1]\} \), and let \(\theta\) be the scalar parameter that is to be estimated from the observed samples. The accuracy of the estimate depends on how well the observed data is influenced … Read more
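A minimal Monte Carlo sketch of this DC-estimation scenario (my own illustration with assumed values, not the article's code): for a DC level in AWGN with known noise variance, the CRLB is \( \sigma^2 / N \), and the sample mean attains it.

```python
import numpy as np

# Assumed setup: DC level A observed in AWGN with known sigma; the sample mean
# should have empirical variance close to the CRLB = sigma^2 / N.
rng = np.random.default_rng(1)
A, sigma, N, trials = 2.0, 1.0, 50, 20_000   # arbitrary assumed values

x = A + sigma * rng.standard_normal((trials, N))
A_hat = x.mean(axis=1)                       # sample-mean estimate per trial

crlb = sigma**2 / N                          # theoretical lower bound (= 0.02)
var_hat = A_hat.var()                        # empirical estimator variance
print(crlb, var_hat)                         # the two should nearly match
```

That the empirical variance sits right at the bound is what makes the sample mean an efficient estimator for this problem.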

Cramér-Rao Lower Bound: Introduction

Key concept: The Cramér-Rao bound is a lower bound on the variance of unbiased estimators of deterministic parameters. Introduction: The criterion for the existence of a Minimum Variance Unbiased Estimator (MVUE) was discussed in a previous article. To have an MVUE, it is necessary to have estimates that are unbiased and that give minimum variance (compared … Read more

Score, Fisher Information and Estimator Sensitivity

As we have seen in the previous articles, the estimation of a parameter from a set of data samples depends strongly on the underlying PDF. The accuracy of the estimation is inversely proportional to the variance of the underlying PDF: the smaller the variance of the PDF, the higher the accuracy of estimation, and vice … Read more
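Two standard properties of the score can be checked numerically (an illustration under assumed values, not from the article): for a Gaussian mean with known variance, the score has zero mean, and its variance equals the Fisher information \( I(\theta) = N/\sigma^2 \).

```python
import numpy as np

# Assumed model: N Gaussian samples with unknown mean theta, known sigma.
# Score = d/d(theta) of the log-likelihood = sum(x_i - theta) / sigma^2.
rng = np.random.default_rng(2)
theta, sigma, N, trials = 0.7, 2.0, 10, 50_000

x = theta + sigma * rng.standard_normal((trials, N))
score = (x - theta).sum(axis=1) / sigma**2

print(score.mean())   # ~ 0   (regularity condition: E[score] = 0)
print(score.var())    # ~ N / sigma^2 = 2.5  (Fisher information)
```

The larger the Fisher information, the more sharply the likelihood peaks around the true parameter, which is the "estimator sensitivity" intuition this article develops.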

Theoretical derivation of MLE for the Gaussian Distribution

As a prerequisite, check out the previous article on the logic behind deriving the maximum likelihood estimator for a given PDF. Let \(X = (x_1, x_2, \ldots, x_N)\) be the samples taken from a Gaussian distribution given by … Calculating the likelihood: the log-likelihood is given by … Differentiating and equating to zero to find the maximum (or, equivalently, equating the score … Read more
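The end result of the derivation sketched above can be verified numerically (my own illustration with assumed values): for Gaussian samples, the MLE of the mean is the sample mean, and the MLE of the variance is the 1/N-normalized (biased) sample variance.

```python
import numpy as np

# Assumed ground truth; with many samples the MLEs should land close to it.
rng = np.random.default_rng(3)
mu, sigma, N = 3.0, 1.5, 100_000
x = mu + sigma * rng.standard_normal(N)

mu_mle = x.mean()                        # hat(mu) = (1/N) * sum(x_i)
var_mle = ((x - mu_mle) ** 2).mean()     # hat(sigma^2) = (1/N) * sum((x_i - hat(mu))^2)

print(mu_mle, var_mle)  # close to mu = 3.0 and sigma^2 = 2.25
```

Note the 1/N normalization: the MLE of the variance is biased, unlike the usual 1/(N-1) sample variance.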

Theoretical derivation of MLE for the Exponential Distribution

As a prerequisite, check out the previous article on the logic behind deriving the maximum likelihood estimator for a given PDF. Let \(X = (x_1, x_2, \ldots, x_N)\) be the samples taken from an exponential distribution given by … Calculating the likelihood: the log-likelihood is given by … Differentiating and equating to zero to find the maximum (or, equivalently, equating the score … Read more
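The closed-form result of this derivation can likewise be checked numerically. Assuming the rate parameterization \( f(x;\lambda) = \lambda e^{-\lambda x} \) (my assumption; the article may use the mean parameterization instead), setting the score to zero gives \( \hat{\lambda} = 1/\bar{x} \):

```python
import numpy as np

# Assumed rate parameterization: pdf = lambda * exp(-lambda * x), mean = 1/lambda.
rng = np.random.default_rng(4)
lam, N = 0.5, 200_000
x = rng.exponential(scale=1.0 / lam, size=N)  # NumPy takes scale = 1/lambda

lam_mle = 1.0 / x.mean()   # MLE of the rate = reciprocal of the sample mean
print(lam_mle)             # close to lambda = 0.5
```

Under the mean parameterization \( f(x;\mu) = \frac{1}{\mu} e^{-x/\mu} \), the same derivation gives \( \hat{\mu} = \bar{x} \), the sample mean.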