An estimator is said to be unbiased if the mathematical expectation of the estimator is equal to the parameter value. Mathematically we write this as,
E(T) = θ, where
T denotes the sample statistic (estimator)
and, θ denotes the unknown population parameter.
Bias and Biased Estimator:
The difference between the expected value of the statistic and the actual parameter value is called the bias of the estimator, that is,
Bias = E(T)-θ.
If the bias is nonzero then we say that the estimator is biased; if the bias is zero, the estimator is unbiased.
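As a quick check of the definition, the bias of an estimator can be approximated by simulation: average the estimator over many samples and subtract the true parameter. A minimal Python sketch (the normal population, sample size, and "divide by n" variance estimator below are assumptions chosen for illustration):

```python
import random
random.seed(0)

# Estimate Bias = E(T) - theta by Monte Carlo for the "divide by n"
# variance estimator, using samples from N(0, sigma^2 = 4).
sigma2 = 4.0   # true parameter theta (the population variance)
n = 5          # sample size
reps = 200_000

total = 0.0
for _ in range(reps):
    x = [random.gauss(0, 2) for _ in range(n)]
    m = sum(x) / n
    t = sum((xi - m) ** 2 for xi in x) / n  # estimator T (divides by n)
    total += t

bias = total / reps - sigma2
print(round(bias, 2))  # theory predicts bias = -sigma^2 / n = -0.8
```

The negative bias matches the known result E(T) = (n-1)/n · σ² for the n-divisor variance estimator, which is why the n − 1 divisor is used instead.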
Some examples of unbiased estimators:
- The sample mean is an unbiased estimator for the population mean.
- The sample variance, when computed with the n − 1 divisor (Bessel's correction), is an unbiased estimator for the population variance.
- The sample proportion is an unbiased estimator for the population proportion.
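The first two examples above can be verified by simulation. A short Python sketch (the uniform(0, 1) population, sample size, and repetition count are assumptions for illustration) averages each estimator over many samples and compares against the true parameter:

```python
import random
random.seed(1)

# Check unbiasedness of the sample mean and the (n-1)-divisor sample
# variance by averaging over many simulated samples from uniform(0, 1).
mu, sigma2 = 0.5, 1.0 / 12.0   # true mean and variance of uniform(0, 1)
n, reps = 10, 100_000

mean_sum = var_sum = 0.0
for _ in range(reps):
    x = [random.random() for _ in range(n)]
    m = sum(x) / n
    s2 = sum((xi - m) ** 2 for xi in x) / (n - 1)  # Bessel's correction
    mean_sum += m
    var_sum += s2

print(abs(mean_sum / reps - mu) < 0.01)     # E(sample mean) ≈ mu
print(abs(var_sum / reps - sigma2) < 0.01)  # E(sample variance) ≈ sigma^2
```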
What properties should a good point estimate have?
Apart from being unbiased, the point estimate must also have the following properties in order to be considered a “good” estimator:
- Consistency – the statistic should converge in probability to the parameter value.
- Efficiency – among unbiased estimators, it should have the smallest possible variance.
- Sufficiency – the estimator must contain all information in the sample regarding the parameter.
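Consistency in particular is easy to see by simulation. In this Python sketch (the exponential population and the sample sizes are assumptions for illustration), the sample mean of n draws from an exponential population with true mean 1 gets closer to 1 as n grows:

```python
import random
random.seed(2)

# Consistency of the sample mean: the absolute error |x-bar - mu|
# typically shrinks (roughly like 1/sqrt(n)) as the sample size grows.
errs = []
for n in (10, 1_000, 100_000):
    x = [random.expovariate(1.0) for _ in range(n)]
    errs.append(abs(sum(x) / n - 1.0))
print(errs)
```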
Minimum Variance Unbiased Estimators (MVUE):
We generally seek to find a minimum variance unbiased estimator for the parameter value.
A given parameter can have many unbiased estimators, so which one should we use as our estimate? We choose the one with the least variance among them all.
An unbiased estimator whose variance is the smallest among all unbiased estimators is called the MVUE (Minimum Variance Unbiased Estimator).
Under mild regularity conditions, the well-known Cramér-Rao inequality gives a lower bound on the variance of any unbiased estimator. An unbiased estimator that attains this lower bound is necessarily the MVUE.
Thus, one way to find the MVUE for a parameter is to find an unbiased estimator whose variance equals the Cramér-Rao lower bound (though an MVUE can exist even when no estimator attains the bound). The method of maximum likelihood often helps here: under regularity conditions, maximum likelihood estimators are asymptotically efficient, meaning their variance approaches the Cramér-Rao bound as the sample size grows.
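The variance comparison behind the MVUE idea can be illustrated by simulation. In this Python sketch (the normal population and the sample median as a competing estimator are assumptions for illustration), both the sample mean and the sample median are unbiased for the mean of a normal population, but only the sample mean attains the Cramér-Rao lower bound σ²/n; the median's variance is larger (about π/2 times the bound):

```python
import random
import statistics

random.seed(3)

# Compare two unbiased estimators of the mean of N(mu, sigma^2):
# the sample mean attains the Cramer-Rao lower bound sigma^2 / n,
# while the sample median has a strictly larger variance.
mu, sigma, n, reps = 0.0, 1.0, 25, 50_000

means, medians = [], []
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(x) / n)
    medians.append(statistics.median(x))

crlb = sigma ** 2 / n
var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)
print(var_mean, crlb, var_median)  # mean's variance ≈ CRLB < median's variance
```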