2.4 Properties of estimators
In estimation theory, one must know the properties of a good estimator in order to select an appropriate estimator for a given study. A good estimator possesses the following properties:
Unbiasedness: An estimator is said to be unbiased if its expected value is equal to the parameter being estimated; that is, the estimator should on average equal the value of the parameter. This is known as the property of unbiasedness. The sample mean x̄ is the most widely used estimator because it provides an unbiased estimate of the population mean μ, i.e. E(x̄) = μ.
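As a minimal simulation sketch of this property (not part of the original text), the code below repeatedly draws samples from a normal population with known mean μ and checks that the average of the resulting sample means is close to μ; the values of μ, σ, the sample size and the number of replications are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 5.0, 2.0     # population mean and standard deviation (illustrative values)
n = 30                   # sample size
replications = 100_000   # number of repeated samples

# Draw many samples of size n and compute the sample mean of each one.
sample_means = rng.normal(mu, sigma, size=(replications, n)).mean(axis=1)

# The average of the sample means approximates E(x̄), which should be close to mu.
print("population mean    :", mu)
print("average of means   :", sample_means.mean())
```

With this many replications the two printed numbers should agree to roughly two decimal places, which is the empirical counterpart of E(x̄) = μ.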
Consistency: The notion of consistency is concerned with the behaviour of an estimator as the sample size becomes large. An estimator tn = t(x1, x2, ..., xn), computed from a random sample of n values, is said to be a consistent estimator of a population parameter θ if it converges in probability to θ as n tends to infinity. That is, tn is a consistent estimator of θ if, for every ε > 0, the following condition holds:

lim(n→∞) P{θ − ε < tn < θ + ε} = 1

or, equivalently,

lim(n→∞) P{|tn − θ| > ε} = 0.

The property of consistency ensures that the difference between tn and θ becomes smaller, in the probability sense, as n increases indefinitely. In other words, the estimator gives increasing accuracy as the sample size increases.
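The following sketch (again an illustration, not from the original text) takes the sample mean as tn for a normal population with mean θ and estimates P{|tn − θ| > ε} for increasing n; the tolerance ε, the population values and the replication count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

theta, sigma = 0.0, 1.0   # true mean and standard deviation (illustrative)
eps = 0.1                 # tolerance ε
replications = 2_000      # repeated samples used to estimate the probability

for n in (10, 100, 1_000, 10_000):
    # t_n is the sample mean computed from each simulated sample of size n.
    t_n = rng.normal(theta, sigma, size=(replications, n)).mean(axis=1)
    prob_outside = np.mean(np.abs(t_n - theta) > eps)
    print(f"n = {n:6d}   P(|t_n - theta| > eps) ~ {prob_outside:.4f}")
```

The estimated probability should fall towards zero as n grows, which is exactly the convergence-in-probability statement above.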
Efficiency: Let tn and tn′ be two consistent estimators which, for large samples, are both distributed asymptotically normally about the true value θ of the parameter, with variances v1 and v2 respectively; this will usually be the case by virtue of the central limit theorem. Then tn is said to be a more efficient estimator than tn′ if v1 < v2, i.e. if var(tn) < var(tn′) for all n, then tn is more efficient than tn′ for all n. The estimator with the smaller variance will be grouped more closely around the true value and will, on average, deviate less from the true value than the estimator with the larger variance; it may therefore reasonably be regarded as the more efficient of the two.
If we can find a consistent estimator tn whose variance is less than that of every other consistent estimator for all n, then tn is said to be the most efficient estimator, and the efficiency E of any other estimator is defined as the ratio of the variance of the most efficient estimator to the variance of the given estimator. The efficiency of a statistic represents, in large samples, the fraction of the relevant information available in the sample that is utilized by the statistic in question.
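A classical illustration (not taken from the original text) compares the sample mean and the sample median as estimators of the centre of a normal population: both are consistent, but the mean has the smaller variance, and the ratio of the two variances approaches 2/π ≈ 0.64, the large-sample efficiency of the median. The sketch below assumes a standard normal population and arbitrary simulation settings.

```python
import numpy as np

rng = np.random.default_rng(2)

mu, sigma = 0.0, 1.0   # normal population (illustrative)
n = 101                # sample size (odd, so the median is a single observation)
replications = 50_000

samples = rng.normal(mu, sigma, size=(replications, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

var_mean = means.var()
var_median = medians.var()

print("var(sample mean)   :", var_mean)
print("var(sample median) :", var_median)
# Efficiency of the median: ratio of the variance of the more efficient estimator
# (the mean) to the variance of the median, roughly 2/pi ~ 0.64 for a normal population.
print("efficiency of median:", var_mean / var_median)
```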
Sufficiency: An estimator tn is said to be sufficient for estimating a population parameter θ if it contains all the information in the sample about the parameter, i.e. the estimator tn should use as much of the information available in the sample as possible.
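As a small sketch of this idea (an assumed example, not from the original text), consider a sample from a Bernoulli(p) population: the likelihood of the sample depends on the observations only through their sum Σxi, so two different samples with the same sum carry exactly the same information about p. The code below verifies this for one such pair of samples.

```python
import numpy as np

def bernoulli_likelihood(x, p):
    """Likelihood of an i.i.d. Bernoulli(p) sample x."""
    x = np.asarray(x)
    return p ** x.sum() * (1 - p) ** (len(x) - x.sum())

# Two different samples of size 5 with the same value of the statistic sum(x) = 3.
x1 = [1, 1, 1, 0, 0]
x2 = [0, 1, 0, 1, 1]

for p in (0.2, 0.5, 0.8):
    print(p, bernoulli_likelihood(x1, p), bernoulli_likelihood(x2, p))

# The likelihoods coincide for every p: the data enter only through t = sum(x),
# which is why the sum is a sufficient statistic for p in this model.
```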