Probability and Mathematical Statistics

Parameter Estimation

This chapter systematically studies the basic ideas of parameter estimation, point estimation, interval estimation, the method of moments, the method of maximum likelihood, and the properties of estimators.

Point Estimation and Estimators

  • Point estimation: using sample statistics to estimate the value of population parameters
  • Estimator: a statistic used to estimate a parameter
  • Estimate: the specific numerical value the estimator takes for a given sample

Method of Moments and Method of Maximum Likelihood

  • Method of moments: set the sample moments equal to the population moments, solve the equations to obtain the estimator
  • Method of maximum likelihood: construct the likelihood function, maximize it to obtain the estimator
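The two methods can be compared on a concrete case. Below is a minimal sketch in Python (standard library only) for the exponential distribution, where $E(X) = 1/\lambda$: the method of moments solves $\overline{x} = 1/\lambda$, and a crude grid search over the log-likelihood recovers essentially the same estimate (for the exponential the two estimators coincide). The true rate, seed, and grid are illustrative choices, not part of the text.

```python
import math
import random

random.seed(0)
true_rate = 2.0  # illustrative "unknown" parameter
sample = [random.expovariate(true_rate) for _ in range(10_000)]

# Method of moments: set the first sample moment equal to E(X) = 1/lambda.
xbar = sum(sample) / len(sample)
lam_moments = 1.0 / xbar

# Maximum likelihood: maximize the log-likelihood
#   l(lambda) = n*log(lambda) - lambda * sum(x_i)
# over a grid of candidate rates (a crude stand-in for calculus).
n = len(sample)
s = sum(sample)

def log_likelihood(lam):
    return n * math.log(lam) - lam * s

grid = [i / 100 for i in range(1, 501)]  # candidate rates 0.01 .. 5.00
lam_mle = max(grid, key=log_likelihood)

print(lam_moments, lam_mle)  # both should land near true_rate
```

For the exponential family the grid maximum sits at the grid point nearest $1/\overline{x}$, so the two printed values agree to within the grid resolution.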

Properties of Estimators

  • Unbiasedness: $E(\hat{\theta})=\theta$
  • Efficiency: among unbiased estimators, the one with the smaller variance is more efficient
  • Consistency: $\hat{\theta}$ converges in probability to $\theta$ as the sample size $n$ increases
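Unbiasedness can be checked by simulation: average each estimator over many repeated samples and compare with the true value. A minimal sketch, where the population $N(0, 2^2)$, the sample size 5, and the number of replications are illustrative choices:

```python
import random

random.seed(1)
true_var = 4.0        # variance of the illustrative population N(0, 2^2)
n, reps = 5, 20_000   # small samples, many replications

biased, unbiased = [], []
for _ in range(reps):
    xs = [random.gauss(0, 2) for _ in range(n)]
    m = sum(xs) / n
    ss = sum((x - m) ** 2 for x in xs)
    biased.append(ss / n)          # divides by n: biased downward
    unbiased.append(ss / (n - 1))  # divides by n-1: the unbiased S^2

mean_biased = sum(biased) / reps
mean_unbiased = sum(unbiased) / reps
print(mean_biased, mean_unbiased)  # roughly (n-1)/n * 4 versus 4
```

The $n$-denominator estimator averages near $\frac{n-1}{n}\sigma^2 = 3.2$ here, which is exactly why $S^2$ uses $n-1$.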

Interval Estimation and Confidence Interval

  • Interval estimation: gives an interval in which the parameter is likely to fall
  • Confidence interval: a random interval that contains the parameter with probability equal to the confidence level $1-\alpha$
  • Confidence intervals for the mean and variance of a single normal population
  • Confidence intervals for the difference of means and the ratio of variances of two normal populations
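The meaning of the confidence level can be verified by simulation: construct the known-variance interval $\overline{X} \pm z_{\alpha/2}\,\sigma/\sqrt{n}$ many times and count how often it covers the true mean. A minimal sketch with illustrative values ($\mu = 10$, $\sigma = 3$, $n = 50$, 95% level):

```python
import math
import random

random.seed(2)
mu, sigma, n = 10.0, 3.0, 50  # illustrative population and sample size
z = 1.959964                  # z_{alpha/2} for a 95% interval
reps = 5_000

covered = 0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    half = z * sigma / math.sqrt(n)
    if xbar - half <= mu <= xbar + half:
        covered += 1

coverage = covered / reps
print(coverage)  # should be close to 0.95
```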

Exercises

  1. Let the population $X \sim N(\mu, \sigma^2)$, with sample mean $\overline{X}$ and sample variance $S^2$. Write the unbiased estimators for $\mu$ and $\sigma^2$.
  2. Use the method of moments to estimate the parameter $\lambda$, given a sample $x_1, x_2, \dots, x_n$ from $X \sim \mathrm{Poisson}(\lambda)$.
  3. Use the method of maximum likelihood to estimate the parameters $\mu$ and $\sigma^2$, given a sample $x_1, x_2, \dots, x_n$ from $X \sim N(\mu, \sigma^2)$.
  4. Write the $1-\alpha$ confidence interval for the mean $\mu$ of a normal population when the variance $\sigma^2$ is known.
  5. Given two unbiased estimators $\hat{\theta}_1, \hat{\theta}_2$ with $\mathrm{Var}(\hat{\theta}_1) < \mathrm{Var}(\hat{\theta}_2)$, which is more efficient?

Reference Answers

1. Unbiased estimators

The unbiased estimator for $\mu$ is $\overline{X}$, and for $\sigma^2$ it is $S^2 = \frac{1}{n-1}\sum_{i=1}^{n} (x_i-\overline{X})^2$.


2. Method of moments

$E(X)=\lambda$; setting the sample mean equal to $\lambda$ gives $\hat{\lambda}=\overline{X}$.
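The moment estimate $\hat{\lambda} = \overline{X}$ can be checked numerically. Python's standard library has no Poisson sampler, so the sketch below draws variates by inverting the CDF; the true rate and seed are illustrative choices.

```python
import math
import random

random.seed(3)
lam = 4.0  # illustrative true rate

def poisson(rate):
    # Draw one Poisson(rate) variate by inverting the CDF.
    u, k = random.random(), 0
    p = math.exp(-rate)  # P(X = 0)
    cdf = p
    while u > cdf:
        k += 1
        p *= rate / k    # recurrence: P(X = k) from P(X = k-1)
        cdf += p
    return k

sample = [poisson(lam) for _ in range(20_000)]
lam_hat = sum(sample) / len(sample)  # method-of-moments estimate
print(lam_hat)  # should be close to 4
```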


3. Method of maximum likelihood

The maximum likelihood estimator for $\mu$ is $\overline{X}$, and for $\sigma^2$ it is $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i-\overline{X})^2$.
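These closed forms can be evaluated directly on a simulated sample; note that the MLE of $\sigma^2$ divides by $n$, unlike the unbiased $S^2$. The true parameters and seed below are illustrative choices.

```python
import random

random.seed(4)
mu, sigma = 1.5, 2.0  # illustrative true parameters
xs = [random.gauss(mu, sigma) for _ in range(50_000)]

n = len(xs)
mu_hat = sum(xs) / n                              # MLE of mu: the sample mean
var_hat = sum((x - mu_hat) ** 2 for x in xs) / n  # MLE of sigma^2: divide by n
print(mu_hat, var_hat)  # should land near 1.5 and 4.0
```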


4. Confidence interval

$\overline{X} \pm z_{\alpha/2}\,\dfrac{\sigma}{\sqrt{n}}$


5. Efficiency

$\hat{\theta}_1$ is more efficient, since it has the smaller variance.