Estimating the Mean Parameter of a Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

This paper compares estimation of the mean of a normal distribution by the
Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods.
The ML estimator is the average of the data, the Bayes estimator is obtained
by combining the prior distribution with the data, and the MCMC estimator is
approximated by Gibbs sampling from the posterior distribution. Each method is
used to estimate the parameter, and hypothesis testing is then applied to
check the robustness of the estimators. Data are simulated from a normal
distribution with true mean 2 and variance 4, 9, or 16, with the sample size
set to 10, 20, 30, or 50. The results show that the ML and MCMC estimates
differ perceptibly from the true parameter when the sample size is 10 or 20
and the variance is 16. Furthermore, the Bayes estimator, computed under a
prior with mean 1 and variance 12, shows a significant difference in the mean
when the variance is 9 and the sample size is 10 or 20.
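
For reference, the Bayes estimator mentioned above has a standard closed form
when the data variance sigma^2 is treated as known and the mean is given a
conjugate normal prior with mean mu_0 and variance tau^2 (mu_0 = 1 and
tau^2 = 12 in the abstract): the posterior of the mean is again normal, and
its mean is a precision-weighted average of the prior mean and the sample
mean. This conjugate result is stated here for clarity and is not quoted from
the paper itself:

\[
\hat{\mu}_{\mathrm{ML}} = \bar{x},
\qquad
\mu \mid x_1,\dots,x_n \sim N\!\left(\hat{\mu}_{\mathrm{Bayes}},\;
\Bigl(\tfrac{n}{\sigma^2} + \tfrac{1}{\tau^2}\Bigr)^{-1}\right),
\qquad
\hat{\mu}_{\mathrm{Bayes}} =
\frac{\tfrac{n}{\sigma^2}\,\bar{x} + \tfrac{1}{\tau^2}\,\mu_0}
     {\tfrac{n}{\sigma^2} + \tfrac{1}{\tau^2}}.
\]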
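
The listing below is a minimal, self-contained sketch of the three estimators
for one of the abstract's settings (true mean 2, variance 9, sample size 20).
The normal prior with mean 1 and variance 12 matches the abstract; the
known-variance conjugate formula for the Bayes estimator, the inverse-gamma
prior on the variance used inside the Gibbs sampler, its hyperparameters a0
and b0, the chain length, and the burn-in are illustrative assumptions rather
than the paper's actual implementation.

# Sketch comparing ML, conjugate-Bayes, and Gibbs-sampling (MCMC) estimates of
# a normal mean, following the setup described in the abstract. The prior
# N(mu0 = 1, tau2 = 12) matches the abstract; the inverse-gamma prior on the
# variance and its hyperparameters (a0, b0) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: true mean 2, variance 9, sample size 20 (one of the settings).
true_mu, true_var, n = 2.0, 9.0, 20
x = rng.normal(true_mu, np.sqrt(true_var), size=n)
xbar = x.mean()

# 1) Maximum likelihood estimator: the sample average.
ml_est = xbar

# 2) Bayes estimator with known variance and a conjugate N(mu0, tau2) prior:
#    posterior mean = precision-weighted average of prior mean and sample mean.
mu0, tau2 = 1.0, 12.0
post_prec = n / true_var + 1.0 / tau2
bayes_est = (n * xbar / true_var + mu0 / tau2) / post_prec

# 3) MCMC estimator: Gibbs sampler for (mu, sigma2) with the same normal prior
#    on mu and an inverse-gamma(a0, b0) prior on sigma2 (assumed values).
a0, b0 = 2.0, 2.0
mu, sigma2 = xbar, x.var()
draws = []
for it in range(6000):
    # mu | sigma2, x  ~  Normal
    v = 1.0 / (n / sigma2 + 1.0 / tau2)
    m = v * (n * xbar / sigma2 + mu0 / tau2)
    mu = rng.normal(m, np.sqrt(v))
    # sigma2 | mu, x  ~  Inverse-Gamma
    shape = a0 + n / 2.0
    rate = b0 + 0.5 * np.sum((x - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)
    if it >= 1000:  # discard burn-in draws
        draws.append(mu)
mcmc_est = np.mean(draws)

print(f"ML: {ml_est:.3f}  Bayes: {bayes_est:.3f}  MCMC: {mcmc_est:.3f}")

Because the prior variance is large relative to the data variance, the Bayes
and MCMC estimates shrink only slightly toward the prior mean, and all three
estimates approach the sample mean as the sample size grows.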




