Printed from https://ideas.repec.org/a/eee/jmvana/v99y2008i9p1941-1961.html

Estimation, prediction and the Stein phenomenon under divergence loss

Author

Listed:
  • Ghosh, Malay
  • Mergel, Victor
  • Datta, Gauri Sankar

Abstract

We consider two problems: (1) estimate a normal mean under a general divergence loss introduced in [S. Amari, Differential geometry of curved exponential families -- curvatures and information loss, Ann. Statist. 10 (1982) 357-387] and [N. Cressie, T.R.C. Read, Multinomial goodness-of-fit tests, J. Roy. Statist. Soc. Ser. B. 46 (1984) 440-464] and (2) find a predictive density of a new observation drawn independently of observations sampled from a normal distribution with the same mean but possibly with a different variance under the same loss. The general divergence loss includes as special cases both the Kullback-Leibler and Bhattacharyya-Hellinger losses. The sample mean, which is a Bayes estimator of the population mean under this loss and the improper uniform prior, is shown to be minimax in any arbitrary dimension. A counterpart of this result for predictive density is also proved in any arbitrary dimension. The admissibility of these rules holds in one dimension, and we conjecture that the result is true in two dimensions as well. However, the general Baranchik [A.J. Baranchik, A family of minimax estimators of the mean of a multivariate normal distribution, Ann. Math. Statist. 41 (1970) 642-645] class of estimators, which includes the James-Stein estimator and the Strawderman [W.E. Strawderman, Proper Bayes minimax estimators of the multivariate normal mean, Ann. Math. Statist. 42 (1971) 385-388] class of estimators, dominates the sample mean in three or higher dimensions for the estimation problem. An analogous class of predictive densities is defined, and any member of this class is shown to dominate the predictive density corresponding to a uniform prior in three or higher dimensions. For the prediction problem, in the special case of Kullback-Leibler loss, our results complement to a certain extent some of the recent important work of Komaki [F. Komaki, A shrinkage predictive distribution for multivariate normal observations, Biometrika 88 (2001) 859-864] and George, Liang and Xu [E.I. George, F. Liang, X. Xu, Improved minimax predictive densities under Kullback-Leibler loss, Ann. Statist. 34 (2006) 78-92]. Our proposed approach, however, produces a general class of predictive densities (not necessarily Bayes, but not excluding Bayes predictors) dominating the predictive density under a uniform prior. We also show that various modifications of the James-Stein estimator continue to dominate the sample mean, and, by a duality between the estimation and predictive density results that we establish, similar results continue to hold for the prediction problem as well.
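The Stein phenomenon the abstract describes can be illustrated numerically. The sketch below uses squared-error loss as a simpler stand-in for the paper's divergence loss (the dominance statement for the James-Stein estimator holds in either setting for three or more dimensions); the dimension, true mean, and replication count are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_sim = 5, 20000           # dimension >= 3; Monte Carlo replications
theta = np.full(d, 0.5)       # true mean (illustrative choice)

# One observation per replication: X ~ N(theta, I_d)
X = rng.normal(loc=theta, scale=1.0, size=(n_sim, d))

# Risk of the sample mean (here, the single observation) under squared error
risk_mle = np.mean(np.sum((X - theta) ** 2, axis=1))

# James-Stein estimator: shrink X toward the origin
norm_sq = np.sum(X ** 2, axis=1)
js = (1 - (d - 2) / norm_sq)[:, None] * X
risk_js = np.mean(np.sum((js - theta) ** 2, axis=1))

print(risk_mle, risk_js)
```

Under this setup the estimated risk of the sample mean is close to d, while the James-Stein estimator's risk is strictly smaller, consistent with the dominance result for d >= 3.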

Suggested Citation

  • Ghosh, Malay & Mergel, Victor & Datta, Gauri Sankar, 2008. "Estimation, prediction and the Stein phenomenon under divergence loss," Journal of Multivariate Analysis, Elsevier, vol. 99(9), pages 1941-1961, October.
  • Handle: RePEc:eee:jmvana:v:99:y:2008:i:9:p:1941-1961

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047-259X(08)00039-0
    Download Restriction: Full text for ScienceDirect subscribers only

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. José Manuel Corcuera & Federica Giummolè, 1999. "A Generalized Bayes Rule for Prediction," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 26(2), pages 265-279, June.

    Citations

    Citations are extracted by the CitEc Project.
    Cited by:

    1. Ghosh, Malay & Mergel, Victor, 2009. "On the Stein phenomenon under divergence loss and an unknown variance-covariance matrix," Journal of Multivariate Analysis, Elsevier, vol. 100(10), pages 2331-2336, November.
    2. Malay Ghosh & Tatsuya Kubokawa & Gauri Sankar Datta, 2020. "Density Prediction and the Stein Phenomenon," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 330-352, August.
    3. Malay Ghosh & Tatsuya Kubokawa, 2018. "Hierarchical Empirical Bayes Estimation of Two Sample Means Under Divergence Loss," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 80(1), pages 70-83, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Essam Al-Hussaini & Abd Ahmad, 2003. "On Bayesian interval prediction of future records," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 12(1), pages 79-99, June.
    2. Tatsuya Kubokawa & Éric Marchand & William E. Strawderman & Jean-Philippe Turcotte, 2012. "Minimaxity in Predictive Density Estimation with Parametric Constraints," CIRJE F-Series CIRJE-F-843, CIRJE, Faculty of Economics, University of Tokyo.
    3. Malay Ghosh & Tatsuya Kubokawa & Gauri Sankar Datta, 2020. "Density Prediction and the Stein Phenomenon," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 330-352, August.
    4. Takemi Yanagimoto & Toshio Ohnishi, 2014. "Permissible boundary prior function as a virtually proper prior density," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(4), pages 789-809, August.
    5. Zhang, Fode & Shi, Yimin & Wang, Ruibing, 2017. "Geometry of the q-exponential distribution with dependent competing risks and accelerated life testing," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 468(C), pages 552-565.
    6. Abdolnasser Sadeghkhani, 2022. "On Improving the Posterior Predictive Distribution of the Difference Between two Independent Poisson Distribution," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 84(2), pages 765-777, November.
    7. Chang, In Hong & Mukerjee, Rahul, 2004. "Asymptotic results on the frequentist mean squared error of generalized Bayes point predictors," Statistics & Probability Letters, Elsevier, vol. 67(1), pages 65-71, March.
    8. Kubokawa, Tatsuya & Marchand, Éric & Strawderman, William E. & Turcotte, Jean-Philippe, 2013. "Minimaxity in predictive density estimation with parametric constraints," Journal of Multivariate Analysis, Elsevier, vol. 116(C), pages 382-397.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:99:y:2008:i:9:p:1941-1961. See general information about how to correct material in RePEc.




    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Catherine Liu. General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.