Parametric density estimation by minimizing nonextensive entropy
In this paper, we consider parametric density estimation based on minimizing the Havrda-Charvat-Tsallis nonextensive entropy. The resulting estimator, called the Maximum Lq-Likelihood estimator (MLqE), is indexed by a single distortion parameter q, which controls the trade-off between bias and variance. The method has two notable special cases. As q tends to 1, the MLqE reduces to the Maximum Likelihood Estimator (MLE). When q = 1/2, the MLqE is a minimum Hellinger distance type of estimator, with the advantage of avoiding nonparametric techniques and the difficulties of bandwidth selection. The MLqE is studied via asymptotic analysis, simulations, and real-world data, showing that, given a proper choice of q, it reconciles two apparently conflicting needs: efficiency and robustness. When the sample size is small or moderate, the MLqE trades bias for variance, resulting in a reduced mean squared error compared to the MLE. At the same time, in the presence of observations discordant with the assumed model, the MLqE exhibits strong robustness at the expense of a slightly reduced efficiency. To compute the MLq estimates, a fast and easy-to-implement algorithm based on a reweighting strategy is also supplied.
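The reweighting idea mentioned in the abstract can be sketched as follows for a normal model: at each step the observations receive weights f(x_i; θ)^(1−q), and the parameters are updated by a weighted MLE, iterating to a fixed point. This is a minimal illustration of the general strategy, not the authors' implementation; the function name and the normal-model specialization are assumptions for the example.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mlq_normal(data, q, tol=1e-8, max_iter=200):
    """Hypothetical MLqE sketch for a normal model via iterative reweighting.

    Each observation gets weight w_i = f(x_i; theta)^(1 - q); the update is
    the weighted MLE of (mu, sigma^2). For q = 1 all weights equal 1 and the
    procedure reduces to the ordinary MLE; for q < 1, low-density points
    (e.g. outliers) are downweighted.
    """
    # Initialize at the MLE (sample mean and variance).
    mu = sum(data) / len(data)
    var = sum((x - mu) ** 2 for x in data) / len(data)
    for _ in range(max_iter):
        sigma = math.sqrt(var)
        w = [normal_pdf(x, mu, sigma) ** (1.0 - q) for x in data]
        sw = sum(w)
        new_mu = sum(wi * x for wi, x in zip(w, data)) / sw
        new_var = sum(wi * (x - new_mu) ** 2 for wi, x in zip(w, data)) / sw
        converged = abs(new_mu - mu) < tol and abs(new_var - var) < tol
        mu, var = new_mu, new_var
        if converged:
            break
    return mu, math.sqrt(var)
```

With q = 1 the fixed point is reached immediately at the sample mean and standard deviation; with q = 1/2 a gross outlier receives a small weight and the location estimate is pulled toward the bulk of the data, illustrating the robustness discussed above.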
Date of creation: May 2008
Provider web page: http://www.recent.unimore.it/
Handle: RePEc:mod:recent:016