Discriminating Between Weibull and Log-Normal Distributions Based on Kullback-Leibler Divergence
The Weibull and Log-Normal distributions are frequently used in reliability analysis of lifetime (or failure-time) data. The ratio of maximized likelihoods (RML) has been used extensively for choosing between the two distributions. The Kullback-Leibler divergence is a measure of the discrepancy between two densities. We examine the use of the Kullback-Leibler Divergence (KLD) in discriminating between the Weibull and Log-Normal distributions. An advantage of the KLD is that it incorporates the entropy of each model. We illustrate the applicability of the KLD with a real data set and establish the consistency of the KLD with the RML.
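The selection procedure described in the abstract can be sketched in a few lines. The code below is a minimal illustration, not the authors' actual method: it fits both candidate models by maximum likelihood to hypothetical data, forms the log ratio of maximized likelihoods, and estimates the KL divergence between the two fitted densities by Monte Carlo. The data set, sample sizes, and seed are all assumptions for the demonstration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical lifetime data, drawn here from a Weibull purely for illustration.
data = rng.weibull(1.5, size=200) * 100.0

# Fit both candidate models by maximum likelihood (location fixed at 0,
# as is usual for lifetime data).
wei_c, _, wei_scale = stats.weibull_min.fit(data, floc=0)
ln_s, _, ln_scale = stats.lognorm.fit(data, floc=0)

# Log ratio of maximized likelihoods: positive values favor the Weibull model.
ll_wei = stats.weibull_min.logpdf(data, wei_c, scale=wei_scale).sum()
ll_ln = stats.lognorm.logpdf(data, ln_s, scale=ln_scale).sum()
log_rml = ll_wei - ll_ln

# Monte Carlo estimate of KL(fitted Weibull || fitted Log-Normal):
# average log-density ratio under draws from the fitted Weibull.
sample = stats.weibull_min.rvs(wei_c, scale=wei_scale, size=100_000,
                               random_state=rng)
kld = np.mean(stats.weibull_min.logpdf(sample, wei_c, scale=wei_scale)
              - stats.lognorm.logpdf(sample, ln_s, scale=ln_scale))

print(f"log RML = {log_rml:.3f}, KL(Weibull || Log-Normal) ~ {kld:.4f}")
```

Both criteria point the same way in this sketch: a positive log RML and a larger KLD from the competing model both indicate the Weibull fit, which is the consistency the paper examines.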
Volume (Year): 16 (2012)
Issue (Month): 1 (May)
Contact details of provider: Web page: http://eidergisi.istanbul.edu.tr
Handle: RePEc:ist:ancoec:v:16:y:2012:i:1:p:44-54