Interpreting Kullback-Leibler divergence with the Neyman-Pearson lemma
Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both are about likelihood ratios: Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma is about the error rates of likelihood ratio tests. Exploring this connection gives another statistical interpretation of the Kullback-Leibler divergence, in terms of the loss of power of the likelihood ratio test when the wrong distribution is used for one of the hypotheses. In this interpretation, the standard non-negativity property of the Kullback-Leibler divergence is essentially a restatement of the optimality of likelihood ratios established by the Neyman-Pearson lemma. The asymmetry of Kullback-Leibler divergence is then reviewed from the viewpoint of information geometry.
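For reference, the two objects being connected can be written out explicitly; the following is the standard textbook formulation, not an excerpt from the paper. For probability densities \(f\) and \(g\), the Kullback-Leibler divergence is the expected log-likelihood ratio under \(f\),

\[
D(f \,\|\, g) \;=\; \mathbb{E}_f\!\left[\log\frac{f(X)}{g(X)}\right] \;=\; \int f(x)\,\log\frac{f(x)}{g(x)}\,dx \;\ge\; 0,
\]

while the Neyman-Pearson lemma states that, for testing \(H_0: X \sim g\) against \(H_1: X \sim f\), the test that rejects when the likelihood ratio exceeds a threshold,

\[
\frac{f(x)}{g(x)} \;>\; c_\alpha \qquad (c_\alpha \text{ chosen so that the test has size } \alpha),
\]

maximizes power among all tests of size \(\alpha\). On this reading, substituting a misspecified density for \(f\) or \(g\) in the ratio produces a suboptimal test, and the resulting loss of power is the quantity the paper relates to the divergence; the non-negativity of \(D(f \,\|\, g)\) then mirrors the optimality of the true likelihood ratio.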
Journal: Journal of Multivariate Analysis
Volume (Year): 97 (2006)
Issue (Month): 9 (October)
Pages: 2034-2040