Partial monotonicity of entropy measures
The quantification of entropy is prominent in a diverse range of fields, including information theory, quantum mechanics, thermodynamics, ecology, evolutionary biology and even sociology. Suppose we interpret the entropy of a random object as a measure of the uncertainty about its outcome. This measure is expected to decrease when the object's outcome is confined to a shrinking interval. Entropies conforming to this intuition are thus sensible and likely useful measures of uncertainty. In this paper, we give a necessary and sufficient condition for the Shannon entropy of an absolutely continuous random variable to be an increasing function of the conditioning interval. Similar results are obtained for the Rényi entropy of absolutely continuous random variables and their convolutions.
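As a quick numerical illustration of the monotonicity property discussed in the abstract (this sketch is not from the paper itself; the `truncated_entropy` helper and the standard-normal example are our own assumptions), the Shannon entropy of a variable conditioned on an interval can be approximated by quadrature, and for a well-behaved density it grows as the interval widens:

```python
import math

def truncated_entropy(pdf, a, b, n=10000):
    """Shannon (differential) entropy of X conditioned on X in [a, b]:
    H(X | X in [a, b]) = -integral of g * log(g), where
    g(x) = f(x) / P(a <= X <= b), approximated by the midpoint rule."""
    h = (b - a) / n
    # normalising mass P(a <= X <= b)
    mass = sum(pdf(a + (i + 0.5) * h) for i in range(n)) * h
    ent = 0.0
    for i in range(n):
        g = pdf(a + (i + 0.5) * h) / mass  # conditional density
        if g > 0:
            ent -= g * math.log(g) * h
    return ent

# Hypothetical example: standard normal density (log-concave,
# so the conditional entropy increases with the interval).
phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

print(truncated_entropy(phi, -1, 1))  # narrower interval
print(truncated_entropy(phi, -2, 2))  # wider interval, larger entropy
```

For a uniform density the conditional entropy on [a, b] is exactly log(b - a), which the quadrature reproduces and which is plainly increasing in the interval.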
Volume (Year): 82 (2012)
Issue: 11
Handle: RePEc:eee:stapro:v:82:y:2012:i:11:p:1935-1940