Unique additive information measures—Boltzmann–Gibbs–Shannon, Fisher and beyond
Abstract
It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann–Gibbs–Shannon and Fisher information measures. Power-law equilibrium distributions arise from the interplay of the two terms. The case of second-order derivative dependence is also investigated, and a corresponding additive information measure is given.
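As a minimal numerical sketch (not taken from the paper), the two measures the abstract combines can be evaluated for a Gaussian test density, where both have known closed forms: the Boltzmann–Gibbs–Shannon entropy is ½ ln(2πeσ²) and the Fisher information is 1/σ². The grid size and the choice of test density are illustrative assumptions.

```python
import numpy as np

# Test density: Gaussian with standard deviation sigma (an illustrative choice).
# BGS entropy:       S[f] = -∫ f ln f dx
# Fisher information I[f] =  ∫ (f')^2 / f dx
sigma = 1.5
x = np.linspace(-12, 12, 200001)
dx = x[1] - x[0]
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Riemann sums on a fine grid; f vanishes at the endpoints, so these
# are accurate to O(dx^2).
S = -np.sum(f * np.log(f)) * dx                   # analytic: 0.5*ln(2*pi*e*sigma^2)
I = np.sum(np.gradient(f, dx)**2 / f) * dx        # analytic: 1/sigma^2

print(S, 0.5 * np.log(2 * np.pi * np.e * sigma**2))
print(I, 1 / sigma**2)
```

A linear combination S + λI of these two quantities is the form singled out by the uniqueness result stated in the abstract.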
Bibliographic Info
Article provided by Elsevier in its journal Physica A: Statistical Mechanics and its Applications.
Volume: 365 (2006), Issue: 1
Contact details of provider:
Web page: http://www.journals.elsevier.com/physica-a-statistical-mechpplications/
Keywords: Fisher information; Non-extensive statistics; Additivity; Schrödinger–Madelung equation
References
- Kaniadakis, G. & Lissia, M. & Scarfone, A.M., 2004. "Deformed logarithms and entropies," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 340(1), pages 41-49.
- Plastino, A.R. & Plastino, A., 1995. "Non-extensive statistical mechanics and generalized Fokker-Planck equation," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 222(1), pages 347-354.
- Kaniadakis, G., 2001. "Non-linear kinetics underlying generalized statistics," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 296(3), pages 405-425.