Unique additive information measures—Boltzmann–Gibbs–Shannon, Fisher and beyond
It is proved that the only additive and isotropic information measure that can depend on the probability distribution and on its first derivative is a linear combination of the Boltzmann–Gibbs–Shannon and Fisher information measures. Power-law equilibrium distributions arise from the interplay of the two terms. The case of second-order derivative dependence is also investigated, and a corresponding additive information measure is given.
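The two measures combined in the abstract can be written as \(S = -\int p \ln p\, dx\) (Boltzmann–Gibbs–Shannon) and \(I_F = \int (p')^2 / p\, dx\) (Fisher). A minimal numerical sketch, assuming a Gaussian test density and an illustrative grid, evaluates both and checks them against the known closed forms for a Gaussian, \(S = \tfrac12 \ln(2\pi e \sigma^2)\) and \(I_F = 1/\sigma^2\):

```python
import numpy as np

# Illustrative parameters (not from the paper): a Gaussian density on a grid.
sigma = 1.5
x = np.linspace(-12.0, 12.0, 20001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Boltzmann-Gibbs-Shannon entropy: S = -∫ p ln p dx (simple Riemann sum).
S = -np.sum(p * np.log(p)) * dx

# Fisher information measure: I_F = ∫ (p')^2 / p dx,
# with p' approximated by central differences.
dp = np.gradient(p, dx)
I_F = np.sum(dp**2 / p) * dx

# Compare with the Gaussian closed forms.
print(S, 0.5 * np.log(2 * np.pi * np.e * sigma**2))
print(I_F, 1 / sigma**2)
```

A linear combination \(S + \lambda I_F\) of these two functionals is the unique additive, isotropic measure the paper characterizes; the sketch only illustrates the two ingredients, not the variational argument.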
Volume (Year): 365 (2006)
Issue: 1
Pages: 28-33
Handle: RePEc:eee:phsmap:v:365:y:2006:i:1:p:28-33