An information theoretic approach to statistical dependence: copula information
Abstract: We discuss the connection between information and copula theories by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum entropy distribution. The idea of marginal invariant dependence measures is also discussed and used to show that the empirical linear correlation underestimates the amplitude of the actual correlation when the marginals are non-Gaussian. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework on a financial data set.
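As a rough illustration of the decomposition described in the abstract, the sketch below (not from the paper; the correlation value rho = 0.8, the Student-t marginals, and all variable names are illustrative assumptions) samples a Gaussian copula, imposes heavy-tailed marginals, and compares the copula correlation with the empirical linear correlation. It relies on the standard identity that the mutual information equals minus the entropy of the copula density, which for a Gaussian copula with correlation rho is I = -1/2 ln(1 - rho^2) and is invariant under monotone transforms of the marginals.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.8        # illustrative copula correlation (not a value from the paper)
n = 100_000

# Draw from a bivariate Gaussian copula with correlation rho:
# sample a bivariate normal and map each coordinate to a uniform margin.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

# Impose non-Gaussian (heavy-tailed Student-t) marginals by inverse-CDF transform;
# this changes the marginals but leaves the copula, and hence the mutual
# information, unchanged (marginal invariance).
x = stats.t.ppf(u[:, 0], df=3)
y = stats.t.ppf(u[:, 1], df=3)

# Mutual information of a Gaussian copula: I = -1/2 * ln(1 - rho^2),
# i.e. minus the entropy of the copula density.
mi = -0.5 * np.log(1.0 - rho**2)

# The empirical linear (Pearson) correlation of the heavy-tailed data is
# attenuated relative to the copula correlation rho.
pearson = np.corrcoef(x, y)[0, 1]

# Correlation implied by the mutual information under a Gaussian-copula model.
rho_from_mi = np.sqrt(1.0 - np.exp(-2.0 * mi))

print(f"copula correlation rho        : {rho:.3f}")
print(f"mutual information I (nats)   : {mi:.3f}")
print(f"empirical Pearson correlation : {pearson:.3f}  (< rho for non-Gaussian marginals)")
print(f"rho implied by I              : {rho_from_mi:.3f}")
```

Running the sketch shows the Pearson correlation falling below the copula correlation, while the mutual information, being marginal-invariant, still recovers rho exactly under the Gaussian-copula assumption.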
Bibliographic Info: Paper provided by arXiv.org in its series Papers with number 0911.4207.
Date of creation: Nov 2009
Publication status: Published in Europhys. Lett. 88, 68003 (2009)
Contact details of provider:
Web page: http://arxiv.org/