The Asymptotic Loss of Information for Grouped Data
We study the loss of information (measured by the Kullback-Leibler distance) incurred by observing "grouped" data, i.e., only a discretized version of a continuous random variable. We analyze the asymptotic behaviour of this loss of information as the partition becomes finer. In the case of a univariate observation, we compute the optimal rate of convergence and characterize asymptotically optimal partitions (into intervals). In the multivariate case we derive the asymptotically optimal regular sequences of partitions. Furthermore, we compute the asymptotically optimal transformation of the data when a sequence of partitions is given. Examples demonstrate the efficiency of the suggested discretization strategy even for a small number of intervals.
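To illustrate the quantity studied, here is a minimal numerical sketch (not the paper's construction): the Kullback-Leibler distance between two normal densities is compared with the KL distance between their grouped (multinomial) versions on an interval partition. The choice of two unit-variance normals, the partition of [-6, 6] into equal cells, and all function names are illustrative assumptions; by the data-processing inequality the grouped distance is below the continuous one, and the gap (the loss of information) shrinks as the partition becomes finer.

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def grouped_kl(breaks, mu_p, mu_q, sigma=1.0):
    """KL distance between the grouped versions of N(mu_p, sigma^2)
    and N(mu_q, sigma^2) on the partition defined by `breaks`
    (interior cut points; the two unbounded tails are included)."""
    edges = [-math.inf] + list(breaks) + [math.inf]
    total = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        p = norm_cdf(b, mu_p, sigma) - norm_cdf(a, mu_p, sigma)
        q = norm_cdf(b, mu_q, sigma) - norm_cdf(a, mu_q, sigma)
        if p > 0.0:
            total += p * math.log(p / q)
    return total

# Exact KL(N(0,1) || N(0.5,1)) = (0.5)**2 / 2 = 0.125.
exact_kl = 0.125
for k in (4, 16, 64):
    # k equal cells on [-6, 6], plus the two tails.
    breaks = [-6.0 + 12.0 * i / k for i in range(1, k)]
    loss = exact_kl - grouped_kl(breaks, 0.0, 0.5)
    print(f"{k:3d} cells: information loss = {loss:.6f}")
```

Refining the partition (here, nested equal-cell partitions) can only increase the grouped KL distance, so the printed loss decreases monotonically toward zero.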
Volume (Year): 67 (1998)
Issue (Month): 1 (October)
Handle: RePEc:eee:jmvana:v:67:y:1998:i:1:p:99-127