Synchronizing to the Environment: Information Theoretic Constraints on Agent Learning
Abstract
We show that the way in which the Shannon entropy of sequences produced by an information source converges to the source's entropy rate can be used to monitor how an intelligent agent builds and effectively uses a predictive model of its environment. We introduce natural measures of the environment's apparent memory and of the amounts of information that must be (i) extracted from observations for an agent to synchronize to the environment and (ii) stored by an agent for optimal prediction. If structural properties are ignored, the missed regularities are converted to apparent randomness. Conversely, using representations that assume too much memory results in false predictability.
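The convergence described in the abstract can be illustrated with a minimal sketch (not taken from the paper): the entropy-rate estimate h(L) = H(L) - H(L-1), computed from length-L block entropies of an observed sequence, decreases toward the source's true entropy rate as L grows. The choice of source here, the "golden mean" process (a binary Markov chain that never emits two 1s in a row, with entropy rate 2/3 bits per symbol), and the sample size are illustrative assumptions.

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (in bits) of the empirical distribution of length-L blocks."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Sample from the golden-mean process (an illustrative choice):
# after a 0, emit 0 or 1 with probability 1/2 each; after a 1, always emit 0.
random.seed(0)
seq, prev = [], 0
for _ in range(200_000):
    bit = 0 if prev == 1 else random.randint(0, 1)
    seq.append(bit)
    prev = bit

# h(L) = H(L) - H(L-1) approaches the true entropy rate (2/3 bits) from above.
prev_H = 0.0
for L in range(1, 8):
    H = block_entropy(seq, L)
    print(f"L={L}  h(L)={H - prev_H:.4f}")
    prev_H = H
```

Because the golden-mean process is Markov of order 1, the estimate h(L) settles near 2/3 already at L = 2; for sources with longer-range structure, the approach is slower, and the excess entropy accumulated during that approach is what the paper relates to synchronization and storage costs.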
Download Info
To our knowledge, this item is not available for download.
Bibliographic Info
Paper provided by Santa Fe Institute in its series Working Papers with number 01-03-020.
Date of creation: Mar 2001
Date of revision:
Contact details of provider:
Postal: 1399 Hyde Park Road, Santa Fe, New Mexico 87501
Web page: http://www.santafe.edu/sfi/publications/working-papers.html
More information through EDIRC