Non-Probabilistic Decision Making with Memory Constraints
In the model of choice studied in this paper, the decision maker chooses actions non-probabilistically in each period (Sarin and Vahid, 1999; Sarin, 2000). An action is chosen if it yields the largest payoff according to the decision maker's subjective assessment. The decision maker knows nothing about the process that generates the payoffs. If the decision maker remembers only recent payoffs, she converges to the maximin action. If she remembers all past payoffs, the action with the maximal expected payoff is chosen. These results hold for any possible dynamics of the weights and are robust to mistakes. Estimates of the rate of convergence reveal that in some important cases convergence to the asymptotic behavior can take an extremely long time. The model suggests a simple experimental test of the way people memorize past experiences: if any weighted procedure is actually involved, it can generate only two distinct modes of behavior.
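The choice dynamics described above can be sketched in a short simulation. This is a minimal illustration, not the paper's exact specification: the payoff process, the initial assessment, and the particular weight functions below are assumptions chosen for the example. Each period the decision maker picks the action with the highest current assessment and updates only that action's assessment as a weighted combination of the old assessment and the realized payoff; a `1/k` weight reproduces the sample average ("remembers all past payoffs"), while a constant weight emphasizes recent payoffs.

```python
import random

def simulate(payoff_draw, n_periods, weight_fn, init=5.0, n_actions=2, seed=0):
    """Non-probabilistic choice with weighted payoff assessments.

    weight_fn(k) is the weight on the new payoff at the k-th time the
    chosen action is observed: lambda k: 1/k gives the running sample
    average (long memory); a constant such as lambda k: 0.5 makes the
    assessment track recent payoffs (short memory).
    """
    rng = random.Random(seed)
    assessments = [init] * n_actions  # optimistic start so every action gets tried
    counts = [0] * n_actions
    for _ in range(n_periods):
        # choose the action with the highest subjective assessment
        a = max(range(n_actions), key=lambda i: assessments[i])
        x = payoff_draw(a, rng)
        counts[a] += 1
        w = weight_fn(counts[a])
        # update only the chosen action's assessment
        assessments[a] = (1 - w) * assessments[a] + w * x
    return assessments

# Hypothetical payoff process: action 0 is safe (always pays 1);
# action 1 pays 3 or -2 with equal probability (expected 0.5, minimum -2).
def payoffs(action, rng):
    if action == 0:
        return 1.0
    return 3.0 if rng.random() < 0.5 else -2.0

# Long memory: assessments approach expected payoffs, so the safe action
# (expected payoff 1 > 0.5) ends up being chosen.
long_memory = simulate(payoffs, 5000, lambda k: 1.0 / k)

# Short memory: a bad recent draw on the risky action pushes the decision
# maker back to the safe action, which is also the maximin action here.
short_memory = simulate(payoffs, 5000, lambda k: 0.5)
```

With these assumed payoffs both memory regimes settle on the safe action, but for different reasons: the long-memory rule compares expected payoffs, while the short-memory rule abandons the risky action after any recent loss, which is the maximin logic the abstract describes.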
|Date of creation:||Mar 2005|
|Date of revision:||Jul 2007|
|Provider web page:||http://mpra.ub.uni-muenchen.de|
References:
- Sarin, Rajiv & Vahid, Farshid, 1999. "Payoff Assessments without Probabilities: A Simple Dynamic Model of Choice," Games and Economic Behavior, Elsevier, vol. 28(2), pages 294-309, August.
- Huck, Steffen & Sarin, Rajiv, 2004. "Players With Limited Memory," The B.E. Journal of Theoretical Economics, De Gruyter, vol. 4(1), pages 1-27, September.
- Steffen Huck & Rajiv Sarin, 2000. "Players with Limited Memory," Econometric Society World Congress 2000 Contributed Papers 1645, Econometric Society.
- Sarin, Rajiv, 2000. "Decision Rules with Bounded Memory," Journal of Economic Theory, Elsevier, vol. 90(1), pages 151-160, January.
|Handle:||RePEc:pra:mprapa:2653|