Dealing with the Complexity of Economic Calculations
This essay responds to a growing negative literature suggesting that neoclassical economic theories based on hypotheses of rationality and equilibrium are of limited practical relevance because they require an infeasibly large number of calculations. Many of the negative results are translations of abstract complexity bounds from the computer science literature. I show that these bounds do not constitute proofs that difficult economic calculations are "impossible," and I discuss the types of hardware and software that can make it possible to solve very hard problems. I discuss four different ways to break the curse of dimensionality of economic problems: 1) by exploiting special structure, 2) by decomposition, 3) by randomization, and 4) by taking advantage of "knowledge capital."

However, these four methods may not be enough. I offer some speculations on the role of decentralization in harnessing the power of massively parallel processors, and I conjecture that decentralization is an efficient "operating system" for organizing large-scale computations on massively parallel systems. Economies, immune systems, and brains are all types of massively parallel processors that use decentralization to solve difficult computational problems. However, knowledge capital, in the form of effective institutions, is necessary to ensure that decentralization leads to effective cooperation rather than anarchy and chaos.

I suggest that one reason economists have had great difficulty computing approximate solutions to detailed models of individual behavior and large-scale models of the economy is that they are not using appropriate hardware and software. Economists should structure their computations to mimic the key operating features of brains and economies, using parallel processing and decentralized "agent-based" modeling strategies to solve more realistic economic models in which solutions arise endogenously as "emergent computations."
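The randomization route mentioned above can be illustrated with a minimal Monte Carlo integration sketch (not from the paper itself): the standard error of a sample-mean estimator shrinks at rate O(1/&#8730;N) regardless of the dimension d, whereas a deterministic product grid requires a number of points that grows exponentially in d. The integrand below is an assumed toy example.

```python
import random

def mc_integrate(f, d, n, seed=0):
    """Estimate the integral of f over the unit cube [0,1]^d with n random draws.

    The accuracy depends on n and on the variance of f, but not directly on d --
    this dimension-independence of the convergence rate is what lets
    randomization sidestep the curse of dimensionality.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]  # one uniform draw from [0,1]^d
        total += f(x)
    return total / n

# Toy integrand: f(x) = sum of coordinates; its exact integral over [0,1]^d is d/2.
f = lambda x: sum(x)
for d in (2, 10, 50):
    print(d, mc_integrate(f, d, 20000), d / 2)
```

A tensor-product trapezoid rule with only 10 points per axis would already need 10^50 evaluations at d = 50; the Monte Carlo estimate above uses 20,000 draws at every dimension.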
Date of creation: 28 Oct 1996
Date of revision: 21 Oct 1997
Note: TeX file, Postscript version submitted, 45 pages
Contact details of provider: Web page: http://econwpa.repec.org