Intuitions About Combining Opinions: Misappreciation of the Averaging Principle
Abstract
Averaging estimates is an effective way to improve accuracy when combining expert judgments, integrating group members' judgments, or using advice to modify personal judgments. If the estimates of two judges ever fall on different sides of the truth, which we term bracketing, averaging must outperform the average judge for convex loss functions, such as mean absolute deviation (MAD). We hypothesized that people often hold incorrect beliefs about averaging, falsely concluding that the average of two judges' estimates would be no more accurate than the average judge. The experiments confirmed that this misconception was common across a range of tasks that involved reasoning from summary data (Experiment 1), from specific instances (Experiment 2), and conceptually (Experiment 3). However, this misconception decreased as observed or assumed bracketing rate increased (all three studies) and when bracketing was made more transparent (Experiment 2). Experiment 4 showed that flawed inferential rules and poor extensional reasoning abilities contributed to the misconception. We conclude by describing how people may face few opportunities to learn the benefits of averaging and how misappreciating averaging contributes to poor intuitive strategies for combining estimates.
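The bracketing principle in the abstract can be illustrated numerically. The following sketch (not from the paper; the judge values are hypothetical) shows that when two estimates fall on opposite sides of the truth, their average has a strictly smaller absolute error than the average judge, whereas without bracketing averaging merely matches the average judge:

```python
def abs_error(estimate, truth):
    """Absolute deviation of an estimate from the true value."""
    return abs(estimate - truth)

truth = 100.0

# Bracketing case: the two estimates straddle the truth,
# so their errors partially cancel when averaged.
judge_a, judge_b = 90.0, 120.0
avg_estimate = (judge_a + judge_b) / 2                      # 105.0
error_of_average = abs_error(avg_estimate, truth)           # 5.0
average_error = (abs_error(judge_a, truth)
                 + abs_error(judge_b, truth)) / 2           # 15.0
assert error_of_average < average_error

# No bracketing: both estimates on the same side of the truth.
# Averaging then exactly equals the average judge's error.
judge_c, judge_d = 105.0, 115.0
same_side_avg = (judge_c + judge_d) / 2                     # 110.0
assert abs_error(same_side_avg, truth) == (
    abs_error(judge_c, truth) + abs_error(judge_d, truth)) / 2
```

Because absolute deviation is convex, the bracketing case yields a strict improvement and the same-side case a tie, so averaging can never do worse than the average judge under MAD.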
Bibliographic Info
Article provided by INFORMS in its journal Management Science.
Volume (Year): 52 (2006)
Issue (Month): 1 (January)
Keywords: averaging opinions; combining forecasts; information aggregation; advice taking; heuristics and biases
Citations
- Graefe, Andreas & Armstrong, J. Scott & Jones, Randall J. & Cuzán, Alfred G., 2014. "Combining forecasts: An application to elections," International Journal of Forecasting, Elsevier, vol. 30(1), pages 43-54.
- Ilan Fischer & Ravid Bogaire, 2012. "The Group Calibration Index: a group-based approach for assessing forecasters’ expertise when external outcome data are missing," Theory and Decision, Springer, vol. 73(4), pages 671-685, October.
- Gino, Francesca, 2008. "Do we listen to advice just because we paid for it? The impact of advice cost on its use," Organizational Behavior and Human Decision Processes, Elsevier, vol. 107(2), pages 234-245, November.
- Yaniv, Ilan & Choshen-Hillel, Shoham & Milyavsky, Maxim, 2011. "Receiving advice on matters of taste: Similarity, majority influence, and taste discrimination," Organizational Behavior and Human Decision Processes, Elsevier, vol. 115(1), pages 111-120, May.
- Aurélien Baillon & Laure Cabantous & Peter Wakker, 2012. "Aggregating imprecise or conflicting beliefs: An experimental investigation using modern ambiguity theories," Journal of Risk and Uncertainty, Springer, vol. 44(2), pages 115-147, April.
- Yannick Viossat & Andriy Zapechelnyuk, 2013. "No-regret Dynamics and Fictitious Play."
- Kameda, Tatsuya & Tsukasaki, Takafumi & Hastie, Reid & Berg, Nathan, 2010. "Democracy under uncertainty: The ‘wisdom of crowds’ and the free-rider problem in group decision making," MPRA Paper 26584, University Library of Munich, Germany.
- Karsten Hueffer & Miguel A. Fonseca & Anthony Leiserowitz & Karen M. Taylor, 2013. "The wisdom of crowds: Predicting a weather and climate-related event," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 8(2), pages 91-105, March.
- Hau, Robin & Hertwig, Ralph & Roth, Alvin E. & Stewart, Terrence & West, Robert & Lebiere, Christian & Erev, Ido & Ert, Eyal & Haruvy, Ernan & Herzog, Stefan, 2009. "A Choice Prediction Competition: Choices From Experience and From Description," Scholarly Articles 5343169, Harvard University Department of Economics.
- Ilan Yaniv & Shoham Choshen-Hillel, 2012. "When guessing what another person would say is better than giving your own opinion: Using perspective-taking to improve advice-taking," Discussion Paper Series dp622, The Center for the Study of Rationality, Hebrew University, Jerusalem.
- Soll, Jack B. & Mannes, Albert E., 2011. "Judgmental aggregation strategies depend on whether the self is involved," International Journal of Forecasting, Elsevier, vol. 27(1), pages 81-102, January.
- See, Kelly E. & Morrison, Elizabeth W. & Rothman, Naomi B. & Soll, Jack B., 2011. "The detrimental effects of power on confidence, advice taking, and accuracy," Organizational Behavior and Human Decision Processes, Elsevier, vol. 116(2), pages 272-285.
- Yaniv, Ilan & Milyavsky, Maxim, 2007. "Using advice from multiple sources to revise and improve judgments," Organizational Behavior and Human Decision Processes, Elsevier, vol. 103(1), pages 104-120, May.
- Armstrong, J. Scott & Green, Kesten C. & Graefe, Andreas, 2014. "Golden Rule of Forecasting: Be conservative," MPRA Paper 53579, University Library of Munich, Germany.
- Ilan Yaniv & Shoham Choshen-Hillel & Maxim Milyavsky, 2008. "Spurious Consensus and Opinion Revision: Why Might People Be More Confident in Their Less Accurate Judgments?," Discussion Paper Series dp492, The Center for the Study of Rationality, Hebrew University, Jerusalem.