Printed from https://ideas.repec.org/a/eee/intfor/v38y2022i2p688-704.html

What do forecasting rationales reveal about thinking patterns of top geopolitical forecasters?

Authors

  • Karvetski, Christopher W.
  • Meinel, Carolyn
  • Maxwell, Daniel T.
  • Lu, Yunzi
  • Mellers, Barbara A.
  • Tetlock, Philip E.

Abstract

Geopolitical forecasting tournaments have stimulated the development of methods for improving probability judgments of real-world events. But these innovations have focused on easier-to-quantify variables, like personnel selection, training, teaming, and crowd aggregation—bypassing messier constructs, like qualitative properties of forecasters’ rationales. Here, we adapt methods from natural language processing (NLP) and computational text analysis to identify distinctive reasoning strategies in the rationales of top forecasters, including: (a) cognitive styles, such as dialectical complexity, that gauge tolerance of clashing perspectives and efforts to blend them into coherent conclusions and (b) the use of comparison classes or base rates to inform forecasts. In addition to these core metrics, we explore metrics derived from the Linguistic Inquiry and Word Count (LIWC) program. Applying these tools to multiple tournaments and to forecasters of widely varying skill (from Mechanical Turkers to carefully culled “superforecasters”) revealed that: (a) top forecasters show higher dialectical complexity in their rationales and use more comparison classes; (b) experimental interventions, like training and teaming, that boost accuracy also influence NLP profiles of rationales, nudging them in a “superforecaster” direction.
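The abstract names LIWC-style word counting and comparison-class detection as its core text metrics. As a toy illustration of the general idea (not the authors' actual pipeline, and using small hypothetical word lists rather than the licensed LIWC dictionaries), a category-frequency feature can be computed by counting how many tokens in a rationale fall in a given word list:

```python
# Illustrative sketch of LIWC-style category counting.
# The mini-dictionaries below are hypothetical stand-ins, not LIWC's.

def category_rate(text, category_words):
    """Fraction of tokens in `text` that belong to `category_words`."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t.strip(".,;:!?") in category_words)
    return hits / len(tokens)

# Hypothetical word lists for two constructs the abstract mentions:
# hedged/tentative language and comparison-class (base-rate) reasoning.
TENTATIVE = {"maybe", "perhaps", "possibly", "uncertain", "likely"}
COMPARISON = {"historically", "base", "rate", "precedent", "similar"}

rationale = (
    "Historically, similar ceasefires held in about 30% of cases, "
    "so the base rate suggests failure is likely, but perhaps "
    "diplomatic pressure changes that."
)

print(category_rate(rationale, TENTATIVE))
print(category_rate(rationale, COMPARISON))
```

Real analyses of this kind would rely on validated dictionaries and more careful tokenization; the sketch only shows the shape of a word-count feature that could then be correlated with forecasting accuracy.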

Suggested Citation

  • Karvetski, Christopher W. & Meinel, Carolyn & Maxwell, Daniel T. & Lu, Yunzi & Mellers, Barbara A. & Tetlock, Philip E., 2022. "What do forecasting rationales reveal about thinking patterns of top geopolitical forecasters?," International Journal of Forecasting, Elsevier, vol. 38(2), pages 688-704.
  • Handle: RePEc:eee:intfor:v:38:y:2022:i:2:p:688-704
    DOI: 10.1016/j.ijforecast.2021.09.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0169207021001473
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.ijforecast.2021.09.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Edgar C. Merkle & Mark Steyvers, 2013. "Choosing a Strictly Proper Scoring Rule," Decision Analysis, INFORMS, vol. 10(4), pages 292-304, December.
    2. Jonathan Baron & Barbara A. Mellers & Philip E. Tetlock & Eric Stone & Lyle H. Ungar, 2014. "Two Reasons to Make Aggregated Probability Forecasts More Extreme," Decision Analysis, INFORMS, vol. 11(2), pages 133-145, June.
    3. Victor Richmond R. Jose & Robert F. Nau & Robert L. Winkler, 2009. "Sensitivity to Distance and Baseline Distributions in Forecast Evaluation," Management Science, INFORMS, vol. 55(4), pages 582-590, April.
    4. Don A. Moore & Samuel A. Swift & Angela Minster & Barbara Mellers & Lyle Ungar & Philip Tetlock & Heather H. J. Yang & Elizabeth R. Tenney, 2017. "Confidence Calibration in a Multiyear Geopolitical Forecasting Competition," Management Science, INFORMS, vol. 63(11), pages 3552-3565, November.
    5. Gneiting, Tilmann & Raftery, Adrian E., 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    6. Atanasov, Pavel & Witkowski, Jens & Ungar, Lyle & Mellers, Barbara & Tetlock, Philip, 2020. "Small steps to accuracy: Incremental belief updaters are better forecasters," Organizational Behavior and Human Decision Processes, Elsevier, vol. 160(C), pages 19-35.
    7. Katsagounos, Ilias & Thomakos, Dimitrios D. & Litsiou, Konstantia & Nikolopoulos, Konstantinos, 2021. "Superforecasting reality check: Evidence from a small pool of experts and expedited identification," European Journal of Operational Research, Elsevier, vol. 289(1), pages 107-117.
    8. Goldstein, Daniel G. & Gigerenzer, Gerd, 2009. "Fast and frugal forecasting," International Journal of Forecasting, Elsevier, vol. 25(4), pages 760-772, October.
    9. Barbara A. Mellers & Joshua D. Baker & Eva Chen & David R. Mandel & Philip E. Tetlock, 2017. "How generalizable is good judgment? A multi-task, multi-benchmark study," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 12(4), pages 369-381, July.
    10. J. Scott Armstrong, 2005. "The Forecasting Canon: Nine Generalizations to Improve Forecast Accuracy," Foresight: The International Journal of Applied Forecasting, International Institute of Forecasters, issue 1, pages 29-35, June.
    11. Eva Chen & David V. Budescu & Shrinidhi K. Lakshmikanth & Barbara A. Mellers & Philip E. Tetlock, 2016. "Validating the Contribution-Weighted Model: Robustness and Cost-Benefit Analyses," Decision Analysis, INFORMS, vol. 13(2), pages 128-152, June.
    12. Welton Chang & Pavel Atanasov & Shefali Patil & Barbara A. Mellers & Philip E. Tetlock, 2017. "Accountability and adaptive performance under uncertainty: A long-term view," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 12(6), pages 610-626, November.
    13. Satopää, Ville A. & Baron, Jonathan & Foster, Dean P. & Mellers, Barbara A. & Tetlock, Philip E. & Ungar, Lyle H., 2014. "Combining multiple probability predictions using a simple logit model," International Journal of Forecasting, Elsevier, vol. 30(2), pages 344-356.
    14. Yuanchao Emily Bo & David V. Budescu & Charles Lewis & Philip E. Tetlock & Barbara Mellers, 2017. "An IRT forecasting model: linking proper scoring rules to item response theory," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 12(2), pages 90-103, March.
    15. Welton Chang & Eva Chen & Barbara Mellers & Philip Tetlock, 2016. "Developing expert political judgment: The impact of training and practice on judgmental accuracy in geopolitical forecasting tournaments," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 11(5), pages 509-526, September.
    16. Daniel Cross & Jaime Ramos & Barbara Mellers & Philip E. Tetlock & David W. Scott, 2018. "Robust forecast aggregation: Fourier L2E regression," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 37(3), pages 259-268, April.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Heidemann, Gerrit & Schmidt, Sascha L. & von der Gracht, Heiko A. & Beiderbeck, Daniel, 2024. "The impact of the metaverse on the future business of professional football clubs – A prospective study," Technological Forecasting and Social Change, Elsevier, vol. 208(C).
    2. Shinitzky, Hilla & Shemesh, Yhonatan & Leiser, David & Gilead, Michael, 2024. "Improving geopolitical forecasts with 100 brains and one computer," International Journal of Forecasting, Elsevier, vol. 40(3), pages 958-970.
    3. Schuler, Benedikt Alexander & Murmann, Johann Peter & Beisemann, Marie & Satopää, Ville, 2025. "Individual foresight: Concept, operationalization, and correlates," International Journal of Forecasting, Elsevier, vol. 41(4), pages 1521-1538.
    4. Atanasov, Pavel & Witkowski, Jens & Mellers, Barbara & Tetlock, Philip, 2025. "Crowd prediction systems: Markets, polls, and elite forecasters," International Journal of Forecasting, Elsevier, vol. 41(2), pages 580-595.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Schuler, Benedikt Alexander & Murmann, Johann Peter & Beisemann, Marie & Satopää, Ville, 2025. "Individual foresight: Concept, operationalization, and correlates," International Journal of Forecasting, Elsevier, vol. 41(4), pages 1521-1538.
    2. Karimi Motahhar, Vahid & Gruca, Thomas S., 2025. "How does training improve individual forecasts? Modeling differences in compensatory and non-compensatory biases in geopolitical forecasts," International Journal of Forecasting, Elsevier, vol. 41(2), pages 487-498.
    3. Edgar C. Merkle & Robert Hartman, 2018. "Weighted Brier score decompositions for topically heterogenous forecasting tournaments," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 13(2), pages 185-201, March.
    4. Atanasov, Pavel & Witkowski, Jens & Mellers, Barbara & Tetlock, Philip, 2025. "Crowd prediction systems: Markets, polls, and elite forecasters," International Journal of Forecasting, Elsevier, vol. 41(2), pages 580-595.
    5. David R. Mandel & Daniel Irwin, 2021. "Tracking accuracy of strategic intelligence forecasts: Findings from a long‐term Canadian study," Futures & Foresight Science, John Wiley & Sons, vol. 3(3-4), September.
    6. Ross Gruetzemacher & Kang Bok Lee & David Paradice, 2024. "Calibration training for improving probabilistic judgments using an interactive app," Futures & Foresight Science, John Wiley & Sons, vol. 6(2), June.
    7. Hassoun, Zane & MacKay, Niall & Powell, Ben, 2026. "Kairosis: A method for dynamical probability forecast aggregation informed by Bayesian change-point detection," International Journal of Forecasting, Elsevier, vol. 42(1), pages 112-125.
    8. Philip E. Tetlock & Christopher Karvetski & Ville A. Satopää & Kevin Chen, 2024. "Long‐range subjective‐probability forecasts of slow‐motion variables in world politics: Exploring limits on expert judgment," Futures & Foresight Science, John Wiley & Sons, vol. 6(1), March.
    9. Ying Han & David Budescu, 2019. "A universal method for evaluating the quality of aggregators," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 14(4), pages 395-411, July.
    10. Ville A. Satopää & Marat Salikhov & Philip E. Tetlock & Barbara Mellers, 2021. "Bias, Information, Noise: The BIN Model of Forecasting," Management Science, INFORMS, vol. 67(12), pages 7599-7618, December.
    11. Satopää, Ville A., 2021. "Improving the wisdom of crowds with analysis of variance of predictions of related outcomes," International Journal of Forecasting, Elsevier, vol. 37(4), pages 1728-1747.
    12. Ho, Emily H. & Budescu, David V. & Himmelstein, Mark, 2025. "Measuring probabilistic coherence to identify superior forecasters," International Journal of Forecasting, Elsevier, vol. 41(2), pages 596-612.
    13. Satopää, Ville A. & Salikhov, Marat & Tetlock, Philip E. & Mellers, Barbara, 2023. "Decomposing the effects of crowd-wisdom aggregators: The bias–information–noise (BIN) model," International Journal of Forecasting, Elsevier, vol. 39(1), pages 470-485.
    14. Atanasov, Pavel & Witkowski, Jens & Ungar, Lyle & Mellers, Barbara & Tetlock, Philip, 2020. "Small steps to accuracy: Incremental belief updaters are better forecasters," Organizational Behavior and Human Decision Processes, Elsevier, vol. 160(C), pages 19-35.
    15. Patrick Afflerbach & Christopher Dun & Henner Gimpel & Dominik Parak & Johannes Seyfried, 2021. "A Simulation-Based Approach to Understanding the Wisdom of Crowds Phenomenon in Aggregating Expert Judgment," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 63(4), pages 329-348, August.
    16. Peker, Cem & Wilkening, Tom, 2025. "Robust recalibration of aggregate probability forecasts using meta-beliefs," International Journal of Forecasting, Elsevier, vol. 41(2), pages 613-630.
    17. Ville A. Satopää & Robin Pemantle & Lyle H. Ungar, 2016. "Modeling Probability Forecasts via Information Diversity," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1623-1633, October.
    18. Wheatcroft Edward, 2021. "Evaluating probabilistic forecasts of football matches: the case against the ranked probability score," Journal of Quantitative Analysis in Sports, De Gruyter, vol. 17(4), pages 273-287, December.
    19. Constantinou Anthony Costa & Fenton Norman Elliott, 2012. "Solving the Problem of Inadequate Scoring Rules for Assessing Probabilistic Football Forecast Models," Journal of Quantitative Analysis in Sports, De Gruyter, vol. 8(1), pages 1-14, March.
    20. David R. Mandel, 2020. "Studies past and future of the past and future: Commentary on Schoemaker 2020," Futures & Foresight Science, John Wiley & Sons, vol. 2(3-4), September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:intfor:v:38:y:2022:i:2:p:688-704. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/ijforecast.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.