
The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge

Author

  • Fanelli, Daniele

Abstract

Scientists' ability to integrate diverse forms of evidence and evaluate how well they can explain and predict phenomena, in other words, $\textit{to know how much they know}$, struggles to keep pace with technological innovation. Central to the challenge of extracting knowledge from data is the need to develop a metric of knowledge itself. A candidate metric of knowledge, $K$, was recently proposed by the author. This essay further advances and integrates that proposal by developing a methodology to measure its key variable, symbolized with the Greek letter $\tau$ ("tau"). It will be shown how a $\tau$ can represent the description of any phenomenon, any theory to explain it, and any methodology to study it, allowing the knowledge about that phenomenon to be measured with $K$. To illustrate potential applications, the essay calculates $\tau$ and $K$ values of: logical syllogisms and proofs, mathematical calculations, empirical quantitative knowledge, statistical model selection problems, including how to correct for "forking paths" and "P-hacking" biases, randomised controlled experiments, reproducibility and replicability, qualitative analyses via process tracing, and mixed quantitative and qualitative evidence. Whilst preliminary in many respects, these results suggest that $K$ theory offers a meaningful understanding of knowledge, which makes testable metascientific predictions and which may be used to analyse and integrate qualitative and quantitative evidence to tackle complex problems.
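
Among these applications, process tracing translates qualitative case evidence into explicit probability updates. As a minimal illustrative sketch of that kind of updating, assuming standard Bayesian process tracing rather than the paper's own $\tau$ and $K$ calculations (the function name and all numbers below are hypothetical):

    # Minimal Bayesian process-tracing sketch; illustrative only, not the
    # paper's own tau/K method. Two rival hypotheses H1 and H2; each clue E
    # carries a likelihood ratio P(E|H1)/P(E|H2) that updates the odds.
    def update_odds(prior_odds, likelihood_ratios):
        """Multiply the prior odds on H1 over H2 by each clue's likelihood ratio."""
        odds = prior_odds
        for lr in likelihood_ratios:
            odds *= lr
        return odds

    prior_odds = 1.0               # H1 and H2 judged equally plausible a priori
    clues = [4.0, 0.5, 3.0]        # hypothetical likelihood ratios for three clues
    posterior_odds = update_odds(prior_odds, clues)
    posterior_p_h1 = posterior_odds / (1.0 + posterior_odds)
    print(f"odds H1:H2 = {posterior_odds:.1f}, P(H1 | clues) = {posterior_p_h1:.2f}")
    # prints: odds H1:H2 = 6.0, P(H1 | clues) = 0.86

On this reading, each clue shifts the odds between rival explanations multiplicatively; the essay's stated aim is to integrate such qualitative updates with quantitative evidence under the same $K$ measure.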

Suggested Citation

  • Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
  • Handle: RePEc:osf:metaar:67sak
    DOI: 10.31219/osf.io/67sak

    Download full text from publisher

    File URL: https://osf.io/download/61d7ffe5da63201206fe6b5a/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/67sak?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one. A minimal sketch of this matching heuristic follows the list below.
    1. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    2. Alejandro Avenburg & John Gerring & Jason Seawright, 2023. "How do social scientists reach causal inferences? A study of reception," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(1), pages 257-275, February.
    3. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    4. Martin Rabbia, 2023. "Why did Argentina and Uruguay decide to pursue a carbon tax? Fiscal reforms and explicit carbon prices," Review of Policy Research, Policy Studies Organization, vol. 40(2), pages 230-259, March.
    5. Fernández Milmanda, Belén & Garay, Candelaria, 2019. "Subnational variation in forest protection in the Argentine Chaco," World Development, Elsevier, vol. 118(C), pages 79-90.
    6. Brandão, Frederico & Befani, Barbara & Soares-Filho, Jaílson & Rajão, Raoni & Garcia, Edenise, 2023. "How to halt deforestation in the Amazon? A Bayesian process-tracing approach," Land Use Policy, Elsevier, vol. 133(C).
    7. Fišar, Miloš & Greiner, Ben & Huber, Christoph & Katok, Elena & Ozkes, Ali & Management Science Reproducibility Collaboration, 2023. "Reproducibility in Management Science," Department for Strategy and Innovation Working Paper Series 03/2023, WU Vienna University of Economics and Business.
    8. Fairfield, Tasha & Charman, Andrew, 2019. "A Dialogue with the Data: the Bayesian foundations of iterative research in qualitative social science," LSE Research Online Documents on Economics 89261, London School of Economics and Political Science, LSE Library.
    9. Alvarado, Miriam & Penney, Tarra L. & Unwin, Nigel & Murphy, Madhuvanti M. & Adams, Jean, 2021. "Evidence of a health risk ‘signalling effect’ following the introduction of a sugar-sweetened beverage tax," Food Policy, Elsevier, vol. 102(C).
    10. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    11. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    12. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    13. David A. Bateman & Dawn Langan Teele, 2020. "A developmental approach to historical causal inference," Public Choice, Springer, vol. 185(3), pages 253-279, December.
    14. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    15. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," I4R Discussion Paper Series 38, The Institute for Replication (I4R).
    16. Amengual, Matthew, 2018. "Buying stability: The distributive outcomes of private politics in the Bolivian mining industry," World Development, Elsevier, vol. 104(C), pages 31-45.
    17. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    18. Blair, Graeme & Cooper, Jasper & Coppock, Alexander & Humphreys, Macartan, 2019. "Declaring and Diagnosing Research Designs," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 113(3), pages 838-859.
    19. Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
    20. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
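
    As noted above, relatedness of this kind can be sketched as a plain overlap count over shared cited works and shared citing works. The function name and identifiers below are hypothetical, and this is an assumed simplification of how such rankings are computed:

        # Rank candidate items by how many cited works and citing works they
        # share with the target item; all identifiers are illustrative.
        def relatedness(target_refs, target_citers, item_refs, item_citers):
            """Count shared cited works plus shared citing works."""
            return len(target_refs & item_refs) + len(target_citers & item_citers)

        target_refs, target_citers = {"A", "B", "C"}, {"X", "Y"}
        item_refs, item_citers = {"B", "C", "D"}, {"Y", "Z"}
        print(relatedness(target_refs, target_citers, item_refs, item_citers))  # -> 3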



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.