Printed from https://ideas.repec.org/a/bpj/sagmbi/v10y2011i1n35.html

Measurement of Evidence and Evidence of Measurement

Authors

  • Veronica J. Vieland (The Research Institute at Nationwide Children’s Hospital and The Ohio State University)
  • Susan E. Hodge (New York State Psychiatric Institute and Columbia University)

Abstract

One important use of statistical methods in application to biological data is measurement of evidence, or assessment of the degree to which data support one or another hypothesis. While there is a small literature on this topic, it seems safe to say that consensus has not yet been reached regarding how best, or most accurately, to measure statistical evidence. Here, we propose considering the problem as a measurement problem, rather than as a statistical problem per se, and we explore the consequences of this shift in perspective. Our arguments here are part of an ongoing research program focused on exploiting deep parallelisms between foundations of thermodynamics and foundations of “evidentialism,” in order to derive an absolute scale for the measurement of evidence, a general framework in the context of which that scale is validated, and the many ancillary benefits that come from having such a framework in place.

Suggested Citation

  • Vieland Veronica J & Hodge Susan E, 2011. "Measurement of Evidence and Evidence of Measurement," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 10(1), pages 1-11, July.
  • Handle: RePEc:bpj:sagmbi:v:10:y:2011:i:1:n:35
    DOI: 10.2202/1544-6115.1682

    Download full text from publisher

    File URL: https://doi.org/10.2202/1544-6115.1682
    Download Restriction: For access to full text, subscription to the journal or payment for the individual article is required.


    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Kimberly A Walters & Yungui Huang & Marco Azaro & Kathleen Tobin & Thomas Lehner & Linda M Brzustowicz & Veronica J Vieland, 2014. "Meta-Analysis of Repository Data: Impact of Data Regularization on NIMH Schizophrenia Linkage Results," PLOS ONE, Public Library of Science, vol. 9(1), pages 1-8, January.
    2. Veronica J Vieland & Sang-Cheol Seok, 2021. "The PPLD has advantages over conventional regression methods in application to moderately sized genome-wide association studies," PLOS ONE, Public Library of Science, vol. 16(9), pages 1-22, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 48(1), pages 62-83.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Thierry Poynard & Dominique Thabut & Mona Munteanu & Vlad Ratziu & Yves Benhamou & Olivier Deckmyn, 2010. "Hirsch Index and Truth Survival in Clinical Research," PLOS ONE, Public Library of Science, vol. 5(8), pages 1-10, August.
    4. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    5. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    6. Stephen Fox, 2016. "Dismantling The Box — Applying Principles For Reducing Preconceptions During Ideation," International Journal of Innovation Management (ijim), World Scientific Publishing Co. Pte. Ltd., vol. 20(06), pages 1-27, August.
    7. Amanda Fitzgerald & Naoise Mac Giollabhui & Louise Dolphin & Robert Whelan & Barbara Dooley, 2018. "Dissociable psychosocial profiles of adolescent substance users," PLOS ONE, Public Library of Science, vol. 13(8), pages 1-16, August.
    8. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    9. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    10. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    11. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    12. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    13. Emilija Stojmenova Duh & Andrej Duh & Uroš Droftina & Tim Kos & Urban Duh & Tanja Simonič Korošak & Dean Korošak, 2019. "Publish-and-Flourish: Using Blockchain Platform to Enable Cooperative Scholarly Communication," Publications, MDPI, vol. 7(2), pages 1-15, May.
    14. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    15. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    16. Anton, Roman, 2014. "Sustainable Intrapreneurship - The GSI Concept and Strategy - Unfolding Competitive Advantage via Fair Entrepreneurship," MPRA Paper 69713, University Library of Munich, Germany, revised 01 Feb 2015.
    17. W. Robert Reed, 2018. "A Primer on the ‘Reproducibility Crisis’ and Ways to Fix It," Australian Economic Review, The University of Melbourne, Melbourne Institute of Applied Economic and Social Research, vol. 51(2), pages 286-300, June.
    18. Dudek, Thomas & Brenøe, Anne Ardila & Feld, Jan & Rohrer, Julia, 2022. "No Evidence That Siblings' Gender Affects Personality across Nine Countries," IZA Discussion Papers 15137, Institute of Labor Economics (IZA).
    19. Kraft-Todd, Gordon T. & Rand, David G., 2021. "Practice what you preach: Credibility-enhancing displays and the growth of open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 164(C), pages 1-10.
    20. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bpj:sagmbi:v:10:y:2011:i:1:n:35. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Peter Golla (email available below). General contact details of provider: https://www.degruyter.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.