
Generalizing research findings for enhanced reproducibility: an approach based on verbal alternative representations

Author

Listed:
  • Ron S. Kenett

    (The Hebrew University of Jerusalem
    The KPA Group
    Samuel Neaman Institute)

  • Abraham Rubinstein

    (The Hebrew University of Jerusalem)

Abstract

Research aims at generating research claims. This paper introduces a "border of meaning", abbreviated BOM, as a mode of representing research findings that supplements statistical tests. The suggested approach was originally developed in a pedagogical context to promote conceptual understanding in education; here we apply it to help readers better understand the research claims stated in a scientific paper. Considering new approaches to the presentation of findings has an impact on the reproducibility of research. The BOM approach is demonstrated using examples from clinical research and translational medicine. Specifically, we map research findings into a list that draws a demarcation line between alternative representations of the research claims, some with meaning equivalence and some with only surface similarity. Such a mapping can be statistically evaluated by sign-type error tests. Our main message is that findings should be presented and generalized with a BOM.

Suggested Citation

  • Ron S. Kenett & Abraham Rubinstein, 2021. "Generalizing research findings for enhanced reproducibility: an approach based on verbal alternative representations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 4137-4151, May.
  • Handle: RePEc:spr:scient:v:126:y:2021:i:5:d:10.1007_s11192-021-03914-1
    DOI: 10.1007/s11192-021-03914-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-021-03914-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-021-03914-1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ron S. Kenett & Galit Shmueli, 2014. "On information quality," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 177(1), pages 3-38, January.
    2. Todd A. Kuffner & Stephen G. Walker, 2019. "Why are p-Values Controversial?," The American Statistician, Taylor & Francis Journals, vol. 73(1), pages 1-3, January.
    3. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pierpaolo D’Urso & Vincenzina Vitale, 2021. "Modeling Local BES Indicators by Copula-Based Bayesian Networks," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 153(3), pages 823-847, February.
    2. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    3. Antonella Lanati & Marinella Marzano & Caterina Manzari & Bruno Fosso & Graziano Pesole & Francesca De Leo, 2019. "Management at the service of research: ReOmicS, a quality management system for omics sciences," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-13, December.
    4. Joel Ferguson & Rebecca Littman & Garret Christensen & Elizabeth Levy Paluck & Nicholas Swanson & Zenan Wang & Edward Miguel & David Birke & John-Henry Pezzuto, 2023. "Survey of open science practices and attitudes in the social sciences," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    5. Bor Luen Tang, 2023. "Some Insights into the Factors Influencing Continuous Citation of Retracted Scientific Papers," Publications, MDPI, vol. 11(4), pages 1-14, October.
    6. Cheng, Yuanyuan, 2023. "A method of 3R to evaluate the correlation and predictive value of variables," OSF Preprints c79tu, Center for Open Science.
    7. Rosenblatt, Lucas & Herman, Bernease & Holovenko, Anastasia & Lee, Wonkwon & Loftus, Joshua & McKinnie, Elizabeth & Rumezhak, Taras & Stadnik, Andrii & Howe, Bill & Stoyanovich, Julia, 2023. "Epistemic parity: reproducibility as an evaluation metric for differential privacy," LSE Research Online Documents on Economics 120493, London School of Economics and Political Science, LSE Library.
    8. Inga Patarčić & Jadranka Stojanovski, 2022. "Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals," Publications, MDPI, vol. 10(4), pages 1-10, November.
    9. Susanne Wieschowski & Svenja Biernot & Susanne Deutsch & Silke Glage & André Bleich & René Tolba & Daniel Strech, 2019. "Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres," PLOS ONE, Public Library of Science, vol. 14(11), pages 1-8, November.
    10. Galit Shmueli, 2020. "Discussion on “Assessing the goodness of fit of logistic regression models in large samples: A modification of the Hosmer‐Lemeshow test” by Giovanni Nattino, Michael L. Pennell, and Stanley Lemeshow," Biometrics, The International Biometric Society, vol. 76(2), pages 561-563, June.
    11. Shinichi Nakagawa & Edward R. Ivimey-Cook & Matthew J. Grainger & Rose E. O’Dea & Samantha Burke & Szymon M. Drobniak & Elliot Gould & Erin L. Macartney & April Robin Martinig & Kyle Morrison & Matthi, 2023. "Method Reporting with Initials for Transparency (MeRIT) promotes more granularity and accountability for author contributions," Nature Communications, Nature, vol. 14(1), pages 1-5, December.
    12. Paul J. Ferraro & J. Dustin Tracy, 2022. "A reassessment of the potential for loss-framed incentive contracts to increase productivity: a meta-analysis and a real-effort experiment," Experimental Economics, Springer;Economic Science Association, vol. 25(5), pages 1441-1466, November.
    13. Brian M. Schilder & Alan E. Murphy & Nathan G. Skene, 2024. "rworkflows: automating reproducible practices for the R community," Nature Communications, Nature, vol. 15(1), pages 1-10, December.
    14. Paola Zola & Paulo Cortez & Costantino Ragno & Eugenio Brentari, 2019. "Social Media Cross-Source and Cross-Domain Sentiment Classification," International Journal of Information Technology & Decision Making (IJITDM), World Scientific Publishing Co. Pte. Ltd., vol. 18(05), pages 1469-1499, September.
    15. Tim Hulsen, 2020. "Sharing Is Caring—Data Sharing Initiatives in Healthcare," IJERPH, MDPI, vol. 17(9), pages 1-12, April.
    16. Chris H. J. Hartgerink & Marino Van Zelst, 2018. "“As-You-Go” Instead of “After-the-Fact”: A Network Approach to Scholarly Communication and Evaluation," Publications, MDPI, vol. 6(2), pages 1-10, April.
    17. Lingjing Jiang & Niina Haiminen & Anna‐Paola Carrieri & Shi Huang & Yoshiki Vázquez‐Baeza & Laxmi Parida & Ho‐Cheol Kim & Austin D. Swafford & Rob Knight & Loki Natarajan, 2022. "Utilizing stability criteria in choosing feature selection methods yields reproducible results in microbiome data," Biometrics, The International Biometric Society, vol. 78(3), pages 1155-1167, September.
    18. Joshua Borycz & Robert Olendorf & Alison Specht & Bruce Grant & Kevin Crowston & Carol Tenopir & Suzie Allard & Natalie M. Rice & Rachael Hu & Robert J. Sandusky, 2023. "Perceived benefits of open data are improving but scientists still lack resources, skills, and rewards," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-12, December.
    19. Biemer Paul & Trewin Dennis & Bergdahl Heather & Japec Lilli, 2014. "A System for Managing the Quality of Official Statistics," Journal of Official Statistics, Sciendo, vol. 30(3), pages 1-35, September.
    20. Paul-Martin Luc & Simon Bauer & Julia Kowal, 2022. "Reproducible Production of Lithium-Ion Coin Cells," Energies, MDPI, vol. 15(21), pages 1-16, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:126:y:2021:i:5:d:10.1007_s11192-021-03914-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of the provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.