Printed from https://ideas.repec.org/a/spr/rvmgts/v17y2023i8d10.1007_s11846-022-00602-z.html

Understanding crowdsourcing in science

Authors

  • Regina Lenart-Gansiniec

    (Jagiellonian University in Krakow)

  • Wojciech Czakon

    (Jagiellonian University in Krakow)

  • Łukasz Sułkowski

    (Jagiellonian University in Krakow)

  • Jasna Pocek

    (Free University of Bozen-Bolzano
    Blekinge Institute of Technology)

Abstract

Over the past 16 years, the concept of crowdsourcing has rapidly gained traction across many research fields. While related debates have focused mainly on its importance for business, the public sector, and non-governmental organizations, its relevance for generating scientific knowledge is increasingly emphasized. This rising interest contrasts with the concept's weak recognition and with excessive simplifications that reduce crowdsourcing in science to citizen science. Conceptual clarity and a coherent framework would help integrate the various research streams. The aim of this paper is to extend reflection on crowdsourcing in science by analyzing the characteristics of the phenomenon. We synthesize a consensual definition from the literature and structure key characteristics into a coherent framework useful in guiding further research. We use a systematic literature review procedure to generate a pool of 42 definitions from a comprehensive set of 62 articles spanning several literatures, including business and economics, education, psychology, biology, and communication studies. We follow a mixed-methods approach that combines bibliometric and frequency analyses with deductive coding and thematic analysis. Based on the triangulated results, we develop an integrative definition: crowdsourcing in science is a collaborative online process through which scientists involve a group of self-selected individuals with varying, diverse knowledge and skills, via an open call on the Internet and/or online platforms, to undertake a specified research task or set of tasks. We also provide a conceptual framework that covers four key characteristics: initiator, crowd, process, and technology.
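The frequency analysis mentioned in the abstract can be illustrated with a minimal sketch. The definition texts below are hypothetical stand-ins, not the paper's actual 42-definition pool, and the tokenizer and stop-word list are simplifying assumptions:

```python
from collections import Counter
import re

# Hypothetical stand-ins for entries in a definition pool;
# the paper's actual 42 definitions are not reproduced here.
definitions = [
    "an open call to a crowd of individuals to perform a research task online",
    "a collaborative online process engaging self-selected volunteers in science",
    "outsourcing a scientific task to an undefined crowd via an online platform",
]

STOP_WORDS = {"a", "an", "the", "to", "of", "in", "via"}

def term_frequencies(texts):
    """Count how often each non-stop-word term appears across all texts."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(t for t in tokens if t not in STOP_WORDS)
    return counts

freq = term_frequencies(definitions)
# Terms recurring across definitions (e.g. "online", "crowd", "task")
# surface as candidate building blocks for an integrative definition.
print(freq.most_common(5))
```

In this toy example, "online" appears in all three definitions, mirroring how recurring terms across the literature feed into the consensual definition the authors derive.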

Suggested Citation

  • Regina Lenart-Gansiniec & Wojciech Czakon & Łukasz Sułkowski & Jasna Pocek, 2023. "Understanding crowdsourcing in science," Review of Managerial Science, Springer, vol. 17(8), pages 2797-2830, November.
  • Handle: RePEc:spr:rvmgts:v:17:y:2023:i:8:d:10.1007_s11846-022-00602-z
    DOI: 10.1007/s11846-022-00602-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11846-022-00602-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11846-022-00602-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Livio Cricelli & Michele Grimaldi & Silvia Vermicelli, 2022. "Crowdsourcing and open innovation: a systematic literature review, an integrated framework and a research agenda," Review of Managerial Science, Springer, vol. 16(5), pages 1269-1310, July.
    2. Pedro Jurado de los Santos & Antonio-José Moreno-Guerrero & José-Antonio Marín-Marín & Rebeca Soler Costa, 2020. "The Term Equity in Education: A Literature Review with Scientific Mapping in Web of Science," IJERPH, MDPI, vol. 17(10), pages 1-17, May.
    3. Roman Lukyanenko & Jeffrey Parsons & Binny M. Samuel, 2019. "Representing instances: the case for reengineering conceptual modelling grammars," European Journal of Information Systems, Taylor & Francis Journals, vol. 28(1), pages 68-90, January.
    4. Rey-Martí, Andrea & Ribeiro-Soriano, Domingo & Palacios-Marqués, Daniel, 2016. "A bibliometric analysis of social entrepreneurship," Journal of Business Research, Elsevier, vol. 69(5), pages 1651-1655.
    5. Raphael Silberzahn & Eric L. Uhlmann, 2015. "Crowdsourced research: Many hands make tight work," Nature, Nature, vol. 526(7572), pages 189-191, October.
    6. Sascha Friesike & Bastian Widenmayer & Oliver Gassmann & Thomas Schildhauer, 2015. "Opening science: towards an agenda of open science in academia and industry," The Journal of Technology Transfer, Springer, vol. 40(4), pages 581-601, August.
    7. Jennifer Edgar & Joe Murphy & Michael Keating, 2016. "Comparing Traditional and Crowdsourcing Methods for Pretesting Survey Questions," SAGE Open, , vol. 6(4), pages 21582440166, October.
    8. Sascha Kraus & Matthias Breier & Weng Marc Lim & Marina Dabić & Satish Kumar & Dominik Kanbach & Debmalya Mukherjee & Vincenzo Corvello & Juan Piñeiro-Chousa & Eric Liguori & Daniel Palacios-Marqués et al., 2022. "Literature reviews as independent studies: guidelines for academic practice," Review of Managerial Science, Springer, vol. 16(8), pages 2577-2595, November.
    9. Roman Lukyanenko & Andrea Wiggins & Holly K. Rosser, 2020. "Citizen Science: An Information Quality Research Frontier," Information Systems Frontiers, Springer, vol. 22(4), pages 961-983, August.
    10. Greco, Marco & Grimaldi, Michele & Cricelli, Livio, 2016. "An analysis of the open innovation effect on firm performance," European Management Journal, Elsevier, vol. 34(5), pages 501-516.
    11. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    12. Palacios, Miguel & Martinez-Corral, Alberto & Nisar, Arsalan & Grijalvo, Mercedes, 2016. "Crowdsourcing and organizational forms: Emerging trends and research implications," Journal of Business Research, Elsevier, vol. 69(5), pages 1834-1839.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Marta Ortiz-de-Urbina-Criado & Juan-José Nájera-Sánchez & Eva-María Mora-Valentín, 2018. "A Research Agenda on Open Innovation and Entrepreneurship: A Co-Word Analysis," Administrative Sciences, MDPI, vol. 8(3), pages 1-17, July.
    2. Hyeon Jo & Youngsok Bang, 2023. "Factors influencing continuance intention of participants in crowdsourcing," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-13, December.
    3. Jonas Van Lancker & Erwin Wauters & Guido Van Huylenbroeck, 2019. "Open Innovation In Public Research Institutes — Success And Influencing Factors," International Journal of Innovation Management (ijim), World Scientific Publishing Co. Pte. Ltd., vol. 23(07), pages 1-37, October.
    4. Robbett, Andrea & Matthews, Peter Hans, 2018. "Partisan bias and expressive voting," Journal of Public Economics, Elsevier, vol. 157(C), pages 107-120.
    5. Inmaculada Buendía-Martínez & Inmaculada Carrasco Monteagudo, 2020. "The Role of CSR on Social Entrepreneurship: An International Analysis," Sustainability, MDPI, vol. 12(17), pages 1-22, August.
    6. Livio Cricelli & Michele Grimaldi & Silvia Vermicelli, 2022. "Crowdsourcing and open innovation: a systematic literature review, an integrated framework and a research agenda," Review of Managerial Science, Springer, vol. 16(5), pages 1269-1310, July.
    7. Mattozzi, Andrea & Snowberg, Erik, 2018. "The right type of legislator: A theory of taxation and representation," Journal of Public Economics, Elsevier, vol. 159(C), pages 54-65.
    8. Jasper Grashuis & Theodoros Skevas & Michelle S. Segovia, 2020. "Grocery Shopping Preferences during the COVID-19 Pandemic," Sustainability, MDPI, vol. 12(13), pages 1-10, July.
    9. Jeanette A.M.J. Deetlefs & Mathew Chylinski & Andreas Ortmann, 2015. "MTurk ‘Unscrubbed’: Exploring the good, the ‘Super’, and the unreliable on Amazon’s Mechanical Turk," Discussion Papers 2015-20, School of Economics, The University of New South Wales.
    10. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    11. John Hulland & Jeff Miller, 2018. "“Keep on Turkin’”?," Journal of the Academy of Marketing Science, Springer, vol. 46(5), pages 789-794, September.
    12. Nguyen Thi Canh & Nguyen Thanh Liem & Phung Anh Thu & Nguyen Vinh Khuong, 2019. "The Impact of Innovation on the Firm Performance and Corporate Social Responsibility of Vietnamese Manufacturing Firms," Sustainability, MDPI, vol. 11(13), pages 1-14, July.
    13. Kyungsik Han, 2018. "How do you perceive this author? Understanding and modeling authors’ communication quality in social media," PLOS ONE, Public Library of Science, vol. 13(2), pages 1-25, February.
    14. Mahavarpour, Nasrin & Marvi, Reza & Foroudi, Pantea, 2023. "A Brief History of Service Innovation: The evolution of past, present, and future of service innovation," Journal of Business Research, Elsevier, vol. 160(C).
    15. Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
    16. Barton, Jared & Pan, Xiaofei, 2022. "Movin’ on up? A survey experiment on mobility enhancing policies," European Journal of Political Economy, Elsevier, vol. 74(C).
    17. Huet-Vaughn, Emiliano & Robbett, Andrea & Spitzer, Matthew, 2019. "A taste for taxes: Minimizing distortions using political preferences," Journal of Public Economics, Elsevier, vol. 180(C).
    18. Holgersen, Henning & Jia, Zhiyang & Svenkerud, Simen, 2021. "Who and how many can work from home? Evidence from task descriptions," Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 55, pages 1-4.
    19. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    20. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:rvmgts:v:17:y:2023:i:8:d:10.1007_s11846-022-00602-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.