Printed from https://ideas.repec.org/a/plo/pone00/0120521.html

Participation and Contribution in Crowdsourced Surveys

Authors
  • Robert Swain
  • Alex Berger
  • Josh Bongard
  • Paul Hines

Abstract

This paper identifies trends within, and relationships between, the amount of participation and the quality of contributions in three crowdsourced surveys. Participants were asked to perform a collective problem-solving task that lacked any explicit incentive: they were instructed not only to respond to survey questions but also to pose new questions that they thought might, if responded to by others, predict an outcome variable of interest to them. While the three surveys had very different outcome variables, target audiences, methods of advertisement, and lengths of deployment, we found very similar patterns of collective behavior. In particular, we found that: the rate at which participants submitted new survey questions followed a heavy-tailed distribution; the distribution of the types of questions posed was similar; and many users posed non-obvious yet predictive questions. By analyzing responses to questions that contained a built-in range of valid responses, we found that less than 0.2% of responses lay outside of those ranges, indicating that most participants tend to respond honestly to surveys of this form, even without explicit incentives for honesty. While we did not find a significant relationship between the quantity of participation and the quality of contribution for either response submissions or question submissions, we did find several other, more nuanced participant behavior patterns, which did correlate with contribution in one of the three surveys. We conclude that there exists an optimal time for users to pose questions early in their participation, but only after they have submitted a few responses to other questions. This suggests that future crowdsourced surveys may attract more predictive questions by prompting users to pose new questions at specific times during their participation and limiting question submission at non-optimal times.
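The validity check described in the abstract (counting responses that fall outside a question's built-in valid range) can be sketched as follows. This is a minimal illustration, not the authors' code: the question IDs, ranges, and responses below are invented for the example.

```python
def out_of_range_fraction(responses, valid_ranges):
    """Fraction of responses falling outside their question's valid range.

    responses: list of (question_id, value) pairs
    valid_ranges: dict mapping question_id -> (lo, hi) inclusive bounds
    Responses to questions without a declared range are skipped.
    """
    checked = [(qid, v) for qid, v in responses if qid in valid_ranges]
    if not checked:
        return 0.0
    bad = sum(
        1 for qid, v in checked
        if not (valid_ranges[qid][0] <= v <= valid_ranges[qid][1])
    )
    return bad / len(checked)


# Made-up example data: one of four responses ("age" = 150) is invalid.
ranges = {"age": (0, 120), "sleep_hours": (0, 24)}
resps = [("age", 34), ("age", 150), ("sleep_hours", 7), ("sleep_hours", 8)]
print(out_of_range_fraction(resps, ranges))  # 0.25
```

In the paper's surveys this fraction came out below 0.002 (0.2%), which the authors read as evidence of largely honest responding.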

Suggested Citation

  • Robert Swain & Alex Berger & Josh Bongard & Paul Hines, 2015. "Participation and Contribution in Crowdsourced Surveys," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-21, April.
  • Handle: RePEc:plo:pone00:0120521
    DOI: 10.1371/journal.pone.0120521

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0120521
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0120521&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0120521?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Christoph Safferling & Aaron Lowen, 2011. "Economics in the Kingdom of Loathing: Analysis of Virtual Market Data," Working Paper Series of the Department of Economics, University of Konstanz 2011-30, Department of Economics, University of Konstanz.
    2. Prpić, John & Shukla, Prashant P. & Kietzmann, Jan H. & McCarthy, Ian P., 2015. "How to work a crowd: Developing crowd capital through crowdsourcing," Business Horizons, Elsevier, vol. 58(1), pages 77-85.
    3. Kovacs, Attila, 2018. "Gender Differences in Equity Crowdfunding," OSF Preprints 5pcmb, Center for Open Science.
    4. Naihui Zhou & Zachary D Siegel & Scott Zarecor & Nigel Lee & Darwin A Campbell & Carson M Andorf & Dan Nettleton & Carolyn J Lawrence-Dill & Baskar Ganapathysubramanian & Jonathan W Kelly & Iddo Fried, 2018. "Crowdsourcing image analysis for plant phenomics to generate ground truth data for machine learning," PLOS Computational Biology, Public Library of Science, vol. 14(7), pages 1-16, July.
    5. Spartaco Albertarelli & Piero Fraternali & Sergio Herrera & Mark Melenhorst & Jasminko Novak & Chiara Pasini & Andrea-Emilio Rizzoli & Cristina Rottondi, 2018. "A Survey on the Design of Gamified Systems for Energy and Water Sustainability," Games, MDPI, vol. 9(3), pages 1-34, June.
    6. Franzoni, Chiara & Sauermann, Henry, 2014. "Crowd science: The organization of scientific research in open collaborative projects," Research Policy, Elsevier, vol. 43(1), pages 1-20.
    7. Sam Mavandadi & Stoyan Dimitrov & Steve Feng & Frank Yu & Uzair Sikora & Oguzhan Yaglidere & Swati Padmanabhan & Karin Nielsen & Aydogan Ozcan, 2012. "Distributed Medical Image Analysis and Diagnosis through Crowd-Sourced Games: A Malaria Case Study," PLOS ONE, Public Library of Science, vol. 7(5), pages 1-8, May.
    8. Sherwani, Y & Ahmed, M & Muntasir, M & El-Hilly, A & Iqbal, S & Siddiqui, S & Al-Fagih, Z & Usmani, O & Eisingerich, AB, 2015. "Examining the role of gamification and use of mHealth apps in the context of smoking cessation: A review of extant knowledge and outlook," Working Papers 25458, Imperial College, London, Imperial College Business School.
    9. Joanna Chataway & Sarah Parks & Elta Smith, 2017. "How Will Open Science Impact on University-Industry Collaboration?," Foresight and STI Governance (Foresight-Russia till No. 3/2015), National Research University Higher School of Economics, vol. 11(2), pages 44-53.
    10. Ayat Abourashed & Laura Doornekamp & Santi Escartin & Constantianus J. M. Koenraadt & Maarten Schrama & Marlies Wagener & Frederic Bartumeus & Eric C. M. van Gorp, 2021. "The Potential Role of School Citizen Science Programs in Infectious Disease Surveillance: A Critical Review," IJERPH, MDPI, vol. 18(13), pages 1-18, June.
    11. Jennifer Lewis Priestley & Robert J. McGrath, 2019. "The Evolution of Data Science: A New Mode of Knowledge Production," International Journal of Knowledge Management (IJKM), IGI Global, vol. 15(2), pages 97-109, April.
    12. Vito D’Orazio & Michael Kenwick & Matthew Lane & Glenn Palmer & David Reitter, 2016. "Crowdsourcing the Measurement of Interstate Conflict," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-21, June.
    13. Yury Kryvasheyeu & Haohui Chen & Esteban Moro & Pascal Van Hentenryck & Manuel Cebrian, 2015. "Performance of Social Network Sensors during Hurricane Sandy," PLOS ONE, Public Library of Science, vol. 10(2), pages 1-19, February.
    14. Prpić, John, 2017. "How To Work A Crowd: Developing Crowd Capital Through Crowdsourcing," SocArXiv jer9k, Center for Open Science.
    15. Siluo Yang & Dietmar Wolfram & Feifei Wang, 2017. "The relationship between the author byline and contribution lists: a comparison of three general medical journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1273-1296, March.
    16. Maryam Lotfian & Jens Ingensand & Maria Antonia Brovelli, 2021. "The Partnership of Citizen Science and Machine Learning: Benefits, Risks, and Future Challenges for Engagement, Data Collection, and Data Quality," Sustainability, MDPI, vol. 13(14), pages 1-19, July.
    17. Jonathan R Karr & Alex H Williams & Jeremy D Zucker & Andreas Raue & Bernhard Steiert & Jens Timmer & Clemens Kreutz & DREAM8 Parameter Estimation Challenge Consortium & Simon Wilkinson & Brandon A Al, 2015. "Summary of the DREAM8 Parameter Estimation Challenge: Toward Parameter Identification for Whole-Cell Models," PLOS Computational Biology, Public Library of Science, vol. 11(5), pages 1-21, May.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0120521. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.