The CrowdWater game: A playful way to improve the accuracy of crowdsourced water level class data

Authors

  • Barbara Strobl
  • Simon Etter
  • Ilja van Meerveld
  • Jan Seibert

Abstract

Data quality control is important for any data collection program, and especially for citizen science projects, where errors due to the human factor are more likely. Ideally, data quality control in citizen science projects is itself crowdsourced so that it can handle large amounts of data. Here we present the CrowdWater game, a gamified method to check the crowdsourced water level class data that citizen scientists submit through the CrowdWater app. The app uses a virtual staff gauge approach: a digital scale is added to the first picture taken at a site, and this scale is then used for water level class observations at different times. In the game, participants classify water levels by comparing a new picture with the picture containing the virtual staff gauge. By March 2019, 153 people had played the CrowdWater game and 841 pictures had been classified. For each classified picture, the average water level class from the game votes was compared to the class submitted through the app to determine whether the game can improve the quality of the data submitted through the app. For about 70% of the classified pictures, the water level class was the same in the CrowdWater app and the game. For a quarter of the classified pictures, the value submitted through the app and the average game vote disagreed; expert judgement suggests that the game-based average value was correct in three quarters of these cases. These initial results indicate that the CrowdWater game helps to identify erroneous water level class observations from the CrowdWater app and provides a useful approach for crowdsourced data quality control. This study thus demonstrates the potential of gamified approaches for data quality control in citizen science projects.
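
To illustrate the quality-control comparison the abstract describes, here is a minimal sketch in Python. It is not code from the paper: the function name, the data layout, and the rule of rounding the average game vote to the nearest class are assumptions made for illustration only.

    from statistics import mean

    def flag_disagreements(observations):
        """Flag pictures where the app-submitted water level class differs
        from the rounded average of the classes voted in the game.

        `observations` maps a picture ID to the class submitted through the
        CrowdWater app (`app_class`) and the list of game votes
        (`game_votes`); classes are integers on the virtual staff gauge
        scale. This layout is a hypothetical stand-in for the project's
        actual data model.
        """
        flagged = {}
        for picture_id, obs in observations.items():
            if not obs["game_votes"]:
                continue  # no game votes for this picture yet
            average_vote = mean(obs["game_votes"])
            # Round the average vote to the nearest class before comparing.
            if round(average_vote) != obs["app_class"]:
                flagged[picture_id] = {"app": obs["app_class"],
                                       "game_average": average_vote}
        return flagged

    # Hypothetical example: the first picture agrees, the second is flagged.
    example = {
        "site42_pic1": {"app_class": 3, "game_votes": [3, 3, 4, 3]},
        "site42_pic2": {"app_class": 5, "game_votes": [2, 3, 3, 2]},
    }
    print(flag_disagreements(example))
    # {'site42_pic2': {'app': 5, 'game_average': 2.5}}

One caveat of this sketch: Python's round() resolves .5 ties toward the even class (banker's rounding), and the abstract does not specify how the study resolved ties between adjacent classes.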

Suggested Citation

  • Barbara Strobl & Simon Etter & Ilja van Meerveld & Jan Seibert, 2019. "The CrowdWater game: A playful way to improve the accuracy of crowdsourced water level class data," PLOS ONE, Public Library of Science, vol. 14(9), pages 1-23, September.
  • Handle: RePEc:plo:pone00:0222579
    DOI: 10.1371/journal.pone.0222579

Download full text from publisher

File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0222579
Download Restriction: no

File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0222579&type=printable
Download Restriction: no



Most related items

These are the items that most often cite the same works as this one and are cited by the same works as this one.

1. Matthew Staffelbach & Peter Sempolinski & Tracy Kijewski-Correa & Douglas Thain & Daniel Wei & Ahsan Kareem & Gregory Madey, 2015. "Lessons Learned from Crowdsourcing Complex Engineering Tasks," PLOS ONE, Public Library of Science, vol. 10(9), pages 1-19, September.
2. Naihui Zhou & Zachary D Siegel & Scott Zarecor & Nigel Lee & Darwin A Campbell & Carson M Andorf & Dan Nettleton & Carolyn J Lawrence-Dill & Baskar Ganapathysubramanian & Jonathan W Kelly & Iddo Fried, 2018. "Crowdsourcing image analysis for plant phenomics to generate ground truth data for machine learning," PLOS Computational Biology, Public Library of Science, vol. 14(7), pages 1-16, July.
3. Li, Xiaoou & Chen, Yunxiao & Chen, Xi & Liu, Jingchen & Ying, Zhiliang, 2021. "Optimal stopping and worker selection in crowdsourcing: an adaptive sequential probability ratio test framework," LSE Research Online Documents on Economics 100873, London School of Economics and Political Science, LSE Library.
4. Juste Raimbault & Clémentine Cottineau & Marion Le Texier & Florent Le Nechet & Romain Reuillon, 2019. "Space Matters: Extending Sensitivity Analysis to Initial Spatial Conditions in Geosimulation Models," Journal of Artificial Societies and Social Simulation, vol. 22(4), pages 1-10.
5. Konstantinos Mitsakakis & Sebastian Hin & Pie Müller & Nadja Wipf & Edward Thomsen & Michael Coleman & Roland Zengerle & John Vontas & Konstantinos Mavridis, 2018. "Converging Human and Malaria Vector Diagnostics with Data Management towards an Integrated Holistic One Health Approach," IJERPH, MDPI, vol. 15(2), pages 1-26, February.
6. Andrei P. Kirilenko & Travis Desell & Hany Kim & Svetlana Stepchenkova, 2017. "Crowdsourcing Analysis of Twitter Data on Climate Change: Paid Workers vs. Volunteers," Sustainability, MDPI, vol. 9(11), pages 1-15, November.
