Printed from https://ideas.repec.org/a/eee/epplan/v54y2016icp63-73.html

Crowdsourcing for quantifying transcripts: An exploratory study

Author

Listed:
  • Azzam, Tarek
  • Harman, Elena

Abstract

This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing harnesses the abilities of many people to complete a specific task or set of tasks. In this study, multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts and consistently selected the same supporting text from them. These findings suggest that crowdsourcing, with further development, could be used as a mixed-methods tool offering a supplemental perspective on transcribed interviews.
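
The reliability check the abstract describes, whether independent crowdsourced samples converge on the same ratings and the same supporting quotes, can be made concrete with a small computation. The sketch below is not from the article; it is a hypothetical Python illustration assuming each sample rates the same transcript items on a numeric scale and marks sentence indices as supporting quotes, comparing samples via the correlation of mean item ratings and the Jaccard overlap of selected quotes.

from statistics import mean

def mean_ratings(ratings_by_rater):
    # Average each transcript item's ratings across the raters in one crowdsourced sample.
    n_items = len(ratings_by_rater[0])
    return [mean(r[i] for r in ratings_by_rater) for i in range(n_items)]

def pearson(xs, ys):
    # Pearson correlation between two samples' mean item ratings.
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def quote_overlap(quotes_a, quotes_b):
    # Jaccard overlap between the sets of sentence indices each sample selected as quotes.
    a, b = set(quotes_a), set(quotes_b)
    return len(a & b) / len(a | b)

# Hypothetical data: two crowdsourced samples of three raters each, scoring four items on a 1-5 scale.
sample_a = [[4, 5, 2, 3], [5, 5, 1, 3], [4, 4, 2, 4]]
sample_b = [[4, 5, 2, 4], [5, 4, 2, 3], [5, 5, 1, 3]]

print(pearson(mean_ratings(sample_a), mean_ratings(sample_b)))   # rating consistency across samples
print(quote_overlap({2, 7, 11}, {2, 7, 12}))                     # quote-selection consistency

High rating correlation and high quote overlap across independent samples would correspond to the cross-sample consistency the authors report.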

Suggested Citation

  • Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
  • Handle: RePEc:eee:epplan:v:54:y:2016:i:c:p:63-73
    DOI: 10.1016/j.evalprogplan.2015.09.002

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718915001044
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2015.09.002?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. repec:cup:judgdm:v:5:y:2010:i:5:p:411-419 is not listed on IDEAS
    3. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Harman, Elena & Azzam, Tarek, 2018. "Incorporating public values into evaluative criteria: Using crowdsourcing to identify criteria and standards," Evaluation and Program Planning, Elsevier, vol. 71(C), pages 68-82.
    2. Harman, Elena & Azzam, Tarek, 2018. "Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 183-194.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Masha Shunko & Julie Niederhoff & Yaroslav Rosokha, 2018. "Humans Are Not Machines: The Behavioral Impact of Queueing Design on Service Time," Management Science, INFORMS, vol. 64(1), pages 453-473, January.
    2. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," IZA Discussion Papers 15478, Institute of Labor Economics (IZA).
    3. Jeanette A.M.J. Deetlefs & Mathew Chylinski & Andreas Ortmann, 2015. "MTurk ‘Unscrubbed’: Exploring the good, the ‘Super’, and the unreliable on Amazon’s Mechanical Turk," Discussion Papers 2015-20, School of Economics, The University of New South Wales.
    4. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    5. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    6. Atalay, Kadir & Bakhtiar, Fayzan & Cheung, Stephen & Slonim, Robert, 2014. "Savings and prize-linked savings accounts," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 86-106.
    7. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    8. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    9. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    10. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    11. Valerio Capraro & Hélène Barcelo, 2021. "Punishing defectors and rewarding cooperators: Do people discriminate between genders?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(1), pages 19-32, September.
    12. Lefgren, Lars J. & Sims, David P. & Stoddard, Olga B., 2016. "Effort, luck, and voting for redistribution," Journal of Public Economics, Elsevier, vol. 143(C), pages 89-97.
    13. Jochen Becker & Josip Medjedovic & Christoph Merkle, 2019. "The Effect of CEO Extraversion on Analyst Forecasts: Stereotypes and Similarity Bias," The Financial Review, Eastern Finance Association, vol. 54(1), pages 133-164, February.
    14. Bidhan L. Parmar & Adrian Keevil & Andrew C. Wicks, 2019. "People and Profits: The Impact of Corporate Objectives on Employees’ Need Satisfaction at Work," Journal of Business Ethics, Springer, vol. 154(1), pages 13-33, January.
    15. Alexsandros Cavgias & Raphael Corbi & Luis Meloni & Lucas M. Novaes, 2019. "EDITED DEMOCRACY: Media Manipulation and the News Coverage of Presidential Debates," Working Papers, Department of Economics 2019_17, University of São Paulo (FEA-USP).
    16. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    17. Ola Andersson & Jim Ingebretsen Carlson & Erik Wengström, 2021. "Differences Attract: An Experimental Study of Focusing in Economic Choice," The Economic Journal, Royal Economic Society, vol. 131(639), pages 2671-2692.
    18. Tim Straub & Henner Gimpel & Florian Teschner & Christof Weinhardt, 2015. "How (not) to Incent Crowd Workers," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 57(3), pages 167-179, June.
    19. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
    20. Chandler, Dana & Kapelner, Adam, 2013. "Breaking monotony with meaning: Motivation in crowdsourcing markets," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 123-133.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:54:y:2016:i:c:p:63-73. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.