
Prolific.ac—A subject pool for online experiments

Author

Listed:
  • Palan, Stefan
  • Schitter, Christian

Abstract

The number of online experiments conducted with subjects recruited via online platforms has grown considerably in the recent past. While one commercial crowdworking platform – Amazon’s Mechanical Turk – essentially established this field and has since dominated it, new alternatives offer services explicitly targeted at researchers. In this article, we present www.prolific.ac and lay out its suitability for recruiting subjects for social and economic science experiments. After briefly discussing key advantages and challenges of online experiments relative to lab experiments, we trace the platform’s historical development, present its features, and contrast them with the requirements of different types of social and economic experiments.

Suggested Citation

  • Palan, Stefan & Schitter, Christian, 2018. "Prolific.ac—A subject pool for online experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 17(C), pages 22-27.
  • Handle: RePEc:eee:beexfi:v:17:y:2018:i:c:p:22-27
    DOI: 10.1016/j.jbef.2017.12.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S2214635017300989
    Download Restriction: no

    File URL: https://libkey.io/10.1016/j.jbef.2017.12.004?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Benndorf, Volker & Moellers, Claudia & Normann, Hans-Theo, 2017. "Experienced vs. inexperienced participants in the lab: Do they behave differently?," DICE Discussion Papers 251, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).
    2. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    3. Landers, Richard N. & Behrend, Tara S., 2015. "An Inconvenient Truth: Arbitrary Distinctions Between Organizational, Mechanical Turk, and Other Convenience Samples," Industrial and Organizational Psychology, Cambridge University Press, vol. 8(2), pages 142-164, June.
    4. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    5. Urs Fischbacher & Franziska Föllmi-Heusi, 2013. "Lies In Disguise—An Experimental Study On Cheating," Journal of the European Economic Association, European Economic Association, vol. 11(3), pages 525-547, June.
    6. Johannes Abeler & Armin Falk & Lorenz Goette & David Huffman, 2011. "Reference Points and Effort Provision," American Economic Review, American Economic Association, vol. 101(2), pages 470-492, April.
    7. Ofra Amir & David G Rand & Ya'akov Kobi Gal, 2012. "Economic Games on the Internet: The Effect of $1 Stakes," PLOS ONE, Public Library of Science, vol. 7(2), pages 1-4, February.
    8. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, "undated". "The Average Laboratory Samples a Population of 7,300 Amazon Mechanical Turk Workers," Mathematica Policy Research Reports f97b669c7b3e4c2ab95c9f805, Mathematica Policy Research.
    9. Ben Greiner, 2015. "Subject pool recruitment procedures: organizing experiments with ORSEE," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 1(1), pages 114-125, July.
    10. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, 2015. "The average laboratory samples a population of 7,300 Amazon Mechanical Turk workers," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 10(5), pages 479-491, September.
    11. Matthew J C Crump & John V McDonnell & Todd M Gureckis, 2013. "Evaluating Amazon's Mechanical Turk as a Tool for Experimental Behavioral Research," PLOS ONE, Public Library of Science, vol. 8(3), pages 1-18, March.
    12. Gabriele Paolacci & Jesse Chandler & Panagiotis G. Ipeirotis, 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 5(5), pages 411-419, August.
    13. Volker Benndorf & Claudia Moellers & Hans-Theo Normann, 2017. "Experienced vs. inexperienced participants in the lab: do they behave differently?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 3(1), pages 12-25, July.
    14. Marreiros, Helia & Tonin, Mirco & Vlassopoulos, Michael & Schraefel, M.C., 2017. "“Now that you mention it”: A survey experiment on information, inattention and online privacy," Journal of Economic Behavior & Organization, Elsevier, vol. 140(C), pages 1-17.
    15. Siddharth Suri & Duncan J Watts, 2011. "Cooperation and Contagion in Web-Based, Networked Public Goods Experiments," PLOS ONE, Public Library of Science, vol. 6(3), pages 1-18, March.
    16. Cooper, David J., 2014. "A Note on Deception in Economic Experiments," Journal of Wine Economics, Cambridge University Press, vol. 9(02), pages 111-114, August.
    17. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    2. Hyndman, Kyle & Walker, Matthew J., 2022. "Fairness and risk in ultimatum bargaining," Games and Economic Behavior, Elsevier, vol. 132(C), pages 90-105.
    3. Capraro, Valerio & Schulz, Jonathan & Rand, David G., 2019. "Time pressure and honesty in a deception game," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 79(C), pages 93-99.
    4. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
    5. Benndorf, Volker & Rau, Holger A. & Sölch, Christian, 2019. "Minimizing learning in repeated real-effort tasks," Journal of Behavioral and Experimental Finance, Elsevier, vol. 22(C), pages 239-248.
    6. Marcus Giamattei & Kyanoush Seyed Yahosseini & Simon Gächter & Lucas Molleman, 2020. "LIONESS Lab: a free web-based platform for conducting interactive experiments online," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(1), pages 95-111, June.
    7. Irene Maria Buso & Daniela Di Cagno & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2021. "Lab-like findings from online experiments," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 184-193, December.
    8. Yamada, Katsunori & Sato, Masayuki, 2013. "Another avenue for anatomy of income comparisons: Evidence from hypothetical choice experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 89(C), pages 35-57.
    9. Brañas-Garza, Pablo & Capraro, Valerio & Rascón-Ramírez, Ericka, 2018. "Gender differences in altruism on Mechanical Turk: Expectations and actual behaviour," Economics Letters, Elsevier, vol. 170(C), pages 19-23.
    10. Van Borm, Hannah & Burn, Ian & Baert, Stijn, 2021. "What Does a Job Candidate's Age Signal to Employers?," Labour Economics, Elsevier, vol. 71(C).
    11. Jillian J Jordan & David G Rand & Samuel Arbesman & James H Fowler & Nicholas A Christakis, 2013. "Contagion of Cooperation in Static and Fluid Social Networks," PLOS ONE, Public Library of Science, vol. 8(6), pages 1-10, June.
    12. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    13. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    14. Chapkovski, Philipp, 2022. "Interactive Experiments in Toloka," EconStor Preprints 249771, ZBW - Leibniz Information Centre for Economics.
    15. Garbarino, Ellen & Slonim, Robert & Villeval, Marie Claire, 2019. "Loss aversion and lying behavior," Journal of Economic Behavior & Organization, Elsevier, vol. 158(C), pages 379-393.
    16. Eriksen, Kristoffer W. & Fest, Sebastian & Kvaløy, Ola & Dijk, Oege, 2022. "Fair advice," Journal of Banking & Finance, Elsevier, vol. 143(C).
    17. Chapkovski, Philipp, 2022. "Interactive experiments in Toloka," MPRA Paper 111980, University Library of Munich, Germany.
    18. Manapat, Michael L. & Nowak, Martin A. & Rand, David G., 2013. "Information, irrationality, and the evolution of trust," Journal of Economic Behavior & Organization, Elsevier, vol. 90(S), pages 57-75.
    19. Burdea, Valeria & Woon, Jonathan, 2022. "Online belief elicitation methods," Journal of Economic Psychology, Elsevier, vol. 90(C).
    20. Bortolotti, Stefania & Kölle, Felix & Wenner, Lukas, 2022. "On the persistence of dishonesty," Journal of Economic Behavior & Organization, Elsevier, vol. 200(C), pages 1053-1065.
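
    The relatedness list above is built from overlap in the works that items cite and are cited by. As a purely illustrative sketch of such an overlap score – not the actual algorithm used by IDEAS/CitEc, and with hypothetical handles and reference sets – one could rank candidate items by the Jaccard similarity of their reference lists:

    ```python
    # Illustrative sketch only: ranks candidate items by overlap of cited works,
    # mirroring the idea behind the "most related items" list above.
    # Handles and reference sets below are hypothetical.

    def jaccard(a: set, b: set) -> float:
        """Share of works that two reference lists have in common."""
        union = a | b
        return len(a & b) / len(union) if union else 0.0

    # Works cited by the item on this page (hypothetical handles).
    this_item_refs = {"RePEc:exp:onlinelab", "RePEc:jdm:paolacci2010", "RePEc:eee:otree2016"}

    # Candidate items and the works they cite (hypothetical).
    candidates = {
        "RePEc:exp:arechar2018": {"RePEc:exp:onlinelab", "RePEc:jdm:paolacci2010"},
        "RePEc:xxx:unrelated":   {"RePEc:other:a", "RePEc:other:b"},
    }

    # Rank candidates by overlap with this item's reference list.
    ranked = sorted(candidates.items(),
                    key=lambda kv: jaccard(this_item_refs, kv[1]),
                    reverse=True)

    for handle, refs in ranked:
        print(handle, round(jaccard(this_item_refs, refs), 2))
    ```

    In practice a relatedness measure of this kind would also weight items that cite the same citing works, but the ranking idea is the same.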

    More about this item

    Keywords

    Prolific; Online experiment; Subject pool;

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:beexfi:v:17:y:2018:i:c:p:22-27. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: https://www.journals.elsevier.com/journal-of-behavioral-and-experimental-finance .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.