
Salience Bias in Crowdsourcing Contests

Authors

Listed:
  • Ho Cheung Brian Lee

    (Manning School of Business, University of Massachusetts Lowell, Lowell, Massachusetts 01854)

  • Sulin Ba

    (School of Business, University of Connecticut, Storrs, Connecticut 06269)

  • Xinxin Li

    (School of Business, University of Connecticut, Storrs, Connecticut 06269)

  • Jan Stallaert

    (School of Business, University of Connecticut, Storrs, Connecticut 06269)

Abstract

Crowdsourcing relies on online platforms to connect a community of users to perform specific tasks. However, without appropriate control, the behavior of the online community might not align with the platform’s designed objective, which can lead to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform and the systematic bias of crowdsourcing workers can affect crowdsourcing outcomes. Specifically, using archival data from the online crowdsourcing platform Kaggle, combined with survey data from actual Kaggle contest participants, we examine the role of a systematic bias, namely the salience bias, in influencing the performance of crowdsourcing workers, and how the number of crowdsourcing workers moderates the impact of the salience bias on contest outcomes. Our results suggest that the salience bias influences the performance of contestants, including contest winners. Furthermore, the number of participating contestants may attenuate or amplify the impact of the salience bias on contest outcomes, depending on the effort required to complete the tasks. Our results have critical implications for crowdsourcing firms and platform designers. The online appendix is available at https://doi.org/10.1287/isre.2018.0775.
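The moderation the abstract describes corresponds to an interaction effect in a regression. The sketch below is a minimal, hypothetical illustration, not the authors' actual specification: the variable names (salient_feedback, n_contestants, score) and the simulated data are assumptions, used only to show how the coefficient on an interaction term captures whether crowd size attenuates or amplifies a salience effect.

```python
# Hypothetical illustration of a moderated (interaction) effect.
# All variable names and data are invented placeholders, not the
# paper's actual variables or estimates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Simulated contest-level data:
#   salient_feedback: 1 if some feedback signal was made salient
#   n_contestants:    number of participants in the contest
df = pd.DataFrame({
    "salient_feedback": rng.integers(0, 2, n),
    "n_contestants": rng.integers(10, 500, n),
})
# Performance outcome with a salience effect that weakens as the
# crowd grows (a negative interaction), plus noise.
df["score"] = (
    0.5 * df["salient_feedback"]
    - 0.001 * df["salient_feedback"] * df["n_contestants"]
    + rng.normal(0, 1, n)
)

# `a * b` in the formula expands to main effects plus the interaction.
# The sign of the salient_feedback:n_contestants coefficient tells
# whether crowd size attenuates (negative) or amplifies (positive)
# the salience effect.
model = smf.ols("score ~ salient_feedback * n_contestants", data=df).fit()
print(model.summary())
```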

Suggested Citation

  • Ho Cheung Brian Lee & Sulin Ba & Xinxin Li & Jan Stallaert, 2018. "Salience Bias in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 29(2), pages 401-418, June.
  • Handle: RePEc:inm:orisre:v:29:y:2018:i:2:p:401-418
    DOI: 10.1287/isre.2018.0775

    Download full text from publisher

    File URL: https://doi.org/10.1287/isre.2018.0775
    Download Restriction: no

    File URL: https://libkey.io/10.1287/isre.2018.0775?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kevin J. Boudreau & Karim R. Lakhani & Michael Menietti, 2016. "Performance responses to competition across skill levels in rank-order tournaments: field evidence and implications for tournament design," RAND Journal of Economics, RAND Corporation, vol. 47(1), pages 140-165, February.
    2. John A. List, 2011. "Does Market Experience Eliminate Market Anomalies? The Case of Exogenous Market Experience," American Economic Review, American Economic Association, vol. 101(3), pages 313-317, May.
    3. Jean‐Charles Rochet & Jean Tirole, 2006. "Two‐sided markets: a progress report," RAND Journal of Economics, RAND Corporation, vol. 37(3), pages 645-667, September.
    4. Youngjin Yoo & Richard J. Boland & Kalle Lyytinen & Ann Majchrzak, 2012. "Organizing for Innovation in the Digitized World," Organization Science, INFORMS, vol. 23(5), pages 1398-1408, October.
    5. Nuno Camacho & Bas Donkers & Stefan Stremersch, 2011. "Predictably Non-Bayesian: Quantifying Salience Effects in Physician Learning About Drug Quality," Marketing Science, INFORMS, vol. 30(2), pages 305-320, March-April.
    6. Geoffrey G. Parker & Marshall W. Van Alstyne, 2005. "Two-Sided Network Effects: A Theory of Information Product Design," Management Science, INFORMS, vol. 51(10), pages 1494-1504, October.
    7. Chrysanthos Dellarocas, 2003. "The Digitization of Word of Mouth: Promise and Challenges of Online Feedback Mechanisms," Management Science, INFORMS, vol. 49(10), pages 1407-1424, October.
    8. Mingfeng Lin & Siva Viswanathan, 2016. "Home Bias in Online Investments: An Empirical Study of an Online Crowdfunding Market," Management Science, INFORMS, vol. 62(5), pages 1393-1414, May.
    9. Christian Terwiesch & Yi Xu, 2008. "Innovation Contests, Open Innovation, and Multiagent Problem Solving," Management Science, INFORMS, vol. 54(9), pages 1529-1543, September.
    10. Devin G. Pope & Maurice E. Schweitzer, 2011. "Is Tiger Woods Loss Averse? Persistent Bias in the Face of Experience, Competition, and High Stakes," American Economic Review, American Economic Association, vol. 101(1), pages 129-157, February.
    11. Ivo Blohm & Christoph Riedl & Johann Füller & Jan Marco Leimeister, 2016. "Rate or Trade? Identifying Winning Ideas in Open Idea Sourcing," Information Systems Research, INFORMS, vol. 27(1), pages 27-48, March.
    12. Yan Huang & Param Vir Singh & Kannan Srinivasan, 2014. "Crowdsourcing New Product Ideas Under Consumer Learning," Management Science, INFORMS, vol. 60(9), pages 2138-2159, September.
    13. John A. List, 2003. "Does Market Experience Eliminate Market Anomalies?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 118(1), pages 41-71.
    14. Kevin J. Boudreau & Nicola Lacetera & Karim R. Lakhani, 2011. "Incentives and Problem Uncertainty in Innovation Contests: An Empirical Analysis," Management Science, INFORMS, vol. 57(5), pages 843-863, May.
    15. Lars Bo Jeppesen & Karim R. Lakhani, 2010. "Marginality and Problem-Solving Effectiveness in Broadcast Search," Organization Science, INFORMS, vol. 21(5), pages 1016-1033, October.
    16. Jonathan E. Alevy & Michael S. Haigh & John A. List, 2007. "Information Cascades: Evidence from a Field Experiment with Financial Market Professionals," Journal of Finance, American Finance Association, vol. 62(1), pages 151-180, February.
    17. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2016. "Heterogeneous Submission Behavior and its Implications for Success in Innovation Contests with Public Submissions," Production and Operations Management, Production and Operations Management Society, vol. 25(7), pages 1157-1176, July.
    18. Sulin Ba & Jan Stallaert & Andrew B. Whinston, 2001. "Research Commentary: Introducing a Third Dimension in Information Systems Design—The Case for Incentive Alignment," Information Systems Research, INFORMS, vol. 12(3), pages 225-239, September.
    19. Tracy Xiao Liu & Jiang Yang & Lada A. Adamic & Yan Chen, 2014. "Crowdsourcing with All-Pay Auctions: A Field Experiment on Taskcn," Management Science, INFORMS, vol. 60(8), pages 2020-2037, August.
    20. Katz, Michael L & Shapiro, Carl, 1985. "Network Externalities, Competition, and Compatibility," American Economic Review, American Economic Association, vol. 75(3), pages 424-440, June.
    21. Jonah Berger & Devin Pope, 2011. "Can Losing Lead to Winning?," Management Science, INFORMS, vol. 57(5), pages 817-827, May.
    22. Dellarocas, Chrysanthos, 2003. "The Digitization of Word-of-mouth: Promise and Challenges of Online Feedback Mechanisms," Working papers 4296-03, Massachusetts Institute of Technology (MIT), Sloan School of Management.
    23. Joel O. Wooten & Karl T. Ulrich, 2017. "Idea Generation and the Role of Feedback: Evidence from Field Experiments with Innovation Tournaments," Production and Operations Management, Production and Operations Management Society, vol. 26(1), pages 80-99, January.
    24. Verena Tiefenbeck & Lorenz Goette & Kathrin Degen & Vojkan Tasic & Elgar Fleisch & Rafael Lalive & Thorsten Staake, 2018. "Overcoming Salience Bias: How Real-Time Feedback Fosters Resource Conservation," Management Science, INFORMS, vol. 64(3), pages 1458-1476, March.
    25. Barry L. Bayus, 2013. "Crowdsourcing New Product Ideas over Time: An Analysis of the Dell IdeaStorm Community," Management Science, INFORMS, vol. 59(1), pages 226-244, January.
    26. Nils Rudi & David Drake, 2014. "Observation Bias: The Impact of Demand Censoring on Newsvendor Level and Adjustment Behavior," Management Science, INFORMS, vol. 60(5), pages 1334-1345, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yuan Jin & Ho Cheung Brian Lee & Sulin Ba & Jan Stallaert, 2021. "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 32(3), pages 836-859, September.
    2. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2022. "Incentives and Stars: Competition in Innovation Contests with Participant and Submission Visibility," Production and Operations Management, Production and Operations Management Society, vol. 31(3), pages 1372-1393, March.
    3. Tat Koon Koh & Muller Y. M. Cheung, 2022. "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 33(1), pages 265-284, March.
    4. Madanaguli, Arun & Dhir, Amandeep & Talwar, Shalini & Clauss, Thomas & Kraus, Sascha & Kaur, Puneet, 2023. "Diving into the uncertainties of open innovation: A systematic review of risks to uncover pertinent typologies and unexplored horizons," Technovation, Elsevier, vol. 119(C).
    5. Xu, Yanjing & Zhu, Jianming & Mou, Jian, 2021. "Factors influencing bid-winning performance in mixed crowdsourcing: The persuasive effect of credible information sources," Technology in Society, Elsevier, vol. 65(C).
    6. Panos Constantinides & Ola Henfridsson & Geoffrey G. Parker, 2018. "Introduction—Platforms and Infrastructures in the Digital Age," Information Systems Research, INFORMS, vol. 29(2), pages 381-400, June.
    7. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hu, Feng & Bijmolt, Tammo H.A. & Huizingh, Eelko K.R.E., 2020. "The impact of innovation contest briefs on the quality of solvers and solutions," Technovation, Elsevier, vol. 90.
    2. Ying-Ju Chen & Tinglong Dai & C. Gizem Korpeoglu & Ersin Körpeoğlu & Ozge Sahin & Christopher S. Tang & Shihong Xiao, 2020. "OM Forum—Innovative Online Platforms: Research Opportunities," Manufacturing & Service Operations Management, INFORMS, vol. 22(3), pages 430-445, May.
    3. Patel, Chirag & Ahmad Husairi, Mariyani & Haon, Christophe & Oberoi, Poonam, 2023. "Monetary rewards and self-selection in design crowdsourcing contests: Managing participation, contribution appropriateness, and winning trade-offs," Technological Forecasting and Social Change, Elsevier, vol. 191(C).
    4. Jiao, Yuanyuan & Wu, Yepeng & Lu, Steven, 2021. "The role of crowdsourcing in product design: The moderating effect of user expertise and network connectivity," Technology in Society, Elsevier, vol. 64(C).
    5. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.
    6. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2022. "Incentives and Stars: Competition in Innovation Contests with Participant and Submission Visibility," Production and Operations Management, Production and Operations Management Society, vol. 31(3), pages 1372-1393, March.
    7. Yuan Jin & Ho Cheung Brian Lee & Sulin Ba & Jan Stallaert, 2021. "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 32(3), pages 836-859, September.
    8. Christoph Riedl & Victor P. Seidel, 2018. "Learning from Mixed Signals in Online Innovation Communities," Organization Science, INFORMS, vol. 29(6), pages 1010-1032, December.
    9. Shunyuan Zhang & Param Vir Singh & Anindya Ghose, 2019. "A Structural Analysis of the Role of Superstars in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 30(1), pages 15-33, March.
    10. Cheng, Xi & Gou, Qinglong & Yue, Jinfeng & Zhang, Yan, 2019. "Equilibrium decisions for an innovation crowdsourcing platform," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 125(C), pages 241-260.
    11. Yan Chen & Peter Cramton & John A. List & Axel Ockenfels, 2021. "Market Design, Human Behavior, and Management," Management Science, INFORMS, vol. 67(9), pages 5317-5348, September.
    12. Dargahi, Rambod & Namin, Aidin & Ketron, Seth C. & Saint Clair, Julian K., 2021. "Is self-knowledge the ultimate prize? A quantitative analysis of participation choice in online ideation crowdsourcing contests," Journal of Retailing and Consumer Services, Elsevier, vol. 62(C).
    13. Nirup Menon & Anant Mishra & Shun Ye, 2020. "Beyond Related Experience: Upstream vs. Downstream Experience in Innovation Contest Platforms with Interdependent Problem Domains," Manufacturing & Service Operations Management, INFORMS, vol. 22(5), pages 1045-1065, September.
    14. Joel O. Wooten, 2022. "Leaps in innovation and the Bannister effect in contests," Production and Operations Management, Production and Operations Management Society, vol. 31(6), pages 2646-2663, June.
    15. Segev, Ella, 2020. "Crowdsourcing contests," European Journal of Operational Research, Elsevier, vol. 281(2), pages 241-255.
    16. Juncai Jiang & Yu Wang, 2020. "A Theoretical and Empirical Investigation of Feedback in Ideation Contests," Production and Operations Management, Production and Operations Management Society, vol. 29(2), pages 481-500, February.
    17. Pollok, Patrick & Lüttgens, Dirk & Piller, Frank T., 2019. "Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status," Research Policy, Elsevier, vol. 48(1), pages 98-114.
    18. Elina H. Hwang & Param Vir Singh & Linda Argote, 2019. "Jack of All, Master of Some: Information Network and Innovation in Crowdsourcing Communities," Information Systems Research, INFORMS, vol. 30(2), pages 389-410, June.
    19. Sulin Ba & Barrie R. Nault, 2017. "Emergent Themes in the Interface Between Economics of Information Systems and Management of Technology," Production and Operations Management, Production and Operations Management Society, vol. 26(4), pages 652-666, April.
    20. Dahlander, Linus & Beretta, Michela & Thomas, Arne & Kazemi, Shahab & Fenger, Morten H.J. & Frederiksen, Lars, 2023. "Weeding out or picking winners in open innovation? Factors driving multi-stage crowd selection on LEGO ideas," Research Policy, Elsevier, vol. 52(10).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orisre:v:29:y:2018:i:2:p:401-418. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.