
Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests

Authors

Listed:
  • Yuan Jin

    (Area of Information Systems and Quantitative Sciences, Rawls College of Business, Texas Tech University, Lubbock, Texas 79409)

  • Ho Cheung Brian Lee

    (Department of Supply Chain and Information Systems, Smeal College of Business, Pennsylvania State University, University Park, Pennsylvania 16802)

  • Sulin Ba

    (Department of Operations and Information Management, School of Business, University of Connecticut, Storrs, Connecticut 06269)

  • Jan Stallaert

    (Department of Operations and Information Management, School of Business, University of Connecticut, Storrs, Connecticut 06269)

Abstract

A crowdsourcing contest connects solution seekers with online users who compete to solve the seeker’s problem by generating innovative ideas. Knowledge sharing during such a contest may play an important role in helping contestants generate high-quality solutions. On the one hand, additional knowledge resources may lower participation costs and improve crowdsourcing performance. On the other hand, shared knowledge may interrupt contestants’ independent solution searches and distract them. This study demonstrates that knowledge sharing affects crowdsourcing contestants’ performance and identifies how different dimensions of shared knowledge shape that effect. The results indicate that offering a knowledge sharing process on the platform does not necessarily improve contestants’ performance. We show that the effectiveness of knowledge sharing depends on the volume, quality, and generativity of the shared knowledge: shared knowledge is beneficial only when it is of high quality or high generativity. In addition, we examine the effects of the breadth and depth of knowledge generativity in the knowledge sharing process and find that greater derivation breadth improves contestants’ performance. The findings show how a crowdsourcing contest platform can use a knowledge sharing feature effectively. The key to making full use of this feature is to ensure that the shared knowledge is of high quality and to encourage more contributions of generative knowledge, especially generative knowledge of great breadth.

Suggested Citation

  • Yuan Jin & Ho Cheung Brian Lee & Sulin Ba & Jan Stallaert, 2021. "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 32(3), pages 836-859, September.
  • Handle: RePEc:inm:orisre:v:32:y:2021:i:3:p:836-859
    DOI: 10.1287/isre.2020.0982

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/isre.2020.0982
    Download Restriction: no

    File URL: https://libkey.io/10.1287/isre.2020.0982?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. Kane, Aimee A. & Argote, Linda & Levine, John M., 2005. "Knowledge transfer between groups via personnel rotation: Effects of social identity and knowledge quality," Organizational Behavior and Human Decision Processes, Elsevier, vol. 96(1), pages 56-71, January.
    2. Garcia Martinez, Marian, 2015. "Solver engagement in knowledge sharing in crowdsourcing communities: Exploring the link to creativity," Research Policy, Elsevier, vol. 44(8), pages 1419-1430.
    3. Morten T. Hansen, 2002. "Knowledge Networks: Explaining Effective Knowledge Sharing in Multiunit Companies," Organization Science, INFORMS, vol. 13(3), pages 232-248, June.
    4. Elina H. Hwang & Param Vir Singh & Linda Argote, 2015. "Knowledge Sharing in Online Communities: Learning to Cross Geographic and Hierarchical Boundaries," Organization Science, INFORMS, vol. 26(6), pages 1593-1611, December.
    5. Jie Lou & Yulin Fang & Kai H. Lim & Jerry Zeyu Peng, 2013. "Contributing high quantity and quality knowledge to online Q&A communities," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 356-371, February.
    6. Kevin J. Boudreau & Karim R. Lakhani & Michael Menietti, 2016. "Performance responses to competition across skill levels in rank-order tournaments: field evidence and implications for tournament design," RAND Journal of Economics, RAND Corporation, vol. 47(1), pages 140-165, February.
    7. David H. Autor, 2003. "Outsourcing at Will: The Contribution of Unjust Dismissal Doctrine to the Growth of Employment Outsourcing," Journal of Labor Economics, University of Chicago Press, vol. 21(1), pages 1-42, January.
    8. Weiyin Hong & James Y. L. Thong & Kar Yan Tam, 2004. "Does Animation Attract Online Users’ Attention? The Effects of Flash on Information Search Performance and Perceptions," Information Systems Research, INFORMS, vol. 15(1), pages 60-86, March.
    9. Martine R. Haas & Morten T. Hansen, 2007. "Different knowledge, different benefits: toward a productivity perspective on knowledge sharing in organizations," Strategic Management Journal, Wiley Blackwell, vol. 28(11), pages 1133-1153, November.
    10. Joel O. Wooten & Karl T. Ulrich, 2017. "Idea Generation and the Role of Feedback: Evidence from Field Experiments with Innovation Tournaments," Production and Operations Management, Production and Operations Management Society, vol. 26(1), pages 80-99, January.
    11. Marianne Bertrand & Esther Duflo & Sendhil Mullainathan, 2004. "How Much Should We Trust Differences-In-Differences Estimates?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 119(1), pages 249-275.
    12. Elina H. Hwang & Param Vir Singh & Linda Argote, 2019. "Jack of All, Master of Some: Information Network and Innovation in Crowdsourcing Communities," Information Systems Research, INFORMS, vol. 30(2), pages 389-410, June.
    13. James G. March, 1991. "Exploration and Exploitation in Organizational Learning," Organization Science, INFORMS, vol. 2(1), pages 71-87, February.
    14. Ann Majchrzak & Arvind Malhotra, 2016. "Effect of Knowledge-Sharing Trajectories on Innovative Outcomes in Temporary Online Crowds," Information Systems Research, INFORMS, vol. 27(4), pages 685-703, December.
    15. Zhang, Yixiang & Fang, Yulin & Wei, Kwok-Kee & Chen, Huaping, 2010. "Exploring the role of psychological safety in promoting the intention to continue sharing knowledge in virtual communities," International Journal of Information Management, Elsevier, vol. 30(5), pages 425-436.
    16. Christian Terwiesch & Yi Xu, 2008. "Innovation Contests, Open Innovation, and Multiagent Problem Solving," Management Science, INFORMS, vol. 54(9), pages 1529-1543, September.
    17. Yan Huang & Param Vir Singh & Kannan Srinivasan, 2014. "Crowdsourcing New Product Ideas Under Consumer Learning," Management Science, INFORMS, vol. 60(9), pages 2138-2159, September.
    18. Kevin J. Boudreau & Nicola Lacetera & Karim R. Lakhani, 2011. "Incentives and Problem Uncertainty in Innovation Contests: An Empirical Analysis," Management Science, INFORMS, vol. 57(5), pages 843-863, May.
    19. Lars Bo Jeppesen & Karim R. Lakhani, 2010. "Marginality and Problem-Solving Effectiveness in Broadcast Search," Organization Science, INFORMS, vol. 21(5), pages 1016-1033, October.
    20. Dong-Gil Ko & Alan R. Dennis, 2011. "Profiting from Knowledge Management: The Impact of Time and Experience," Information Systems Research, INFORMS, vol. 22(1), pages 134-152, March.
    21. Michiel De Boer & Frans A. J. Van Den Bosch & Henk W. Volberda, 1999. "Managing Organizational Knowledge Integration in the Emerging Multimedia Complex," Journal of Management Studies, Wiley Blackwell, vol. 36(3), pages 379-398, May.
    22. Jeffrey H. Dyer & Kentaro Nobeoka, 2000. "Creating and managing a high‐performance knowledge‐sharing network: the Toyota case," Strategic Management Journal, Wiley Blackwell, vol. 21(3), pages 345-367, March.
    23. John A. List & Daan van Soest & Jan Stoop & Haiwen Zhou, 2020. "On the Role of Group Size in Tournaments: Theory and Evidence from Laboratory and Field Experiments," Management Science, INFORMS, vol. 66(10), pages 4359-4377, October.
    24. Gordon Burtch & Seth Carnahan & Brad N. Greenwood, 2018. "Can You Gig It? An Empirical Examination of the Gig Economy and Entrepreneurial Activity," Management Science, INFORMS, vol. 64(12), pages 5497-5520, December.
    25. Ho Cheung Brian Lee & Sulin Ba & Xinxin Li & Jan Stallaert, 2018. "Salience Bias in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 29(2), pages 401-418, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Hyeon Jo & Youngsok Bang, 2023. "RETRACTED ARTICLE: Factors influencing continuance intention of participants in crowdsourcing," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-13, December.
    2. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.
    3. Pallab Sanyal & Shun Ye, 2024. "An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type," Information Systems Research, INFORMS, vol. 35(1), pages 394-413, March.
    4. Li, Libo & Yu, Huan & Kunc, Martin, 2024. "The impact of forum content on data science open innovation performance: A system dynamics-based causal machine learning approach," Technological Forecasting and Social Change, Elsevier, vol. 198(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Christoph Riedl & Victor P. Seidel, 2018. "Learning from Mixed Signals in Online Innovation Communities," Organization Science, INFORMS, vol. 29(6), pages 1010-1032, December.
    2. Jiao, Yuanyuan & Wu, Yepeng & Lu, Steven, 2021. "The role of crowdsourcing in product design: The moderating effect of user expertise and network connectivity," Technology in Society, Elsevier, vol. 64(C).
    3. Ho Cheung Brian Lee & Sulin Ba & Xinxin Li & Jan Stallaert, 2018. "Salience Bias in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 29(2), pages 401-418, June.
    4. Swanand J. Deodhar & Samrat Gupta, 2023. "The Impact of Social Reputation Features in Innovation Tournaments: Evidence from a Natural Experiment," Information Systems Research, INFORMS, vol. 34(1), pages 178-193, March.
    5. Jesse Bockstedt & Cheryl Druehl & Anant Mishra, 2022. "Incentives and Stars: Competition in Innovation Contests with Participant and Submission Visibility," Production and Operations Management, Production and Operations Management Society, vol. 31(3), pages 1372-1393, March.
    6. Ma, Danni & Fee, Anthony & Grabowski, Simone & Scerri, Moira, 2022. "Dual Organizational Identification in Multinational Enterprises and Interpersonal Horizontal Knowledge Sharing: A Conceptual Model," Journal of International Management, Elsevier, vol. 28(1).
    7. Nirup Menon & Anant Mishra & Shun Ye, 2020. "Beyond Related Experience: Upstream vs. Downstream Experience in Innovation Contest Platforms with Interdependent Problem Domains," Manufacturing & Service Operations Management, INFORMS, vol. 22(5), pages 1045-1065, September.
    8. Tat Koon Koh & Muller Y. M. Cheung, 2022. "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research, INFORMS, vol. 33(1), pages 265-284, March.
    9. Pallab Sanyal & Shun Ye, 2024. "An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type," Information Systems Research, INFORMS, vol. 35(1), pages 394-413, March.
    10. Tat Koon Koh, 2019. "Adopting Seekers’ Solution Exemplars in Crowdsourcing Ideation Contests: Antecedents and Consequences," Information Systems Research, INFORMS, vol. 30(2), pages 486-506, June.
    11. Joel O. Wooten, 2022. "Leaps in innovation and the Bannister effect in contests," Production and Operations Management, Production and Operations Management Society, vol. 31(6), pages 2646-2663, June.
    12. Salgado, Stéphane & Hemonnet-Goujot, Aurelie & Henard, David H. & de Barnier, Virginie, 2020. "The dynamics of innovation contest experience: An integrated framework from the customer’s perspective," Journal of Business Research, Elsevier, vol. 117(C), pages 29-43.
    13. Daniel P. Gross, 2020. "Creativity Under Fire: The Effects of Competition on Creative Production," The Review of Economics and Statistics, MIT Press, vol. 102(3), pages 583-599, July.
    14. Pollok, Patrick & Lüttgens, Dirk & Piller, Frank T., 2019. "Attracting solutions in crowdsourcing contests: The role of knowledge distance, identity disclosure, and seeker status," Research Policy, Elsevier, vol. 48(1), pages 98-114.
    15. Linda Argote & Sunkee Lee & Jisoo Park, 2021. "Organizational Learning Processes and Outcomes: Major Findings and Future Research Directions," Management Science, INFORMS, vol. 67(9), pages 5399-5429, September.
    16. Hu, Feng & Bijmolt, Tammo H.A. & Huizingh, Eelko K.R.E., 2020. "The impact of innovation contest briefs on the quality of solvers and solutions," Technovation, Elsevier, vol. 90.
    17. Laurence Ales & Soo‐Haeng Cho & Ersin Körpeoğlu, 2021. "Innovation Tournaments with Multiple Contributors," Production and Operations Management, Production and Operations Management Society, vol. 30(6), pages 1772-1784, June.
    18. Shunyuan Zhang & Param Vir Singh & Anindya Ghose, 2019. "A Structural Analysis of the Role of Superstars in Crowdsourcing Contests," Service Science, INFORMS, vol. 30(1), pages 15-33, March.
    19. Christoph Riedl & Tom Grad & Christopher Lettl, 2024. "Competition and Collaboration in Crowdsourcing Communities: What happens when peers evaluate each other?," Papers 2404.14141, arXiv.org.
    20. Elina H. Hwang & Param Vir Singh & Linda Argote, 2019. "Jack of All, Master of Some: Information Network and Innovation in Crowdsourcing Communities," Information Systems Research, INFORMS, vol. 30(2), pages 389-410, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orisre:v:32:y:2021:i:3:p:836-859. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.