Authors:
- Pallab Sanyal (School of Business, George Mason University, Fairfax, Virginia 22030)
- Shun Ye (School of Business, George Mason University, Fairfax, Virginia 22030)
Abstract
As more businesses turn to crowdsourcing platforms for solutions to business problems, determining how to manage sourcing contests based on their objectives has become critically important. Existing research, both theoretical and empirical, studies the impact of a variety of contest and contestant characteristics on the outcomes of these contests. Beyond these static design parameters, one lever organizations (clients) can use to dynamically steer contests toward desirable goals is the feedback offered to the contestants (solvers) during the contest. Although a handful of recent studies examine the effects of feedback at a high level (e.g., volume, valence), to the best of our knowledge, none has examined the effects of the information contained in the feedback. Furthermore, existing studies focus solely on the quality of the submissions and not on other critical contest outcomes, such as the diversity of the submissions, which the creativity and innovation literature finds to be significant. In this study, first, drawing on the psychology literature on feedback intervention theory, we classify client feedback into two types: outcome and process. Second, using data from almost 12,000 design contests, we empirically examine the effects of the two types of feedback on the convergence and diversity of submissions following feedback interventions. We find that process feedback, which provides goal-oriented information to solvers, fosters convergent thinking, leading to submissions that are similar. Although outcome feedback lacks the informative value of process feedback, it encourages divergent thinking, the ability to produce a variety of solutions to a problem. Furthermore, we find that these effects are strengthened when the feedback is provided earlier in the contest rather than later.
Based on our findings, we offer insights on how practitioners can strategically use an appropriate form of feedback to either generate greater diversity of solutions or efficient convergence to an acceptable solution.
Suggested Citation
Pallab Sanyal & Shun Ye, 2024.
"An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type,"
Information Systems Research, INFORMS, vol. 35(1), pages 394-413, March.
Handle:
RePEc:inm:orisre:v:35:y:2024:i:1:p:394-413
DOI: 10.1287/isre.2023.1232