Optimal Aggregation of Consumer Ratings: An Application to Yelp.com
Abstract: Consumer review websites such as Yelp.com leverage the wisdom of the crowd, with each product being reviewed many times (some with more than 1000 reviews). Because of this, the way in which information is aggregated is a central decision faced by consumer review websites. Given a set of reviews, what is the optimal way to construct an average rating? We offer a structural approach to answering this question, allowing for (1) reviewers to vary in stringency (some reviewers tend to leave worse reviews on average) and accuracy (some reviewers are more erratic than others), (2) reviewers to be influenced by existing reviews, and (3) product quality to change over time. We apply this approach to reviews from Yelp.com to derive optimal ratings for each restaurant (in contrast with the arithmetic average displayed by Yelp). Because we have the history of reviews for each restaurant and many reviews left by each reviewer, we are able to identify these factors using variation in ratings within and across reviewers and restaurants. Using our estimated parameters, we construct optimal ratings for all restaurants on Yelp, and compare them to the arithmetic averages displayed by Yelp. As of the end of our sample, a conservative finding is that roughly 25-27% of restaurants are more than 0.15 stars away from the optimal rating, and 8-10% of restaurants are more than 0.25 stars away from the optimal rating. This suggests that large gains could be made by implementing optimal ratings. Much of the gains come from our method responding more quickly to changes in a restaurant's quality. Our algorithm can be flexibly applied to many different review settings.
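The abstract describes three adjustments that distinguish an optimal rating from a simple arithmetic average: correcting for reviewer stringency, weighting by reviewer accuracy, and responding to changes in quality over time. The sketch below is an illustrative toy version of that idea, not the paper's actual structural estimator; the decay factor, stringency, and precision values are hypothetical stand-ins for parameters the paper estimates from data.

```python
# Toy sketch of a stringency-corrected, precision-weighted, recency-weighted
# rating, contrasted with the arithmetic average a site like Yelp displays.
# All parameter values are hypothetical, not the paper's estimates.

def optimal_rating(reviews, stringency, precision, decay=0.9):
    """reviews: list of (reviewer_id, age_in_periods, stars).
    stringency[r]: reviewer r's average deviation from true quality.
    precision[r]: inverse of reviewer r's error variance (accuracy).
    decay: geometric down-weighting of older reviews, so the rating
    responds more quickly when a restaurant's quality changes."""
    num = den = 0.0
    for reviewer, age, stars in reviews:
        w = precision[reviewer] * (decay ** age)   # accuracy x recency weight
        num += w * (stars - stringency[reviewer])  # remove reviewer's bias
        den += w
    return num / den

def arithmetic_mean(reviews):
    """The unweighted average Yelp displays."""
    return sum(stars for _, _, stars in reviews) / len(reviews)
```

For example, a recent 5-star review from an accurate but harsh reviewer pulls the optimal rating above the arithmetic mean, because the correction removes the reviewer's harshness and the recency weight emphasizes the newest signal.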
Bibliographic Info: Paper provided by National Bureau of Economic Research, Inc in its series NBER Working Papers with number 18567.
Date of creation: Nov 2012
Contact details of provider:
Postal: National Bureau of Economic Research, 1050 Massachusetts Avenue Cambridge, MA 02138, U.S.A.
Web page: http://www.nber.org
JEL classification:
- D8 - Microeconomics - - Information, Knowledge, and Uncertainty
- L15 - Industrial Organization - - Market Structure, Firm Strategy, and Market Performance - - - Information and Product Quality
- L86 - Industrial Organization - - Industry Studies: Services - - - Information and Internet Services; Computer Software
This paper has been announced in the following NEP Reports:
- NEP-ALL-2012-12-15 (All new papers)
Citations (as recorded by the CitEc Project):
- Benjamin Edelman & Michael Luca, 2014. "Digital Discrimination: The Case of Airbnb.com," Harvard Business School Working Papers 14-054, Harvard Business School.
- Michael Luca & Georgios Zervas, 2013. "Fake It Till You Make It: Reputation, Competition, and Yelp Review Fraud," Harvard Business School Working Papers 14-006, Harvard Business School.