How Opinions are Received by Online Communities: A Case Study on Amazon.com Helpfulness Votes

Presented at: 18th International World Wide Web Conference (WWW2009)

by Cristian Danescu-Niculescu-Mizil, Gueorgi Kossinets, Jon Kleinberg, Lillian Lee

Webpage: http://www2009.eprints.org/15/1/p141.pdf

There are many online settings in which users publicly express opinions. A number of these offer mechanisms for other users to evaluate these opinions; a canonical example is Amazon.com, where reviews come with annotations like "26 of 32 people found the following review helpful." Opinion evaluation appears in many offline settings as well, including market research and political campaigns. Reasoning about the evaluation of an opinion is fundamentally different from reasoning about the opinion itself: rather than asking, "What did Y think of X?", we are asking, "What did Z think of Y's opinion of X?" Here we develop a framework for analyzing and modeling opinion evaluation, using a large-scale collection of Amazon book reviews as a dataset. We find that the perceived helpfulness of a review depends not just on its content but also, in subtle ways, on how the expressed evaluation relates to other evaluations of the same product. As part of our approach, we develop novel methods that take advantage of the phenomenon of review "plagiarism" to control for the effects of text in opinion evaluation, and we provide a simple and natural mathematical model consistent with our findings. Our analysis also allows us to distinguish among the predictions of competing theories from sociology and social psychology, and to discover unexpected differences in the collective opinion-evaluation behavior of user populations from different countries.

Categories and Subject Descriptors: H.2.8 [Database Management]: Database Applications – Data Mining

General Terms: Measurement, Theory

Keywords: Review helpfulness, review utility, social influence, online communities, sentiment analysis, opinion mining, plagiarism.



Resource URI on the dog food server: http://data.semanticweb.org/conference/www/2009/paper/15
