
For retailers and consumers alike, shopping has become something of an exercise in hand-to-hand combat.
Consumers arrive at their digital devices looking for a hotel room or an outdoor grill and find themselves
immediately overwhelmed with choices.
A quick scan for price can narrow the options a bit, but separating a great hotel from a cheap-but-creepy one is still very hard
indeed. Star ratings seem to have lost any power to differentiate. Almost all reviews are now 4 stars.
But here’s where consumer-generated content comes to the rescue.
Real consumers, real voices, real opinions -- at least, that’s what you hope.
So, with Amazon’s Prime Day now on everyone’s mind, you can’t blame consumers for a bit of skepticism
about which product “deals” are really deals.
The Competition and Markets Authority (CMA) told The Guardian about “troubling evidence of a thriving marketplace for fake
online reviews” and recommended action.
The CMA’s investigations found that fake reviews for cash were a growing problem on Facebook and eBay. The CMA also targeted Airbnb, getting
it to limit reviewers to customers who had actually booked and stayed at a property.
And while enforcement is one answer, building technology that can separate real customers from bots and
fakes may be the better solution. TurnTo Networks builds software that verifies reviewers and makes sure they’re actual customers.
“The one problem that’s much more widespread
is that when it’s hard to write a review, the people who end up reviewing are the angry ones. It’s distorted,” TurnTo’s CEO and founder George Eberstadt told The Denver
Post.
“What ends up happening is that reviews, on the whole, don’t reflect the perfectly happy middle. If incentives get the perfectly happy people to say,
‘It’s fine,’ we see that as a service.”
So how can consumers tell which reviews come from real customers and which are compensated? Services like SnagShout, Reviewbox,
VocalPoint, and others play inside the lines at Amazon, offering consumers free products to sample and review.
So fake or honest? Good data is hard to find.
A post from Rafe Needleman on BestReviews suggests what we all fear: that the ratings are pretty much rigged. “We looked at 360,000 user ratings across 488 products in various categories
and found what is clearly an unrealistic proportion of 5-star ratings,” Needleman posted on Yahoo.
"Other consumer reviews sites have more varied and realistic distributions of user
scores. The closest site to Amazon in its distribution of overwhelmingly positive reviews is probably the ticket-selling site Fandango, where a FiveThirtyEight story from October 2015 shows a
proportion of 4.5- and 5-star reviews of about 50%. That’s high, but it’s still less than Amazon, where our data shows that 66.3% of user-generated ratings are 5-star.”
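The check Needleman describes boils down to counting how the star levels split across a pile of ratings. Here is a minimal sketch of that kind of tally in Python, assuming a made-up sample of ratings -- the `sample` list and the `rating_distribution` helper are purely illustrative, not BestReviews’ actual data or code:

```python
from collections import Counter

def rating_distribution(ratings):
    """Return the share of each star level (1-5) in a list of ratings."""
    counts = Counter(ratings)
    total = len(ratings)
    return {star: counts.get(star, 0) / total for star in range(1, 6)}

# Hypothetical sample skewed toward 5 stars, standing in for the
# 360,000 real ratings the BestReviews post analyzed.
sample = [5] * 663 + [4] * 180 + [3] * 70 + [2] * 40 + [1] * 47

dist = rating_distribution(sample)
for star in range(5, 0, -1):
    print(f"{star} stars: {dist[star]:.1%}")

# A 5-star share well above ~50% is the kind of skew the article flags.
print("5-star share:", f"{dist[5]:.1%}")
```

Run on a realistic review site, you would expect the shares to spread out across the middle of the scale; a distribution where two-thirds of ratings land on 5 stars is what raises the red flag.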
But the biggest puzzle may be Amazon Social Q&A. The questions are often incredibly obvious -- and the answers often puzzling. “Does this fixture need a plug?” asks one
potential buyer of a light fixture that is clearly marked as being installable in a ceiling electrical box.
Amazon doesn’t want to be in the middle of answering foolish questions, so it pushes them out to previous buyers.
Often the answer is “I don’t know” or “I returned it.” Also weird.
The only thing I can figure is that it’s a rigged game. Amazon says its AI
isn’t yet smart enough to tell a relevant answer from complete nonsense, as long as you type some words on the screen.
Consumer-generated content is a
powerful tool. It’s authentic, relevant, and above all, cheap. But its value does seem to be diminished by fakes and fraudsters who are gaming the system.
So whatever you
buy on Prime Day, it’s buyer beware.