|Okay, so... since I've been busy slamming local reviews|
Suggestions for how to 'fix' local reviews
| 2:45 am on Dec 9, 2004 (gmt 0)|
1. Ban repeat reviews by limiting each IP address to one review per subject. Problem: AOL assigns IP addresses randomly as you log on. (No, I don't use AOL.) Alternative: can you get lists of the blocks of IP addresses assigned to AOL in a region and work backwards into a solution that way?
2. Flood control: if a company is suddenly subjected to an unusual number of reviews in a short period of time, have your system hit the brakes.
3. IP block control? If the reviewers are expected to be mostly local, can you determine the regional IP blocks and limit the wild hares? That is, reviews falling outside the regional IP blocks are flagged and require further corroboration?
4. Off-the-continent IP blocks. I predict this business of 'rate Joe's diner' will lead to the outsourcing of 'reviewers'. What do I mean? I mean that there will be cadres of professional reviewers who, for a small fee, will give Joe's diner a boost. So, what to do? Perhaps you have to block some big blocks of IP addresses. Say your review relates to a Boston diner. Might you decide to block Asia, central Europe, etc. from posting reviews? How plausible?
5. Require some level of corroborating work before a review is posted? Say, require a date and time when they actually visited the business. Check that against a database of the business hours? More work, perhaps too much, but . . . can you think of other ways you might program the system to require an initial reality check? Perhaps they have to first enter a local or regional phone number that can be validated?
6. What else? (Please feel free to slam my head in the door for being so clueless about the limits of what can be done on the basis of IP controls.)
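The flood-control idea in point 2 can be sketched as a simple sliding-window counter per business. This is just an illustration of the approach; the class name `ReviewThrottle` and the thresholds are invented for the example, not taken from any real system:

```python
from collections import defaultdict, deque

class ReviewThrottle:
    """Flag a business when reviews arrive faster than a set rate.

    Sliding-window counter: if more than `max_reviews` arrive for one
    business within `window_seconds`, further reviews are held for
    moderation instead of being published immediately.
    """

    def __init__(self, max_reviews=5, window_seconds=3600):
        self.max_reviews = max_reviews
        self.window = window_seconds
        self.timestamps = defaultdict(deque)  # business_id -> review times

    def allow(self, business_id, now):
        """Return True if this review may be published right away."""
        times = self.timestamps[business_id]
        # Drop timestamps that have aged out of the window.
        while times and now - times[0] > self.window:
            times.popleft()
        times.append(now)
        return len(times) <= self.max_reviews
```

The same structure, keyed on reviewer IP instead of business ID, would cover point 1's "one review per subject" limit, subject to the AOL caveat above.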
My point is simply this: I've been busy talking down the idea that local reviews will be a boon to the growth of local info sites. I see many problems. However, my nature is to see a problem and then attempt to fix it.
So, what are all the problems you can list or conceive of regarding reviews of local businesses AND what can you think of that might be a solution - or an approach to a solution - that might address THAT particular problem(s)?
This is a brainstorming or share your solutions post.
| 10:40 pm on Dec 9, 2004 (gmt 0)|
Hmmmm. Maybe it's not that bad? Maybe it's all working rather nicely? No issues on the horizon? Okay. My bad. I thought I heard a skunk stirring in the woodpile.
And here I thought it was broken. Or, maybe I've just exhausted all the possible fixes in my 1st post in this thread?
| 11:05 pm on Dec 9, 2004 (gmt 0)|
Webwork, as big a problem as local user review manipulation has the potential to be, an even bigger problem is getting people to take the time to leave reviews in the first place.
At least if folks are manipulating the reviews, your user review system is perceived to have value.
So the last thing you want to do, imho, is to put up barriers to leaving a review besides a basic registration. So things like premoderation, you have to qualify to leave a review, etc. - these things are not the way to go - at least at first.
Things you can do to make it hard to cheat or to minimize the impact of cheaters:
- Only registered users can rate, and each rater needs a verified email address. If the account isn't verified, all of that user's postings get removed.
- Identify characteristics that would seem to reek of cheating - things like an unnatural change in cumulative rating, a surprising number of new ratings of a particular item in a certain period of time, identical user passwords showing up in a certain period of time, etc. Build filters that bring these occurrences to an admin's attention for human review.
- User transparency. Make it clear which posters have only left one review, and which ones are trusted members of the community.
- Put clear buyer beware language on your site as well regarding the accuracy of user driven content.
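The "unnatural change in cumulative rating" filter above could be sketched like this. The function name, thresholds, and the simple average-shift test are illustrative assumptions, not a proven fraud-detection formula:

```python
def flag_rating_anomaly(history, recent, max_shift=1.0, min_history=10):
    """Flag an item whose recent ratings diverge sharply from its history.

    history: list of all prior ratings (e.g. 1-5 stars)
    recent:  list of ratings from the period being checked
    Returns True when there are enough prior ratings to form a baseline
    and the recent average moved more than `max_shift` stars from it.
    """
    if len(history) < min_history or not recent:
        return False  # not enough data to judge
    baseline = sum(history) / len(history)
    current = sum(recent) / len(recent)
    return abs(current - baseline) > max_shift
```

A flagged item would go into an admin queue for human review rather than being auto-removed, consistent with the "filters bring these to an admin's attention" approach.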
And what about this situation? A local merchant sends 30 of its employees, friends, and family to rate their establishment. All unique individuals, all unique emails, all being honest in their assessment of the merchant as "great." These ratings rocket the merchant to the top of their category. Would you consider these ratings legit or bogus?
This very situation happened on our network today. I don't think there is any way to "solve" this problem, if it even is a problem. We typically deal with these sorts of situations by instituting a "cool down" period for the item at hand by pulling it from the listings for a week.
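The cool-down approach described above could be as simple as a timestamp check when building the listings page. This is a sketch under assumed data shapes; the `pulled_at` field name is invented for the example:

```python
from datetime import datetime, timedelta

COOL_DOWN = timedelta(days=7)

def visible_listings(listings, now=None):
    """Return only listings whose cool-down period (if any) has expired.

    Each listing is a dict; an admin sets 'pulled_at' when a suspicious
    ratings burst puts the item into its one-week cool-down.
    """
    now = now or datetime.utcnow()
    return [item for item in listings
            if item.get("pulled_at") is None
            or now - item["pulled_at"] >= COOL_DOWN]
```

The item's ratings are preserved; it simply drops out of the rankings for a week, taking the wind out of any coordinated push without requiring a judgment call on whether each individual rating was honest.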
Google's unwillingness to play in this space (so far) should not be surprising given the amount of human intervention required.
| 1:14 am on Dec 10, 2004 (gmt 0)|
|- User transparency. Make it clear which posters have only left one review, and which ones are trusted members of the community. |
Maybe I'm reading this wrong, but what makes a person who has submitted only one review less credible?
Here's a personal example of why I ask. I'm completely new to the local search and user review concept.
So following suggestions to investigate further from posters here I did some varied local searches for local business that I frequent.
In the process I submitted the only user review I've ever written.
Are you telling me that review is less credible because it's the only one I've submitted?
That hardly seems logical, especially considering that one of the difficulties of user reviews is getting people to submit them.
If you devalue one-off reviewers' comments, aren't you encouraging multiple, or mass, submissions? And if so, wouldn't those mass submissions actually be more likely to carry less real value in the system?
| 1:27 am on Dec 10, 2004 (gmt 0)|
Just because someone isn't a 'trusted member of the community' doesn't mean they don't deserve to be trusted.
I hear your point, but from a publisher's perspective, it's all about trying to give your readers enough information to make an educated call on how trustworthy the review is.
To take an extreme example, the WW member "Chicago" has a lot of credibility in this thread because he is the Moderator of the Local Search forum.
That doesn't mean he is necessarily any more credible on the subject than AlanS, who recently made his first post in another Local Search topic - "Is G Disadvantaged"... - for all we know, AlanS could really be Sukinder Singh, the lead for Google Local, and extremely credible on the subject.
But the point is, we know that Chicago has some degree of credibility, based on the moderator icon next to his name, and his posting history on WebmasterWorld.
AlanS we know nothing about - other than he left one quality post. Maybe he wrote the book on Local Search. Maybe he works for Google - and wants to pitch his employer. Maybe he knows nothing about the industry.
By displaying each user's post history, WebmasterWorld provides us some clue as to the POSSIBLE credibility of the poster.
It's the same thing with local user reviews.
| 1:30 am on Dec 10, 2004 (gmt 0)|
Sorry, one more point - I'm not recommending that any ratings system devalue the ratings from one-off posters. I'm just recommending that those sites disclose who the one-off posters are, and who the heavy contributors are.