My initial instincts are that Google's got the same types of historical tracking on review and ratings acquisition as they do on backlinks. I feel it's only a matter of time before there's enough data to make a statistical evaluation, which I think is basically how much of the Google algo is currently working.
Over the years I've observed companies that have tried to sell fake reviews to clients drop off the map... and I've seen sites that bought the reviews disappear too.
Right now, I'm seeing lots of variation in the results... stuff that looks like crap is in the top 10 for a few days; then it vanishes; then it comes back, sometimes in a different position.
Results just before most major changes can look extremely bad. It's been described as letting the junk rise to the surface so they can skim it off. I think that to get a large enough sample of what they're trying to get rid of, Google needs to err on the side of allowing it for a while.
It's a mistake to regard Google as amateurs in the long term. Lots of folks have made money, though, exploiting vulnerabilities for a short time. I gather that's getting harder and harder to do.
And a PS on this one... I'll bet that they're correlating ratings data with social and traffic data and with user behavior, but that the correlations will take time to be collected and to kick in. Just a hunch.
I hope you're right - And it may well be a trap.
I'm going to sit back and hope my company recovers; I've implemented all their widgetry and G+ author rels. I'm forcing thousands and thousands of my 'very uninterested in social media' members to sign up to Google+ and become contributors; I feel a little ill.
Apologies for this mini rant:
When a big ship is rocked this hard, this often, it will eventually capsize. They want a web full of 'quality sites'? Stop dishing out tsunamis, or we'll have a web full of the 'hardest pirates'.
I suggested exactly the same thing in a thread on here yesterday (only partly tongue in cheek) - it is very disappointing and increasingly common to see sites implementing the microdata for no reason except to please and trick G. So much for 'always do what's best for your users'...
But I don't think they are going to disappear any day soon because they are
(1) social, which G love at the moment
(2) a way for G to pull together information about things into mashup pages, for example for travel destinations, and generate income from them. The way I see it, the stars they show in the SERPS are their 'reward' to encourage webmasters to use them - and in due course they are likely to combine the information for their own goals.
Likewise their current task of accumulating millions of items of data about everything possible (to be shown in search results later this year, I think - see [webmasterworld.com...]) will also be made easier by the use of microdata, but is likely to benefit G rather more than webmasters.
Someone less cynical might say it also enables them to provide more accurate search results...
I'm sure that Google will detect algorithmically fake ratings in the future.
However, so far it seems that every kind of manipulation works. I've seen sites with microdata but no rating option at all, as well as sites that start out showing "2456 ratings" for an entry. So far Google seems to ignore this problem - all these sites are still showing rating data.
|I'm sure that Google will detect algorithmically fake ratings in the future. |
Fake ratings are worse than over-optimization. It is black hat, cheating both the system and searchers. Google should detect them and apply a sitewide -500 penalty or something else that is worse than being pandalized.
|it is very disappointing and increasingly common to see sites implementing the microdata for no reason except to please and trick G. So much for 'always do what's best for your users |
This doesn't even make sense. Why do you think microdata even exists?
We just added rating rich snippets (legitimate to our actual reviews) and they started to show up in the SERPs yesterday. Hoping it will help increase our CTR and make up for some of the lost traffic. I did think about how it wasn't possible to police the ratings though...
Netmeg, I don't know what doesn't make sense but I'm pretty sure microdata doesn't exist to get one over on google but rather to help them to improve their SERPS - but that doesn't work if the results aren't policed in any way, it just becomes another grey hat SEO technique.
Real world example: yesterday I was looking at a profitable search result and noticed the first result had 5* showing and a 'review' - but I knew the site was just an iframe and a menu supported by a lot of link exchanges so I was surprised that they had the stars showing or that anyone would submit a genuine review to an affiliate iframe site.
It was clear when I looked at it that the rating was fictitious and added just to get the stars in google serps. Which had worked, so well done to them I suppose. But it means everyone else on the page will now do the same thing.
If a poor site with reviews ranks higher than a good site without reviews, or at least stands out better in the serps because of the stars, then it becomes yet another 'do it to please google' task rather than a 'just carry on doing what's best for my site visitors' task (like splitting stuff off to subdomains without altering it, or adding needless text to photo gallery pages, so a site doesn't get hit by Panda, etc.).
Incidentally for certain things I think stars are an excellent idea - hotels where reviews are submitted after people have stayed etc, but think they should only show in the serps for sites like these where the ratings are verifiable, the sites are reputable and the rankings very likely to be genuine.
I am thinking that the review stars could definitely back fire in the world of Panda. For instance....someone clicks on a result because they assume there are actual reviews to be read. When they land on the page they instantly click the "back" button because there are no actual reviews.
a competitor just added the code to their site and now Google shows 5 stars under their description. They are the only site with the snippet, so it really stands out. I imagine this will have a huge impact on their CTR. I wonder if Google is going to encourage this or penalize. After all, it can be manipulated (get all your friends and fans to go to your site and click '5 stars').
Google is certainly encouraging the use of microdata in general. Pretty sure I know where they stand on fake reviews. But they have lots of data, and I suspect that after a while patterns will emerge.
|Google is certainly encouraging the use of microdata in general. Pretty sure I know where they stand on fake reviews. But they have lots of data, and I suspect that after a while patterns will emerge |
True. But how will they be able to determine that the 2,000 votes showing in the snippet are fake? People have to visit the site then vote. If a real visitor generated the vote, then it might be hard to deem it a fake. Apparently for this site, they have had 2,000+ people vote. I honestly don't understand how this thing works, so I am just thinking out loud. I am tempted to add it, if there is a way to do it legitimately (i.e., let my visitors cast an honest rating for my site). Can't be any different than votes being cast on Amazon.
It's also not any different than Facebook Likes or +1's. Ever read about bogus Amazon reviews? They're practically a meme unto themselves. There will always be people gaming any system.
Add the ratings/reviews if you think it will help your users make decisions. That's what I did. On sites where it doesn't make sense, I don't add them.
(Also, it's a lot easier to get legitimate users to rate things than to write reviews, as I have found. People don't mind clicking on stars, but it's not always easy to get them to actually WRITE something, unless they're ticked off)
Netmeg, how long did it take for the ratings to show up in the serps? Did it require a certain number of reviews before they appeared? Also, did the rating snippet increase the CTR to your site (or decrease CTR if a page has a low rating)?
We've only had the ratings a few days, and only on a couple of the most highly visited URLs, so far. I think we launched them on March 18. The rest of the rich snippets seemed to show up in just a matter of a few days.
I believe these ratings don't do websites any good.
For example, a person is searching for what others think about a hotel. The SERP shows that website N rates this hotel 4 stars. That's it. Why would that person visit website N? I would not.
100% of searchers will visit a page without the rating - they have to, if they want to know about the hotel. With the rating, some will visit and some will not. Why would they, if the rating is already presented right in the SERP?
This is another attempt by Google to get information from websites for free and provide it on their own pages (in the SERP).
I refused to add this microdata. And in fact any microdata, that can be shown in SERP.
Nobody says you have to.
|I am thinking that the review stars could definitely back fire in the world of Panda. For instance....someone clicks on a result because they assume there are actual reviews to be read. When they land on the page they instantly click the "back" button because there are no actual reviews. |
Or they spend a long time looking for the non-existent review, in the course of which they read through the navigation and find something interesting to click through to.
I'd say it should be easy for Google to spot the blatant cheats; they know a great deal about visitor levels to sites and where people go (so sites with little traffic, unless very niche, are not going to get 2356 reviews for a single item; they could also find out whether toolbar/Chrome users are actually using any review system). That does not mean that Google is currently using that data, or even collecting it in that particular way.
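The traffic-versus-review-count plausibility check described above could be sketched roughly like this. Everything here is hypothetical: the function name, the 2% review rate, and the numbers are invented for illustration, not anything Google has disclosed.

```python
def review_count_plausible(monthly_visitors: int, review_count: int,
                           max_review_rate: float = 0.02) -> bool:
    """Flag review totals that outstrip what a site's traffic could produce.

    Assumes (hypothetically) that at most ~2% of visitors ever leave a
    review; a site claiming far more reviews than that looks suspect.
    """
    if review_count == 0:
        return True  # nothing claimed, nothing to doubt
    if monthly_visitors <= 0:
        return False  # reviews with no measurable traffic are suspect
    return review_count <= monthly_visitors * max_review_rate

# A low-traffic site claiming 2356 reviews for one item stands out:
print(review_count_plausible(monthly_visitors=5000, review_count=2356))    # False
print(review_count_plausible(monthly_visitors=500000, review_count=2356))  # True
```

The point of the sketch is only that the signal is cheap to compute once you have traffic data - which is exactly the caveat above: it assumes Google is collecting that data in a usable form.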
I'd steer clear of doing anything unless you really do have the data to back it up.
I think an interesting area to look at is where "reviews" pop up for things that are hard to get reviews for (or even for things that do not have legitimate reviews at all - entire sectors can fall into this definition). If companies such as Google, Yelp and many others cannot get enough people interested in reviewing some verticals then you have to wonder if a goldmine of "reviews" on a small site is genuine (it's either a really good genuine site or it's one that should be dumped from the SERPs).
As for reviews being hard to get, that's so true that recommendations are pretty much the only way to get enough data in many areas. Add into that equation the need for search engines to know who a person is to be able to trust a review/recommendation (statistically speaking) and you can see why Facebook has another ace card that is scaring Google to the extent that they are going all-in on Google+.
I think we are seeing moves by Google that are motivated by the existential threat that Facebook poses, and so far Google has not struck upon a winning strategy (they, or someone else that they buy, may find a way to extinguish or reduce that threat). A big problem for Google is that advertising revenue and social recommendation features do not sit very well together; both Google and Facebook are probably working on how you monetise a system that is driven by public opinion.
Yea, I've noticed this.
Try searching google for "IP lookup". Some of the IP lookup services have star ratings in the hundreds. Why on earth would an IP lookup service have ratings? LOL.
If you click through, there are no reviews or ratings, but the stars next to the search result sure makes them stick out over the others.
I mentioned this in the SERP changes thread, but there were some pretty helpful rich snippets for ex. software applications from many sites like the Android Market which included ratings, software icons and other details. These rich snippets now seem to have been disabled except for the star ratings.
Most of the time, I have found these rich snippets to be helpful, including the star ratings.
>> Try searching google for "IP lookup". Some of the IP lookup services have star ratings in the hundreds. Why on earth would an IP lookup service have ratings? LOL.
I only saw one, but the IP lookup service they are offering is a product and the votes seem to be real as my vote was counted immediately.
I still think star ratings (displayed on search results) should be reserved for review sites. If it were a site reviewing DIFFERENT IP Lookup services, I'd understand.
I dunno; I have event sites, and my clients have product sites. We find the reviews and/or ratings helpful, and believe our users will.
I've noticed questionable review data appearing in the SERPs of late too, and I'm frankly rather surprised. You can see an example [plus.google.com] on a Google+ post I wrote a short time ago (and I've since seen similar types of reviews generate rich snippets).
I'm surprised because Google has traditionally been less rather than more trustful, yet - at least when it comes to producing rich snippets - it seems to be processing review content almost on face value.
I trace this back, at least in part, to January 2012, when a post [insidesearch.blogspot.ca] on Inside Search listed this as one of their "30 search quality highlights":
|More rich snippets. We improved our process for detecting sites that qualify for shopping, recipe and review rich snippets. As a result, you should start seeing more sites with rich snippets in search results. |
The interesting point here revolves around the declaration that this has to do with "detecting sites" - that is, review data may not be evaluated on the individual merit of a review's trustworthiness, but on the overall trust in the site. So "trusted" sites are capable of marking up review content with schema.org microdata and being granted rich snippets for their efforts, without other algorithmic oversight.
What I've found especially surprising is that Google now seems to be routinely honoring review markup, even when the provenance of those reviews is problematic. That is, rich snippets are being generated even in situations where there's no discernible path for a user to submit a review at all.
I think this is all related to Google's efforts to promote structured markup, and particularly schema.org microdata (which is what I've almost exclusively seen in the code of sites with "questionable" review rich snippets). A new technology obviously sees the best adoption when there's a demonstrable benefit to using it (witness the rapid ubiquity of "Like" buttons and underlying OpenGraph data, because they obviously improved brands' visibility on Facebook), so Google has been keen to return rich snippets as an incentive for webmasters to mark up their own reviews.
But I think it's clear that Google has been too generous in according sites "trusted for reviews" status, and I think we can expect to see that trust level dialed back - or additional algorithmic methods put in place to account for the quality of the review data itself, rather than only assessing a site's trustworthiness.
"count" specifies the total number of reviews for an item. "votes" specifies the number of people who provided a rating, with or without an accompanying review. A review can specify count or votes, or both. However, whenever you include "count", the page must also contain markup for each reviewed item.
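As a rough illustration of the count/votes distinction described above, markup along these lines would declare aggregate review data. This is only a sketch: the item name and all numbers are invented, and the property names follow the Review-aggregate vocabulary that the quoted documentation describes (count = written reviews, votes = ratings with or without a review).

```html
<!-- Illustrative only: "Widget X" and the figures are made up. -->
<div itemscope itemtype="http://data-vocabulary.org/Review-aggregate">
  <span itemprop="itemreviewed">Widget X</span> -
  rated <span itemprop="rating">4.5</span> out of 5,
  based on <span itemprop="votes">89</span> ratings
  and <span itemprop="count">24</span> written reviews.
</div>
```

Note that nothing in the markup itself proves those 89 ratings ever happened - which is the whole problem this thread is circling around.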
|We improved our process for detecting sites that qualify for shopping, recipe and review rich snippets. As a result, you should start seeing more sites with rich snippets in search results. |
This must be a joke. One of the greatest content thieves and spammers I know is now listed with yellow review stars. Only ... there are no reviews on his website.
just saw another set of stars pop up on a competitor's website in the serps. Over 7,000 "votes" and a perfect 10/10 rating. lol Can't see this lasting long, but I bet it's really giving a CTR boost. Surely it's on Google's radar.
Yet another idiotic idea from Google, I think they must have given a 5-year-old the control over the search engine business...
To be honest, I am shocked the results have not yet turned into "5 star" ratings on every page, lol. Just a matter of time, I guess.
|Yet another idiotic idea from Google, I think they must have given a 5-year-old the control over the search engine business... |
The bad news is that the Google serps are a mess and lower quality than before Panda.
The good news (for Google) is that the general public doesn't realize this yet.