Forum Moderators: Robert Charlton & goodroi
"That's exactly the sort of sites I'm referring to"
Unfortunately, some commenters on this issue apparently can't be bothered to actually, horrors, look at the SERPs. Authority has a specific meaning with Google, and it's plain that authority sites are what are commonly, mistakenly, hit by this penalty. I don't think this is a good summary of the effect, but one simplistic way to look at it would be to say that authority sites with a volume of quality in-links are sometimes being confused with spam sites that have a volume of rotten-quality in-links.
One of the most interesting phenomena is how an authority site can be #1 for red topic, blue topic, green topic and purple topic, but be at 950 for orange topic, even though the linking, page structure and keyword usage are basically the same for all of them. Clearly a ranking mistake is being made (either the 950 result, or all those #1's).
[edited by: tedster at 9:17 pm (utc) on Feb. 27, 2008]
Another important point is that most searches are done using phrases, not single words, and that's what the majority of sites have their rankings for - or don't have rankings for - and this is nothing new.
Usage factors may or may not be used, and if so there's no way it would be an important enough factor to trigger serious penalties or filtering.
Why not? Take a given search phrase, "Big Blue Widgets". Take the top 10 results. If 9 of the pages in the top 10 have a bounce rate of, say, 10%, yet the remaining page has a bounce rate of 60%, then I (as a search engine) would really start to have my doubts about the page with such a high bounce rate.
If I'm trying to build the best search engine I can and the users are not finding what they want on that page then I (as a search engine) would not want to keep showing that page in the top of the results.
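The comparison described above can be sketched in a few lines. This is purely illustrative; the numbers, the median-based rule and the threshold factor are all invented, not anything Google has documented.

```python
# Hypothetical sketch of the heuristic above: flag any result whose bounce
# rate is far above its peers in the same top-10 for a query.

def flag_outliers(bounce_rates, factor=3.0):
    """Return indices of pages whose bounce rate exceeds
    `factor` times the median bounce rate of the group."""
    ranked = sorted(bounce_rates)
    median = ranked[len(ranked) // 2]
    return [i for i, rate in enumerate(bounce_rates) if rate > factor * median]

# Nine pages bounce around 10%, one bounces at 60%.
top10 = [0.10, 0.09, 0.11, 0.10, 0.08, 0.12, 0.10, 0.09, 0.60, 0.11]
print(flag_outliers(top10))  # -> [8], the 60% page stands out
```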
but penalizing pages because they are popular and sticky seems even too weird for them
I'm not suggesting that they are penalizing pages because they are popular. I'm sort of suggesting the opposite... that for whatever reason the pages are ranking well for a certain phrase but based on user data Google finds that people who search for that phrase and then click on that result end up quickly bouncing. To me that would imply that the page might rank well, but isn't popular with the users who end up going there.
Google, just tell us honest people what to do. If you're going to kill us off permanently, just tell us so we can look for other jobs. If you're going to bring us back in a month, tell us that so we'll be able to hold on. If you want us to change something, tell us and we'll do it.
1.29 visits/visitor
7.1 pages/visit
43.7% added to favorite
Penalized sites, what are your stats? I bet we will see a common factor here.
Trinorth,
Nope, don't agree at all. One quality, popular site hit:
1.45 visits/visitor
9.46 pages/visit
60.9% added to favorite
That blows that out of the water!
Also, one could argue that Google wants weaker SERPs with higher-bounce-rate sites, i.e. turn the quality down so that more users return to the search results and perhaps click on a sponsored advert?
Call me cynical, but the current SERPs reflect that, intentional or not!
When I look at the bounce rates for those specific scenarios (rather than bounce rates for the whole site, or even bounce rates for a specific page not tied to a specific query), I begin to see a pattern emerge.
You could look at it the opposite way: if the surfer doesn't find what they're looking for in one page view, but instead has to jump around a site, that could almost be taken as a negative.
I think it goes both ways.
I just got off the phone with a customer (in another state). I googled a two-term search for the top 10 results and asked him to google the same keywords. My website was in the top 10 in my search (as it has been for months) but it did not appear in his top 10. I asked him to set his preferences to show 100 results and redo the search. My website didn't show up in the first 100 results of his search. I then asked him to google my company name, which came up first. Then another keyword that I knew we ranked high on. It came up as expected also. I had no way of finding out the IP of his Google search.
I then called a couple of friends (locally) to repeat the searches. We each came up with wildly differing results for the first 2 word term and slightly different results for the other terms we checked.
People reporting the penalty coming and going may want to try my experiment.
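The comparison step of that experiment can be sketched as below. It assumes you have already collected the ranked URL lists by hand from each searcher (over the phone, as described); nothing here queries Google, and the URLs are placeholders.

```python
# Compare the top results two searchers see for the same query, to spot
# which URLs appear for one person but not the other.

def compare_serps(results_a, results_b):
    """Report URLs in one ranked list but not the other,
    plus the size of the overlap."""
    only_a = [u for u in results_a if u not in results_b]
    only_b = [u for u in results_b if u not in results_a]
    overlap = len(results_a) - len(only_a)
    return only_a, only_b, overlap

mine = ["example.com/widgets", "a.com", "b.com"]
theirs = ["a.com", "c.com", "b.com"]
only_mine, only_theirs, shared = compare_serps(mine, theirs)
print(only_mine)  # results I see that the other searcher does not
```

Wildly different `only_mine`/`only_theirs` lists for the same query would match what this poster observed.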
I think bounce rates and other user data are definite factors, among many, in something like a point system. I think your points affect both your rank and any penalties. If your points go over some line, a penalty seems reasonable, as with an email spam filter. It's like a glass that gets filled and then starts to pour over the edge, triggering a penalty. For one site, bounce rates could put them over the edge, and for someone else, something else does it. In theory, addressing any set of causative issues could get you back below the edge, even if those issues weren't the final straw that broke the camel's back.
If spam is what they are cracking down on (and to me, they are succeeding more than failing at that), then an email spam filter model seems logical to me. Of course, some good sites will get caught in any filter, as with any filter or mechanical process, and that's why they are constantly tweaking the system. To me the snags don't prove it's not a spam filter, it just means it isn't a perfect filter.
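The "glass that overflows" model above can be made concrete with a toy scoring sketch. To be clear, this is not Google's actual system; the signal names, weights and threshold are all invented for illustration.

```python
# Toy point-system model: each signal adds weighted points, and a penalty
# triggers once the total crosses a threshold, whichever signal was the
# final straw.

PENALTY_THRESHOLD = 100

def penalty_score(signals):
    """Sum weighted scores for each signal (values in 0..1)."""
    weights = {"bounce_rate": 50, "link_spam": 60, "keyword_stuffing": 40}
    return sum(weights[name] * value for name, value in signals.items())

site = {"bounce_rate": 0.9, "link_spam": 0.8, "keyword_stuffing": 0.5}
score = penalty_score(site)
print(score, score > PENALTY_THRESHOLD)  # over the line -> penalized
```

Under this model, reducing any sufficiently large signal (not necessarily the last one added) can bring the total back under the threshold, which matches the "any set of causative issues" point above.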
I think tflight's reminder about Matt Cutts comment is a good point:
When MC has talked about sites impacted by this phenomenon, he consistently references making sure your site is useful to visitors. Many of us have taken that as a very vague answer. But perhaps it is more specific than we think.
What we're doing with this idea is to honestly look at our sites and make them as useful and usable as possible. There's always more that can be done along those lines. If it doesn't help with Google, it will pay off in other ways. Luckily Yahoo and MSN still give good ranks for those dropped phrases in Google.
Thanks again. I look forward to continuing to read your comments and will let folks know if anything new happens on our end.
Thanks
Val
My point is to make sure you are checking your rankings across datacenters before making conclusions otherwise it could make things more confusing.
I'd also like to know what people are using to check different datacenters. I've tried a "Google Datacenter Watch Tool" and I have yet to find any datacenters with different results so it hasn't revealed anything to me yet, but I don't look that closely and maybe I'm not using it right.
Thanks!
I found some key phrases that I suspected were the problem. Then I searched my site for those words and checked how the other pages with those phrases were doing in the serps. Sure enough some other pages with the phrases were down as well.
You are right that it's not the phrase itself. In these cases I think it was the number of repetitions of the phrases. I don't completely get rid of these phrases, the page wouldn't even make sense if I did. I just decrease them.
Most of the phrases I suspect are the problem are heavily advertised by MFAs and there are a lot of ads using the phrases.
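The repetition check this poster describes can be sketched as a simple phrase counter, run against each page's text to see which pages lean hardest on a suspect phrase. The phrase and page text here are placeholders.

```python
# Count how often a suspect phrase repeats in a page's text, to decide
# which pages to thin out (without removing the phrase entirely).

def phrase_count(text, phrase):
    """Case-insensitive count of a phrase's occurrences in page text."""
    return text.lower().count(phrase.lower())

page = "Big blue widgets are great. Buy big blue widgets. BIG BLUE WIDGETS!"
print(phrase_count(page, "big blue widgets"))  # -> 3
```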
These visitors convert to customers within the normal range, so there's no problem with bounce rates or stickiness.
Here's the problem - only one two-word phrase ranks in the top 10. All the other terms that used to rank have disappeared; they are below 100 in the rankings.
To me this isn't a 950 penalty. This is slow death.
We're losing 60% of our business. Site hasn't substantially changed, other than new prices and products and the usual updates. Site is over 5 years old. Site has plenty of good backlinks.
Google needs to take some additional responsibility here.
I hope to see my site back on a roll, but since Google is unpredictable I wouldn't expect much.
All i know is that Google does suck sometimes!
This has the effect of pushing down more detailed niche sites with lots more content than Wikipedia about those historical figures.
I've also noticed that for pretty much anything I search on regarding a certain big global conflict that happened about 50-odd years ago, Wikipedia comes in at no. 1 for all the search terms I typed in.
I would guess that the niche sites, full of good content, are getting absolutely killed by this.
If I want Wikipedia, I just go to Wikipedia - I don't mind them turning up in the SERPs, but to turn up at no. 1 for pretty much ALL the stuff I searched on? That's ridiculous.
Looks like a few others are in the same boat. So it seems to me they've added to the number of sites penalized and pushed us up a bit to make room... thanks, I was actually getting a small bit of traffic being on the very last page; this is worse!
If I want Wikipedia, I just go to Wikipedia - I don't mind them turning up in the SERPs, but to turn up at no. 1 for pretty much ALL the stuff I searched on? That's ridiculous.
Yep, Google is telling us their search engine is increasingly irrelevant and we should all go to Wikipedia first and then possibly to Google. Not only that, WP is also parasitizing the system by putting nofollow on its outbound links. The guys and girls in the Plex seem to be really excited about handing their business over to Jimbo's third yacht. :)
I've recently subdivided my topics so I don't have more than 10 linked to each other, and it seems to have helped get some pages back.
We know long lists of links in a page's navigation are not good for usability, but they may not be good for Google rankings either.
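The restructuring described above amounts to splitting one long link list into groups. Here is a minimal sketch; the cap of 10 comes from the post itself (one webmaster's guess, not a documented limit), and the page names are placeholders.

```python
# Split a long list of cross-links into consecutive groups of at most 10,
# e.g. one group per sub-topic page, instead of linking everything together.

def chunk_links(links, max_per_group=10):
    """Split `links` into consecutive groups no larger than max_per_group."""
    return [links[i:i + max_per_group]
            for i in range(0, len(links), max_per_group)]

pages = [f"topic-{n}.html" for n in range(23)]
groups = chunk_links(pages)
print([len(g) for g in groups])  # -> [10, 10, 3]
```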
If I want Wikipedia, I just go to Wikipedia
I've noticed for a long time that large sites like "wikipedia" and "about" are able to get near or at the top of just about any search they have articles on. There is something about these huge sites that gives them an advantage, but I think it's just a result of the overall algo.