
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 5
steveb

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 11:44 pm on Feb 12, 2007 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

"That's exactly the sort of sites I'm referring to"

Unfortunately, some commenters on this issue apparently can't be bothered to actually, horrors, look at the serps. Authority has a specific meaning with Google, and it's plain that authority sites are what are commonly, mistakenly hit by this penalty. I don't think this is a good summary of the effect, but one simplistic way to look at it would be to say that authority sites with a volume of quality in-links are sometimes being confused with spam sites with a volume of rotten-quality in-links.

One of the most interesting phenomena is how an authority site can be #1 for red topic, blue topic, green topic and purple topic, but be 950 for orange topic, even though the linking, page structure and keyword usage is basically the same for all of them. Clearly a ranking mistake is being made (either the 950 result, or all those #1's).

[edited by: tedster at 9:17 pm (utc) on Feb. 27, 2008]

 

steveb

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 12:21 am on Feb 13, 2007 (gmt 0)

"Had one directory's worth come back today, but most unaffected. No changes made to the one's that cam back, but they did get fresh tags. Previously they died one time before after the fresh tag expired."

And poof, gone the next day.

jk3210

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3250836 posted 1:01 am on Feb 13, 2007 (gmt 0)

One of the most interesting phenomena is how an authority site can be #1 for red topic, blue topic, green topic and purple topic, but be 950 for orange topic, even though the linking, page structure and keyword usage is basically the same for all of them.

Precisely the situation I'm seeing. Very strange.

And poof, gone the next day.

Have you seen the same site repeat the top rank=>950=>top rank cycle more than once?

annej

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 1:02 am on Feb 13, 2007 (gmt 0)

Anyone think redundancies on sites could be playing a part?

Do you mean redundancy in terms of how the navigation is set up on each page or do you mean something more?

authority sites with a volume of quality in-links are sometimes being confused with spam sites with a volume of rotten-quality in-links

Plus authority sites are bombarded with inbound links from scraper sites.

tflight

5+ Year Member



 
Msg#: 3250836 posted 1:12 am on Feb 13, 2007 (gmt 0)

Have you seen the same site repeat the top rank=>950=>top rank cycle more than once?

I've seen pages on one site make six full cycles (drop, recover, drop, recover, drop, recover, drop, recover, drop, recover, and one final drop) since late June. When pages recover they are in the top 10, normally top 5. When the site drops I rarely check to see just how far, sometimes it is down around 150 while other times it is on the last page (950+).

AndyA

5+ Year Member



 
Msg#: 3250836 posted 1:22 am on Feb 13, 2007 (gmt 0)

So now suddenly it's a bad thing to have an authority site with on topic incoming anchor links?

And perish the thought that we actually *GASP* title the page to describe what it is about, which is also coincidentally the same text incoming links have, incoming links which we very often HAVE NO CONTROL OVER?

And let's not optimize the page meta description or H1 using the same text, because then it becomes spam.

This just sounds ridiculous to me.

And the scraper sites are Google's problem. Google needs to find a way to fix it so they have no effect at all on the sites they are scraping. They just need to make the scrapers disappear as soon as they find them, and the scraper sites will cease to exist because they won't be worthwhile for the thieves who own them.

If you have a page about Blue Widgets, it's not too much to anticipate that the term "Blue Widgets" is actually going to appear on the page a few times.

Biggus_D

5+ Year Member



 
Msg#: 3250836 posted 2:07 am on Feb 13, 2007 (gmt 0)

Right now we are on the second round

top rank(months)=>950(3days)=>top rank(2weeks)=>950(+4days)=>?

steveb

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 2:48 am on Feb 13, 2007 (gmt 0)

"Have you seen the same site repeat the top rank=>950=>top rank cycle more than once"

It's pages, not a site. But yes, many times. Some pages maybe seven or eight times. Others four or five times. No page I know of has been penalized every single day since the September 2005 first sighting of the phenomenon.

jk3210

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3250836 posted 2:49 am on Feb 13, 2007 (gmt 0)

I've seen pages on one site make six full cycles

Was there any appreciable variance in either the top or bottom positions with each cycle?

I'm just wondering if it's possible they are testing a routine that is making small refinements to the index with each pass through the data, since one of those papers referred to the monumental task that certain types of Phrase-Based Reranking would be.

annej

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 3:01 am on Feb 13, 2007 (gmt 0)

And perish the thought that we actually *GASP* title the page to describe what it is about

My only hope is that Google will figure out the damage they are doing to their serps and look at some alternative methods. Of course the real villains are the spam sites. I suppose someday they may just take over the Internet and it will be all gibberish.

tflight

5+ Year Member



 
Msg#: 3250836 posted 3:15 am on Feb 13, 2007 (gmt 0)

Was there any appreciable variance in either the top or bottom positions with each cycle?

To be honest, I'm not sure. At first I thought pages were only falling back to positions somewhere around 100-200. But then I realized that the keyword tracking system didn't account for the fact that it was actually a different page on the site that was being returned in that position.

In other words take one page about Super Widgets. It used to rank in the top five. Then the keyword tracker would report my site ranked 175 for that phrase, however it was actually for a different page on my site that happened to mention Super Widgets. The page which is dedicated to Super Widgets was actually much, much further down in the results.

I can say that there wasn't much variation at the top. I don't know about the bottom though.... as far as G referral traffic goes there isn't much difference between being ranked 50 and being ranked 950.

vikram lashkari

10+ Year Member



 
Msg#: 3250836 posted 9:03 am on Feb 13, 2007 (gmt 0)

Right now we are on the second round
top rank(months)=>950(3days)=>top rank(2weeks)=>950(+4days)=>?

One of our clients is facing the same situation, and he is in the top-rank phase of the cycle. He just recovered yesterday; I hope it lasts forever now. He cannot take such inconsistency any more.

thanks
lashkari

Marcia

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 9:50 am on Feb 13, 2007 (gmt 0)

And the scraper sites are Google's problem. Google needs to find a way to fix it so they have no effect at all on the sites they are scraping. They just need to make the scrapers disappear as soon as they find them, and the scraper sites will cease to exist because they won't be worthwhile for the thieves who own them.

If Google would stop PAYING people for putting up scraper sites, there wouldn't be any more motivation, so people might stop - except the ones who get swindled into it.

[edited by: Marcia at 9:52 am (utc) on Feb. 13, 2007]

dmaniatis

5+ Year Member



 
Msg#: 3250836 posted 10:43 am on Feb 13, 2007 (gmt 0)

We were hit with this penalty about a month ago and I had then posted a message in the original 950 penalty thread ( [webmasterworld.com...] ) saying how we were probably an easy target for this penalty.

Since then we have done three things:
- Removed duplicate text (but not the affiliate links).
- Added unique content that is not keyword dense.
- Slightly de-optimized keyword density on title, headers and internal links.

In the past few days we are starting to see a gain in the SERPs and now rank at around 20+ for the pages that have the new structure and have been crawled by Google. (Originally we were in the top 10.)

HTH

nuthin

10+ Year Member



 
Msg#: 3250836 posted 2:49 am on Feb 14, 2007 (gmt 0)

The filtering is so random. I wonder what would be triggering it. I mean for instance I type in "keyword phrase location here" and I'm not there. However in one instance for a page if you type in "location here keyword phrase" it's #1.

If it's an anchor text thing you would think it would only affect the index page, where a lot of the anchor text links point for a lot of sites, and not the inner pages where the content is.

Just removed the anchor text on a few links that are pointing to my site that is affected; we'll see how it goes anyway.

zafile



 
Msg#: 3250836 posted 4:19 am on Feb 14, 2007 (gmt 0)

"Just removed the anchor text on a few links that are pointing to my site that is effected, see how it goes anyway."

Good luck.

However, IMHO changes applied to a Web site don't have an immediate effect.

Google should be making it quite difficult for Webmasters to notice which changes have a direct impact on the ranking. Otherwise, it'll be pretty easy to figure out how to game Google.

nuthin

10+ Year Member



 
Msg#: 3250836 posted 4:33 am on Feb 14, 2007 (gmt 0)

"However, IMHO I think changes applied to a Web site don't have an immediate effect."

That's ok. I'm not going anywhere. :-)
This particular site got filtered out once before, so I removed one particular site-wide anchor text link and it came back in a few days. Coincidence... maybe?

Who's to say what has an immediate effect or not; highly crawled sites always have a better chance of a quick turnaround on tests that you decide to conduct.

annej

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 6:18 am on Feb 14, 2007 (gmt 0)

It seems like the changes I've made took effect as soon as the page was spidered. That is, the ones where the changes worked.

Pamela2

5+ Year Member



 
Msg#: 3250836 posted 1:47 pm on Feb 14, 2007 (gmt 0)

You know, I wonder how many people are like me. This happened to our site back in November. I searched the net to see why Google was broken and found WebmasterWorld. Luckily we have no financial imperative, so I kinda shrugged my shoulders and just banned Googlebot with robots.txt to stop them consuming so much bandwidth for no return. I'd bet money that there are plenty of others.
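
For anyone in the same position, a minimal robots.txt along the lines Pamela2 describes (blocking Googlebot, and only Googlebot, from the whole site) looks like this:

User-agent: Googlebot
Disallow: /

Other crawlers keep their normal access unless they get their own User-agent blocks, and Google may still show blocked URLs it discovers through links; it just won't fetch the pages.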

I keep coming back because it is fascinating in its own peculiar way. It's a bit like watching the Google alternative dream morphing into the big bad Microsoft machine. They have turned into one big ugly robot that really doesn't care what it hits so long as it makes more and more money. If little guys with really cool sites are being battered, it is just collateral damage. I guess it's disappointing.

"If Google would stop PAYING people for putting up scraper sites.."

And that too. They could do that, but how motivated are they to do it? They don't seem to be at all.

[edited by: Pamela2 at 1:48 pm (utc) on Feb. 14, 2007]

seo_joe

5+ Year Member



 
Msg#: 3250836 posted 5:07 pm on Feb 14, 2007 (gmt 0)

Pamela, you can reduce the crawl rate of googlebot in google webmaster tools if it is consuming too much of your resources. That way, you at least have a chance of your rankings returning sometime.

annej

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 11:29 pm on Feb 14, 2007 (gmt 0)

I now have a couple of incidents where the once missing page is now showing up in the top 10 on the extended cluster. Of course people rarely use the 'more results' option so it doesn't help people find the pages.

BUT it tells me the pages are no longer way down at 900 something. Problem is that I don't know if I still need to reduce possible problem phrases, or whether I de-optimized the page so much that it only ranks in that extended cluster.

Biggus_D

5+ Year Member



 
Msg#: 3250836 posted 11:37 pm on Feb 14, 2007 (gmt 0)

Searching for more info, I've read that in 2005 and 2006 a company that runs blogs (not especially famous) was hit twice, each time after reaching a record in visits, losing 90% of its visits from Google.

The first time the penalty (?) lasted 3 months (on 1 blog) and the second time (on another blog) one month.

They didn't change anything.

[edited by: Biggus_D at 11:40 pm (utc) on Feb. 14, 2007]

Marcia

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 11:54 pm on Feb 14, 2007 (gmt 0)

annej, I'd leave it alone instead of tinkering too much. It's like trying to hit a moving target.

nuthin

10+ Year Member



 
Msg#: 3250836 posted 1:32 am on Feb 15, 2007 (gmt 0)

I agree... I know it's probably very tempting to undertake some pretty drastic changes. I just hope a lot of you here are able to absorb the downtime in revenue that your site(s) might be experiencing... whilst Google gets its act together. :-)
Until then... just go about your daily routine as normal, like Google doesn't even exist. I know, hard isn't it.

VNelson

5+ Year Member



 
Msg#: 3250836 posted 1:55 am on Feb 15, 2007 (gmt 0)

Hi everyone,
I have spent a couple days reading through all this and would like to describe our situation and my theory and get your feedback. Thanks in advance for reading.

We have been hit with a phrase-related drop in Google since Feb. 1. For clarity, the same pages that have been dropped from one keyphrase serp (blue widgets) are easy to find if you make the search more specific (dark blue widgets). It seems to have affected only a few major keyphrase searches but not all our high ones. I think the drops are correlated with higher traffic keyphrases - the more referrals for that keyphrase, the more likely a drop.

First of all, I assume that there is a *phrase-based* point system in effect (similar to email spam filters) so that no small group of issues could explain anything in the SERPs (and yet people talk on the forums as if it could be one or two things). For instance, you do A and you gain X points, and you do B and you lose a few points; and at the same time, inbound links are doing C and you gain or lose points (e.g., lost wiki links), and so on.
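
Purely to illustrate the shape of the point system being hypothesized here, a toy sketch might look like the following. Every signal name and weight below is invented for illustration; none of them are known Google factors.

# Toy sketch of a phrase-based "point system" as hypothesized above.
# Signal names and weights are made up for illustration only.
PHRASE_SIGNAL_WEIGHTS = {
    "anchor_text_match": 3.0,      # inbound links using the exact phrase
    "title_match": 2.0,            # phrase appears in the page title
    "quality_inlink": 1.5,         # per trusted inbound link
    "scraper_inlink": -0.5,        # per scraper/low-quality inbound link
    "overoptimized_onpage": -4.0,  # keyword-stuffing style flag
    "high_bounce_rate": -5.0,      # hypothetical user-behaviour signal
}

PENALTY_THRESHOLD = 0.0  # below this, the page is demoted for that phrase

def phrase_score(signals):
    """Sum weighted signal counts into one per-phrase score."""
    return sum(PHRASE_SIGNAL_WEIGHTS[name] * count
               for name, count in signals.items()
               if name in PHRASE_SIGNAL_WEIGHTS)

def is_demoted(signals):
    """The page drops for the phrase when its net score goes negative."""
    return phrase_score(signals) < PENALTY_THRESHOLD

# Example: good on-page signals outweighed by a pile of scraper links
page = {"anchor_text_match": 1, "title_match": 1,
        "quality_inlink": 4, "scraper_inlink": 20, "high_bounce_rate": 1}
print(phrase_score(page), is_demoted(page))   # -4.0 True

The only point of the toy is that many small positives can be outweighed by a flood of negative off-page signals for one phrase, which matches the "authority site confused with spam" pattern described earlier in the thread.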

As one major factor in your keyphrase score, I think Google is using more and more user data to decide what people like for certain search phrases. For instance, the more people bounce back quickly to try the next result, the more points you lose. Bounce rate and related issues are mentioned in the Google patent. I think user data is collected via the Google Toolbar, in addition to Google SERP reactions. It says right in the Toolbar info that they will use people's browsing statistics to help improve their system, unless you click or unclick some feature. So, that said, if people with the Toolbar are going to your site, and the majority of them are not easily finding what they were looking for, your score will eventually go down for that phrase.

The user data factor helps explain why the higher traffic phrases are more affected. The more people that visit it and don't find something, the faster your keyphrase score will decline. The #1 spot would be the most susceptible to this because people click on it mindlessly. Just because you think it's useful doesn't mean the average searcher using a certain keyphrase will find it useful. Maybe their reading level is different or the site isn't user-friendly to them, etc.

When I compared our pages with the new top 10 serps, I could see why those sites might be more satisfying and more user-friendly for a lot of users. We are revising our layout, and this may or may not be related, but a few of our lost keyphrases started recovering, about a week after the changes. We still have some new layout improvements being rolled out shortly.

In sum, I think it is multiple factors working together, and I think user data input is probably one of the biggest and fastest-growing factors in the Google serps.

I would appreciate hearing any feedback or where you think I might have a flaw in my theory.

Thanks
Val

kevsh

5+ Year Member



 
Msg#: 3250836 posted 2:38 am on Feb 15, 2007 (gmt 0)

Val, nice theory but it doesn't come close to explaining why all the spammy .edu pages are still riding the top of the SERPs nor why a page would drop to the very end of the SERPs.

I'm absolutely certain in at least some cases (including my own) there is no possible way the bounce rate could be so high as to warrant being penalized so badly. Look at some examples or browse some results yourself and you'll see a few sites at the back that simply defy logic if your theory is true.

Of course, I think it's fair to say that bounce rates could be (are?) a part of the overall score but not to the degree you are suggesting.

annej

WebmasterWorld Senior Member annej is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3250836 posted 4:32 am on Feb 15, 2007 (gmt 0)

The #1 spot would be the most susceptible to this because people click on it mindlessly.

I've thought of this. The higher a page is in the serps the more likely it will get casual surfers who are just wandering about glancing at this and that.

Your regulars will stay longer along with those who are referred from a related site, through a mail list or forum or through a friend.

Of course some who are really interested in your topic will find the site or page through search engines but others will just surf on.

soapystar

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3250836 posted 9:35 am on Feb 15, 2007 (gmt 0)

Except you can bring a page back just by having someone link to it, which does not fit with this having anything to do with click rate....

ubuntuguy

5+ Year Member



 
Msg#: 3250836 posted 10:19 am on Feb 15, 2007 (gmt 0)

I'm seeing huge differences between the results of Google Latvia (google.lv) and Google UK

google uk = new serps?

tflight

5+ Year Member



 
Msg#: 3250836 posted 3:10 pm on Feb 15, 2007 (gmt 0)

Val, I think your theory is a step in the right direction in figuring out what is going on. Perhaps toolbar data isn't even being used, but rather raw data collected from the SERPs.

Here is why I think you are headed in the right direction...

Another symptom seen is that the rank of pages for a particular phrase can go way to the bottom one week, then climb back up the second week after a "data refresh". So what would cause the yo-yo?

Think about it like this. Let's pretend for a minute that the bounce rate from the SERPS is a much larger part of the score than it used to be. The bounce rate is something that would need to be continually counted and evaluated over time.

So let's say I have a page about blue widgets that ranks in the top 10. For whatever reason users were not highly impressed with the page and over the course of a couple of weeks the page got a high bounce rate. Google sends the page way back in the results.

Another week passes. Further back in the SERPs the page wouldn't accumulate a lot of bounces, since not many people will go back that far in the SERPs. Without many more bounces being recorded for that page, it seemingly recovers without the webmaster having made any changes to the site.
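
As a purely illustrative toy (every constant below is made up, and nothing here is a known Google formula), that yo-yo can be simulated: a page only collects meaningful bounce data while it ranks high, gets demoted once the accumulated signal crosses a threshold, and then "recovers" at a later refresh simply because buried pages collect almost no new data.

# Toy simulation of the bounce-rate yo-yo hypothesis; all numbers invented.
HIGH_RANK_CLICKS = 1000   # SERP clicks per refresh period while in the top 10
BURIED_CLICKS = 5         # SERP clicks per refresh period while at ~950
BOUNCE_RATE = 0.8         # fraction of visitors who bounce straight back
PENALTY_THRESHOLD = 500   # accumulated bounce signal that triggers demotion
DECAY = 0.5               # older bounce data decays at each refresh

def simulate(refreshes=10):
    bounces = 0.0
    penalized = False
    for n in range(1, refreshes + 1):
        clicks = BURIED_CLICKS if penalized else HIGH_RANK_CLICKS
        bounces = bounces * DECAY + clicks * BOUNCE_RATE
        penalized = bounces > PENALTY_THRESHOLD
        rank = "~950" if penalized else "top 10"
        print(f"refresh {n}: bounce signal = {bounces:6.0f} -> next rank {rank}")

simulate()

With these made-up numbers the page flips between the top 10 and the back of the results every refresh or two without anyone touching the site, which is the pattern people in this thread keep reporting.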

This makes sense for a few other reasons as well. When MC has talked about sites impacted by this phenomenon he consistently references making sure your site is useful to visitors. Many of us have taken that as a very vague answer. But perhaps it is more specific than we think. How could Google measure how useful a visitor might think a site is? By measuring bounce rate.

Now the first argument against this theory is that Google wouldn't use something like this because it would be too easy for users to manipulate... just search for your own pages and click on them in the results without going back to the original query to click on something else.

My response to that argument is that Google already has very sophisticated algorithms to detect "click fraud" from AdSense/AdWords. It would be easy for them to apply this knowledge (multiple clicks from the same IP, too many clicks too soon from different IPs, etc) to the SERPs.
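
A rough sketch of the sort of filtering being alluded to here, with completely made-up thresholds (real click-fraud detection is far more involved), would simply discard implausible result clicks before they count toward any bounce or popularity signal:

from collections import Counter

# Toy filter for suspicious SERP clicks on one query/result pair.
# Thresholds are invented for illustration only.
MAX_CLICKS_PER_IP = 3        # repeats from one IP beyond this are ignored
MAX_CLICKS_PER_MINUTE = 30   # bursts beyond this rate are ignored

def filter_clicks(clicks):
    """clicks: list of {'ip': str, 'ts': float (seconds)}.
    Returns only the clicks that pass the plausibility checks."""
    per_ip = Counter()
    kept = []
    ordered = sorted(clicks, key=lambda c: c["ts"])
    for i, click in enumerate(ordered):
        per_ip[click["ip"]] += 1
        if per_ip[click["ip"]] > MAX_CLICKS_PER_IP:
            continue  # same IP clicking over and over
        recent = [c for c in ordered[:i] if click["ts"] - c["ts"] <= 60]
        if len(recent) >= MAX_CLICKS_PER_MINUTE:
            continue  # too many clicks packed into one minute
        kept.append(click)
    return kept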

Something else that supports this theory is that some people have had success taking the content of a particular page, putting it up under a different URI, and then the page seemingly recovers. In this case Google would have no historical "bounce" data for the new URI, so it would seemingly avoid the penalty... perhaps until new bounce data is gathered.

I think it is also worth looking at this from a data storage/retrieval/scoring view. These "data refreshes" only used to happen every few months. Then last summer they started to happen about once per month. During the fall they moved to about weekly, and then more recently MC says they happen almost daily.

So whatever factor is causing pages to take swings is something being calculated as part of the data refresh. Therefore whatever type of data we are trying to identify as causing the problem is something that can only be calculated over time... It doesn't seem like something that can be calculated "on the fly".

Thus, I think the answer to these data refreshes is to look at factors that would likely be something that they would need to calculate over time and not something that could be calculated "on the fly". Things like H1 tags, title tags, keyword density, etc could probably be looked at on the fly from the cache. Things like bounce rate, the number of recent new backlinks found for that page, etc are likely aspects that would need to be calculated behind the scenes and fed to other parts of the algo as the data is calculated.

Let's face it, many of us have looked at dozens of other sites impacted by this and we've found very few similarities between the sites. Therefore on-page factors seem unlikely. I think the answer lies in off-page data Google is calculating.

This can also start to explain why some pages on a site with the same template are impacted, but other pages are left alone... For whatever reason the user data Google is collecting is different for those pages.

And this could even explain why it is most often the more popular pages/phrases that get sent to the back of the results... More user data for Google to make an evaluation from.

I could be completely wrong on all of this.... but so far it is the only theory I've seen that makes sense for the majority of sites I've seen impacted by it.
