This 201 message thread spans 7 pages.
|Google's 950 Penalty|
What do we know about it, and how do we get out of it?
| 2:05 pm on Jan 11, 2007 (gmt 0)|
I've read a lot about Google's -30 penalty, where pages on a site drop 30 positions, but most of the discussion of the 950 penalty, where pages on a site drop to the very bottom or last page of the search results, has been scattered across other topic threads.
What do we really know about this penalty, what causes it, and most important of all, how do we fix our sites to restore normal rankings?
| 6:06 pm on Jan 16, 2007 (gmt 0)|
|I used one of the keyword density tools available through a search at our friend Google, and verified those same keywords/keyphrases. I then reduced keyword density by 20 to 30% for those search terms, and voila - back to normal ranking in SERP across all data centers. Took a little over 24 hours. |
That's not a very controlled experiment. Was the cache date on the page with the reduced keywords new in Google? If not, then Google was most likely still using your older page anyway. It's doubtful that effect could be fixed in 24 hours by reducing on-page text and SEO. Anyone else?
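For what it's worth, the "keyword density" number those tools report is usually just occurrences of a term divided by total word count. A minimal sketch (the sample page text is a made-up example, not from any poster's site):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of a word or phrase divided by the total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window of len(phrase) over the token list and count exact matches
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return hits / len(words) if words else 0.0

page = "Cheap widgets here. Our widgets are the best widgets for widget fans."
print(round(keyword_density(page, "widgets"), 3))  # 3 of 12 words -> 0.25
```

Note that different tools tokenize differently (stemming, stop words, counting title/alt text), which is one reason two density checkers rarely agree on a figure.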
| 6:13 pm on Jan 16, 2007 (gmt 0)|
I am seeing a trend on the sites I work with where a page ranking for a keyword in Google is demoted, 950-style, right after that page receives higher-PageRank links with that exact anchor text.
The same page, which has not had any noticeable recent links for its other keyword, still ranks high for that other keyword. Note that the links for this particular site are all one-way, and the links that came to us were given without our knowledge. I now see this on more than one website.
Randle, did you experience this? Anyone else see a trend?
| 6:29 pm on Jan 16, 2007 (gmt 0)|
"Very controlled experiment" or not, it worked, so I'm happy. No, I don't allow pages to be cached by SEs (at least not the way you mean), but I did remove one word from the site description, and the new description displays in the SERPs.
I also removed 301 redirects of defunct pages that were now pointing to home page.
| 9:27 pm on Jan 16, 2007 (gmt 0)|
We have not been able to identify any definite trends, on page or off. Our sites that got hit range quite a bit in terms of how much optimization they have. In addition, we have sites very similar to these that did not get affected.
But the first thing that really weighs on your mind at times like these is an OOP (over-optimization penalty), and that's what we're thinking about now.
| 9:48 pm on Jan 16, 2007 (gmt 0)|
I am seeing highly optimized directory sites with a cat > subcat > listing-detail format getting this last-page penalty. Some have AdSense ads, some do not. Many are authoritative sites; for example, regional listings of a leading law firm directory are relegated to the last page, along with me. It definitely seems to be repetition of keywords in the page title and in headlines on the page. The sites affected seem to be optimized, easy-to-navigate, frankly good sites, and they have been replaced by less well optimized, crappier sites at the top of the listings. I am stumped by this one - could it be a matter of a couple of weeks or so to come back stronger? I was hit by something similar a few years back; it lasted a month and then came back. I do not feel I am over-optimized, but rather using good solid hierarchical navigation with keyword-rich linking. Isn't that how it was supposed to be?
| 9:53 pm on Jan 16, 2007 (gmt 0)|
I agree with PhattusCattus. We have a series of sites with a homes > state > city > listing structure and all are getting hit badly, often appearing a few from the bottom. I don't think it's keyword density, so maybe it's some sort of repetition of 'widgets in california' > 'widgets in orange county', etc. However, this seems to go against logical structuring.
| 10:37 pm on Jan 16, 2007 (gmt 0)|
I was suffering from the same type of penalty where my site would appear near the end of the SERPs for certain keywords that we were ranked very high for but ONLY ON SOME DATACENTERS. On other datacenters we still had all the same high rankings (anyone else check rankings across datacenters?). Today the rankings are the same across datacenters and we are pretty much back to the good results as before - some better, some a few positions worse. I didn't make any changes to the site so it may have been a coincidence that keyplyr experienced his/her rankings come back today after the changes to his site.
| 11:44 pm on Jan 16, 2007 (gmt 0)|
Anyone think they got hit with this penalty after uploading a sitemap to google for the first time?
On the day that we dropped 950 spots, all of a sudden webmaster tools was telling us that the Google robot could not find certain weird pages on our server (that, frankly, we never in any capacity claimed existed).
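For anyone who hasn't used them: the Sitemaps being blamed here are just XML files listing your URLs, submitted through webmaster tools. A minimal sketch (the domain and date are placeholders, not from anyone's site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/widgets/</loc>
    <lastmod>2007-01-15</lastmod>
  </url>
</urlset>
```

A sitemap only tells Google which URLs exist; it shouldn't by itself cause a penalty, though listing URLs that 404 or that duplicate each other could surface crawl errors like the ones described above.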
| 12:18 am on Jan 17, 2007 (gmt 0)|
Seems to be across all datacenters. I had 80,000-plus pages in index, then submitted sitemap in mid-November and almost immediately dropped to 22,000.
SEOJoe, when did you get penalty, and how long did it take to recover? My penalty seems to have kicked in starting Sunday the 15th.
[edited by: PhattusCattus at 12:20 am (utc) on Jan. 17, 2007]
| 1:56 am on Jan 17, 2007 (gmt 0)|
Ours kicked in between the 15th and the 16th. Which was about two weeks after sitemap was submitted, but 1 day or so before webmaster tools informed us about missing pages (which we never had on our server to begin with). Day after sitemap was submitted, listings in the index went to half. Then 15th or 16th, penalty. Main term (previously page 1) initially on page 10, this evening, on page 6.
And we really don't do any intentional SEO. But several friends added links to us recently.
Now can't even find any of the tons of news articles about us in news archive unless we specifically do NOT use our domain name in the query.
Seems like this was some sort of blocking action.
| 4:51 am on Jan 17, 2007 (gmt 0)|
OK, now we've got a real problem.
Did you look at all these:
and this post from Matt
Things are not looking that straightforward. Build a complex structure to try to fool a search engine and you will go straight to h***.
As this is all automated stuff, I guess once we remove the unwanted stuff we may go back to where we were. Till then, we're cooked.
| 5:25 am on Jan 17, 2007 (gmt 0)|
Yeah, well, right now you can put a fork in me, but shouldn't PR and links from .gov, .edu and other trusted sites show that my site isn't spam? I am tweaking, yes, but I'm also holding out hope that the penalty will be lifted sooner rather than later ;-)
| 5:58 am on Jan 17, 2007 (gmt 0)|
So if these things are followed strictly, then we will have this effect:
"Call a SEO or do SEO to get out of business or at least from Big G"
Now there is nothing white hat or black hat: apart from a subject-centric title, everything else - H1s, alt tags - is junk, or of no use at all. If you do something interesting, then the big G will take a deeper interest in you, and he will put you in a deeper position where he can see you but nobody else can.
Enjoy our good karma
| 6:14 am on Jan 17, 2007 (gmt 0)|
Your posting doesn't make much sense.
| 8:37 am on Jan 17, 2007 (gmt 0)|
Webspam = (sum of pages on an SEO'd website) - (sum of pages on a website of the same topic made by someone who has never heard of SEO). It doesn't take too much effort to find a neutral subject for a major keyword (where dirty tricks are not played) and look. Just the fact that people go on about total keyword density on a page shows maybe they are looking from inside the square.
Perhaps being an SEO (as was) is now the kiss of death :-)
| 1:21 pm on Jan 17, 2007 (gmt 0)|
Ride45, please study the PDF, then analyze your site; then you will be able to reach the same conclusion I have posted here.
| 3:33 pm on Jan 17, 2007 (gmt 0)|
Our series of sites is also suffering from this problem. I have to put in a vote of agreement for the fact that Google seem to be targeting this at very retail heavy areas.
My site and top few competitors have dropped out for our 2, 3, 4th keywords say "red widget" "blue widget" and "widget sale" however our main keyword "best widgets" dropped off and returned next day. The number 1 keyword is in our URL but I don't think this is related.
I also investigated datacentres and have found 9 times out of 10 we are out of them all. Every now and again however we will hold our position in the "search in the uk" results (we are a UK based company).
I am going to give it time before I try to make changes; I don't want to aggravate the problem.
| 4:20 pm on Jan 17, 2007 (gmt 0)|
|Our series of sites is also suffering from this problem. I have to put in a vote of agreement for the fact that Google seem to be targeting this at very retail heavy areas. |
This is absolutely not what I'm seeing. I have pages with NO ads, NO retail, NO ADSENSE, absolutely NOTHING except unique, original content, and they are at the bottom of the results. Many don't even have H1 tags, and have a very low percentage of keyword duplication (under 2%), so there's no over-optimization on them, either.
There's more to it than just heavy retail, although that could be a factor for some sites, I suppose, because this does not apply to my site at all.
| 4:26 pm on Jan 17, 2007 (gmt 0)|
Agreed, Andy - affecting not just retail but heavy search terms across the board. They definitely seem to be template-generated, dynamic sites with each individual listing on a separate page, with optimized page titles and keyword-rich anchor text in intersite linking.
| 5:05 pm on Jan 17, 2007 (gmt 0)|
We appear to have bounced back from this penalty after making significant changes to the site. The bounce back was automatic and within the last 24 hours. I don't know if we match the profile for the 950 penalty per se, but we did have this kind of ranking across the board after top rankings for competitive terms for 3+ years, and the profile of our site looked similar to what you describe. We have since:
* Removed templating where possible, at least the boilerplate stuff Adam Lasnik referred to in a recent posting on duplicate content
* Removed mechanisms that automate or generate category-specific pages, including paging (page1=, page2=), and temporarily reverted back to a static HTML site to test the theory.
* Ensured there was no "over-categorization", just a collection of top categories. Ensured that each category page had unique content, unique meta descriptions and unique titles.
* Ensured keyword density for anchor text was moderate and not over-optimized either
* Gave more prominence via interlinking to content-rich, unique articles within our site.
* Cleaned up .htaccess and 301-redirected old pages to new pages; removed from the server all unnecessary category pages that were likely unneeded but still showing up in the main index. Ensured a 404 status was served for all the removed pages, and blocked those pages using robots.txt.
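For reference, the redirect-and-remove step in that last bullet might look something like this in an Apache .htaccess (the paths here are hypothetical examples, not from the post):

```apache
# Old pages that have a real replacement: permanent (301) redirect
Redirect permanent /old-category/widgets.html /widgets/

# Pages removed outright: serve 410 Gone instead of redirecting to home
Redirect gone /defunct-listing.html
```

Removed paths can also get a `Disallow:` line in robots.txt, though note one wrinkle: a robots.txt block stops Googlebot from recrawling a URL and seeing its 404/410 status, so many people choose one mechanism or the other rather than both.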
While we had some hesitation about making drastic changes too soon, we are starting to see positive results already and rankings are starting to return to normal. Perhaps our site had more issues than other sites, and our combination of issues is different from another site's, but perhaps there is something in our list of things done that others can take away for their own sites.
| 5:48 pm on Jan 17, 2007 (gmt 0)|
I have been reading this thread, as I was hit with this loss of SERPs.
Mine was about 80% of my keywords; I was worried that it was from overly aggressive link building.
My website is new (4 months old) on an old domain (5.5 years).
The only external links are to the web guy, and to one of my sites (related).
I have a commercial site with no ads and a separate page for every state, and then to the counties.
100% original content
duplicate content on the county pages
Anyway, I did not touch the site and it is back today, so it was only for a brief period.
But i did pray to the google gods.
| 6:23 pm on Jan 17, 2007 (gmt 0)|
I am either suffering from this penalty or something similar.
All my new pages since August 2006 appear to have been hit. When each page is first included in the serps I receive what I would consider to be a normal ranking. Then after a period of days or slightly longer the page will bomb. For example from page 2 to page 24. Sometimes I have two pages on a similar subject. This could be a review and a video guide. Both pages will bomb but one page could end up in say position 70 and the other in position 240.
Pages before August 2006 appear to be unaffected. I haven't checked them all as there are too many. There is nothing really different about the pages I have written before or after August 2006.
Before Christmas I requested reinclusion. This appeared to have been granted as all my rankings returned. A week later they all bombed again.
I consider the site to be white hat. I pay no real attention to keyword density, but I don't think it is an issue. I just write the pages and do nothing special in terms of optimisation.
I have read various guidelines and cleaned up a few minor points, but to be honest I cannot find much to clean up.
My latest page to bomb was cached by Google on 6th January, but didn't bomb for another 9 days. Not sure what this means.
The site has been online since 1999. It has thousands of pages - all original content. It has price comparisons on product pages but little else in the way of banners or other advertising. I run an RSS feed, but other than that do nothing in terms of linking.
Absolutely no idea what is causing the problem. Didn't even notice until late November as I never paid much attention to rankings before now!
| 6:38 pm on Jan 17, 2007 (gmt 0)|
Nice link to the paper, I had a read.
"Call a SEO or do SEO to get out of business or at least from Big G"
I don't think it means this. In fact, it means that SEOs who spend even more time researching, being careful, finding more natural linking, and who are willing to go that last mile will get more work. This is of course only true if this paper and its suggestions are applied correctly by Google, something that has not been the case in the last series of updates I can remember.
The face of SEO is changing, and what we do changes daily. Those that keep on top of information like this and wisely mull over its usefulness and message will no doubt be here tomorrow...
| 7:11 pm on Jan 17, 2007 (gmt 0)|
very useful post, andy56, when did you get hit? Was it the 15th?
| 7:25 pm on Jan 17, 2007 (gmt 0)|
Are you saying that after fixing all that stuff your rankings came back to their original positions? I don't expect rankings for all keywords to return to normal; it will take time for Googlebot to confirm that things got fixed on your web pages.
One thing I am not very sure about: if the bot finds a single page with some kind of problem, or some kind of over-optimization - high keyword density, focus on too many things, packed with keywords, or some unknown issue - will it trigger this action and make the penalty domain-wide?
| 7:37 pm on Jan 17, 2007 (gmt 0)|
The timeframe for us was about 30 days from the time we started clean up and we really started seeing the results of it all in the last 24 hours. Incidentally, we finished our clean up just 3 days ago.
You're right, we have not bounced back for all keywords and the results have not been realized for all pages and keywords, but we are seeing positive results on our highest value terms and highest value pages (the big keywords we compete for, and the pages that get spidered the most frequently, have the highest PR and which see the most inbound traffic from the index).
I have my fingers crossed that a complete data refresh and deep crawl will deliver positive results and, most importantly, stabilized rankings.
| 8:09 pm on Jan 17, 2007 (gmt 0)|
I think once we clear all the problems, things will come back to normal. I am waiting for one more report like yours, and then I will also start some cleanup operations.
| 8:24 pm on Jan 17, 2007 (gmt 0)|
It kind of seems like Google is attempting to turn everyone who has a website into an SEO. Because if you somehow get caught in this penalty, filter, or whatever it is, you have to learn all about SEO to figure out how to get out of it. Ignoring it does not help; it only seems to make it worse.
I guess the days of just building nice looking sites for your visitors, with interesting pages, adequate navigation, descriptive links, and original content are gone. Because the average webmaster is going to make some mistakes, and Google is going to slam them for those mistakes.
I've spent months going back looking at everything that's happened since my site was first impacted in November 2004. And it has never fully recovered from whatever hit it at that time. Yet I see lots of other pages that provide less information than mine rank far above mine.
And during this time, new content is close to non-existent on my site, because I'm spending all my available time trying to figure out what's wrong, and trying to implement a fix. I don't want to be a SEOer. I want to build pages that people can find that will be helpful to them. Yet I cannot do that because I must learn SEO, whether I want to or not, or accept the fact that my site will always be buried in Google.
I do have a following that is devoted to my site, and that makes it somewhat worthwhile, but they have also expressed frustration that they can't find my site in Google. One guy told a friend to Google for a search term on my site. The guy did so, used the exact phrase in the title of the page, the only page on the Internet with that title, and couldn't find it. Out of frustration, he called his friend, who had to go find the URL and give it to him. Both of them were amazed that it wasn't listed in Google. Well, it is: it's at #755 out of 813 results, and I must say that the neighboring URLs that low in the results are definitely "BAD NEIGHBORHOOD". In fact, it's downright embarrassing to be listed among them!
Google is not winning any brownie points with people when things like this happen. I think in some of the niche areas, Google's lack of understanding of the various related search terms, how things meld together, and how seemingly similar things may not be similar at all in the real world, is beginning to show Google's shortcomings.
For instance, if you have a site about widgets, with sections for widgets made in 1903, and there are pages for each year, make, model, size, etc., then while in an algorithmic way, on the surface, 1902, 1904, 1913, even 2003, etc., might seem similar, in the real world they are very, very different. Just the fact that one number changes in a model number can make all the difference in the world. And when you aren't actually SELLING widgets on this particular site, one has to wonder how such a site could be filtered so drastically. Yes, an affiliate link or two is on the site for people to buy widget-related stuff, but most of the content pages have no ads whatsoever on them. 90%+ of the pages on the site have nothing on them that provides income. What the heck is Google filtering?
Google needs to face the fact that not everyone is out to beat them: some of us just want others to be able to find our site when they search for terms our site deals with. I don't have to be #1, but I sure as heck don't feel like I should have to accept being at #755, or #813, or whatever all the time, especially when an eBay auction selling something that ended a week ago is listed above my pages, or when an ad on a national for sale site ranks above mine, and the item has already sold. That isn't relevancy.
So, instead of adding new, original content to my site, here I am futzing with the HTML, afraid to do too much, afraid to do too little, and not at all sure that what I'm doing is fixing the problem in the first place.
I don't think it should be all that hard for Google to determine whether a site is spam or not: if there aren't any ads on 90% of the pages, if there's only one site owned by that person or company, and if the domain is old and paid for years in advance, chances are pretty good that site isn't spamming anyone, including Google. And chances are also quite good that the person who owns that penalized site has no idea they've even done anything wrong. Yet with so little traffic, they don't continue developing the site because they feel no one else is interested in it. So, a site that could have been great stagnates and dies a slow death.
Meanwhile, people are searching Google for terms that that site can address, and they aren't finding it listed. If they go to Yahoo, MSN, Ask Jeeves, Dogpile, etc., they will find it. But only if they look for it there. And with almost half of the search traffic, Google could very well be disappointing a huge percentage of their customers.
In an attempt to deal with spammers, Google has forgotten why people are using Google in the first place: to find relevant sites. Having relevant sites listed at #755 in the SERPs isn't going to help anyone.
(End of rant - sorry, but I'm very frustrated. I just saw a thread on a subscription only online mailing list about how there isn't any information out there about so and so, the person spent 2 hours looking for it on Google over the weekend, and couldn't find anything. Well, it's on my site, the page it's been on has been online since 2001, and for 4 years it was #1. Now it's buried. And I don't know why. That page could have saved the guy a couple of hours, because the info he needed was readily available, and answered his question and solved his problem in under 5 minutes. But that isn't relevancy in Google's world...they'd rather show a bunch of junk to the guy instead, none of it answering his question, none of it solving his problem. Repeat several million times a day, and you get an idea of the user experience for many people.)
| 8:49 pm on Jan 17, 2007 (gmt 0)|
I agree with you! Look: on Jan 14 all my pages (around 800) disappeared from Google; today I am seeing my pages on Google again. I will see if this is temporary :(
PS: I did not do anything.......
| 9:37 pm on Jan 17, 2007 (gmt 0)|
I'm not too concerned, as I think Google will realize in the long or short term that what they've done with this apparent tinkering is not going to lead to a better user experience, and is not going to sell more AdWords.
For one, many of our AdWords are only borderline profitable on their own. But for others that have the reinforcement effect of us appearing in the natural SERPs, our conversion ratio is good. Apparently people click on your ad, and then quickly scan and look for you in the natural results to see if you only bought your way in or if you are really an authoritative site for their search query.
Thank G-d we have lots of news articles and posts about us listed in the natural results, which dampens the effect of this. But honestly, we're now probably going to invest more in advertising at Y & MSN, and since we're listed in the SERPs there, we should see good results.
| 10:24 pm on Jan 17, 2007 (gmt 0)|
Pages without the remotest SEO are hit just like SEO'd pages, so that idea goes nowhere.
Google has long had, well, call it a "this domain can't rank well for this search term" effect for some stemmed word combinations. That could be the pre-history of the phenomenon, but it is definitely a different thing.
Now, since this has gone on so long, it's possible to have targeted a term with content on a topic that has been placed in unique, rewritten form on a series of URLs, all of which rank great immediately, and tank at the next data refresh.
Stemming, poison-pill keywords (especially in the URL), density of poison words (a completely different thing than standard keyword density)... I don't know what is being inappropriately recognized, but I do know that all four pages on a topic I've put on a particular domain over the past many months have been penalized, while only 1% of the other pages on this domain are commonly penalized (1000+ are not).
I think there are many somewhat similar (but also extremely different) phenomena occurring, so it is a big mistake to try to categorize them as the same thing. The bottom line is that in some cases Google interprets its data badly. You can switch the data around, but this is primarily Google doing a bad job in this area.