One of my sites got hit.
1. One year old website
2. Niche terms with low competition; #1 for 2 terms for more than 6 months
In mid-December my #1 dropped to around #6, fluctuated back up a few times, and has now got stuck at #6.
* I have the keyword in the domain - e.g. www.keyword.net - and that term got hit (plus some deep pages optimized for other terms)
* The site is a misspelling site - it ranks on misspellings of very competitive words. There is very low competition on these misspellings, mostly forums/old sites which are not optimized for the misspelling at all.
* The site ranked entirely through SEO. No PPC budget and no brand recognition.
* The site was still getting some backlinks, but the quality could be questionable - paid links, though relevant.
* All 3 terms that I was ranking for had lots of links with the same anchor text; only small variations were present.
* All the traffic went down, not only for these 3 terms. My brand name - which is a generic name - also ranks at #6.
* I am using Google Analytics and other Google products heavily. The site was interlinked with others of my sites, but those have not been penalized.
* The homepage was changing constantly in recent months, and there have been relevant outgoing links to my other sites, which have not been hit.
* One of the deep pages that got hit had been redesigned about 2-3 weeks before it got hit, with new content and a new template.
[edited by: tedster at 6:05 pm (utc) on Jan. 5, 2008]
My site is over 5 years old and now ranks #6 also.
I like the thinking on here, some really good points made.
I'm trying to figure out why we have dropped and have read through a lot of theories.
In my situation, when I got my one-way backlinks (sometimes paid), I didn't just get links to my homepage, but also to a specific 'product' page.
I'm just thinking that perhaps I got too many links to this page.
Perhaps Google is thinking: 'Well, now his homepage may not be the most important page on the site... let's put more emphasis on this product page, i.e. shift that page up and give less weighting to his homepage' - which results in a drop down to #6.
Finally, yes, I also rank #6 on the allin* queries. Another line already mentioned here is that Google is giving much less weighting to old links, i.e. 3+ years old... so perhaps it's just a case of us going down the old route of getting more inbound links than our competitors?
It has a single external link to the SSL certificate provider, with rel=nofollow added to the link, since there was some speculation that outlinks on the affected page may be the issue.
Some of my back pages with content on them also seem to be showing up at position six for their targeted terms. There are far too many examples of this for it to be a coincidence. There are no external links at all on these back pages.
The overall traffic impact on our site is about 150 lost Google referrals per day, from a previous total of about 450. Some SEO work that I have done has mitigated this, with about 100 of those lost visits compensated by improved traffic on MSN and Yahoo, but this is still hurting us. My plan was to be plus 200 visits a day in Jan from Nov, but we are actually about minus fifty since Nov.
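For what it's worth, those round numbers do reconcile - a quick sanity check, using only the figures quoted in the post above:

```python
google_before = 450        # daily Google referrals before the drop
google_lost = 150          # daily referrals lost since the #6 drop
recovered_elsewhere = 100  # regained on MSN/Yahoo after SEO work

net_change = recovered_elsewhere - google_lost
print(net_change)  # -50, i.e. "about minus fifty since Nov"

planned_gain = 200
print(planned_gain - net_change)  # 250 visits/day short of the Jan target
```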
My site has a keyword density of 1.19%,
while others have between 2-4%.
Can somebody please just reassure me that their site which lies at #6 has a higher keyword density than some of the sites above it?
I'm going to increase the keywords on my homepage anyway and get it above 2%.
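For anyone who wants to check their own numbers, keyword density is easy enough to compute yourself - a minimal sketch, assuming simple tokenisation (real tools may count slightly differently):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words on the page that belong to an occurrence of the
    (possibly multi-word) keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    # count occurrences of the keyword phrase as a consecutive word run
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return 100.0 * hits * len(kw) / len(words)

page = "blue widgets for sale - our blue widgets are the best widgets"
print(round(keyword_density(page, "blue widgets"), 2))  # 36.36
```

Run that over your homepage text and the pages above you, and you can compare densities directly instead of guessing.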
As you've observed if you've read all the postings here, I'm badly affected by the issues described here.
Here's a theory.
I originally worked on a client site in a product space which was new to me. An associate of mine collected a large number of high-quality original links for that site, and they were placed in a well-organised and carefully edited link directory.
That site has held and remains at the number one spot for its targeted phrase.
Subsequently we were distressed to observe that a competing site, which had our link directory copied pretty much verbatim along with some extra link work of its own, overtook us for the phrase in May of this year. I'm sure you'll realise that this is an effect caused by something we regarded as "spamming". We would have hoped that search engines would have done something to prevent this type of activity. The site remains out-ranked on MSN by this "me too" link directory competitor.
I have identified no fewer than three web sites targeting our original phrase, all promoted by the same SEO firm using close copies of our link directory, that all have this position six filter applied.
We re-used our own effective original link directory in-house, and three of those follow-up sites also have position six penalties. That means an authority site has kept its number one ranking, but no fewer than six web sites have been observed with a position six penalty for copy-cat link directories.
The resolution, therefore, is to build a high-quality, diverse link directory, using original thinking to get your links. Using other sites' link directories as a source of prospective backlinks is fine, but you need to avoid taking the authority site's link directory verbatim, or very nearly so, because that can't be used to duplicate the same results any more.
I'd welcome any comments, but I've been doing this for six years and in my opinion this theory is rock solid. From the link building observations I've heard here, other affected sites show the same copy-cat link directory phenomenon.
The issue is not to do with outbound links; it is the inbound linking profile. If that matches an authority site's too closely, then you have trouble. You'd need to look at a specific example in depth to diagnose the problem.
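To make the copy-cat-directory idea concrete, here's a rough sketch of how a link-profile comparison could work. The Jaccard measure and the domain names are purely my own illustration, not anything Google has confirmed:

```python
def backlink_overlap(site_a_links, site_b_links):
    """Jaccard similarity (0..1) of two sets of linking domains - a crude
    stand-in for however a search engine might compare link profiles."""
    a, b = set(site_a_links), set(site_b_links)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

# hypothetical linking domains, not real sites
authority = {"dir1.example", "dir2.example", "blog.example", "news.example"}
copycat = {"dir1.example", "dir2.example", "blog.example", "extra.example"}

print(round(backlink_overlap(authority, copycat), 2))  # 0.6 - very high overlap
```

Under this kind of measure, a site that lifted an authority site's link directory verbatim would score close to 1.0 against it, while a naturally built profile would sit much lower.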
[edited by: tedster at 7:06 pm (utc) on Jan. 9, 2008]
Wikipedia became #1, replacing the tableless page with the new title I described earlier, sometime last year. The now-6'd page became #2 when that happened, and a few times I saw it at #3 as well. Searching today, Wikipedia is still #1. Nothing was done to alter the natural results after Wikipedia took over. It was just fine.
This site does have a directory integrated within it, but the script that runs it was built from scratch in Perl to original specifications. The directory was built manually over a period of many years, and all 6k+ submissions were approved (or not) individually. For the first few years it was suggested that sites link back as a courtesy, for the purpose of exchanging traffic. Then this practice became optional; a good number of site owners never returned links, and those who did would choose how to link using their own criteria. So the site has never used anyone's backlinks or copied anyone's directory structure, and it allowed for diverse linking (some linked using the description, like 2 sentences). This is in response to CBW's directory theory.
I only have 2 recips from my homepage.
Also, I was #1 and didn't have a DMOZ/ODP submission,
whereas the sites behind me were included in the directory.
Now I am #6. Hmmm.
This is good stuff though, people... we are getting closer! We will find the answer!
Can somebody answer my questions just to put my mind at rest:
1) What keyword density do they have for their main keyword on their homepage - is it more than 1.9%?
2) Using Google Webmaster Tools, I have a lot of URLs 'restricted by robots', which are my affiliate links. Do other webmasters have high numbers of these restricted URLs?
With regards to robots.txt, the examples I'm watching tend to have a very low number of rules.
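For comparison, those 'restricted by robots' counts usually just reflect a couple of Disallow rules over wherever the affiliate redirects live - the paths below are hypothetical:

```
User-agent: *
Disallow: /go/
Disallow: /recommends/
```

Two rules like that can easily account for hundreds of restricted URLs if every product page carries its own redirect, so a high count on its own doesn't say much.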
If you are a really good, commercially driven SEO, then you do whatever it takes to get top-ranked. Sometimes if a short cut works, you'll take it. I consider myself an expert in what Google likes, and back in the day I've done things that would incur a penalty. I've seen astonishing results by posting run-of-site cross links, I've hidden links away in places they really shouldn't have been, and as a result my clients have made money.
This is the real world, we're in business. If it works and it cuts the mustard and earns us a fee then we tend to do it.
This discussion thread is the only one on the Internet devoted to this topic. Elsewhere there's much dismissal and holier-than-thou commentary on what has occurred, denial of the expert opinions promulgated here, and aloof, condescending commentary about the causes of people's plight.
Google is busting sites down to six - fact! This discussion is *the* cutting-edge SEO debate right now, because to be a talented SEO you need to be analysing what Google's tech team is doing right now, where it affects business - and this penalty is hitting a lot of businesses, and a lot of people are scratching their heads.
The absolute bottom line here is that if you are hit by this number six position penalty then you have done something that Google doesn't like. Forget glitches and testing. This has been hitting the money sites for a month now. In the eyes of Google, if you're suffering at six then you are a wrong-doer and someone else above you has more right to be there.
The moment that you become humble enough to realise that your site has jumped the queue, in the eyes of Google's people - who are pretty smart - then you're a step closer to fixing up and climbing up the SERPS.
This is a penalty on link building techniques. On-page stuff is easy to analyse and discount. The exact nature of the link profile becomes harder and harder to fathom, but that is the beauty of this job; it's never the same one month to the next. There's a natural way to acquire links, and Google has created an algorithm that penalises sites that don't conform to that ideal.
If you're at number six and you weren't then you didn't do what they like.
"Does it deserve to be #1?"
"Did it do any tricks to get there?"
"Let's knock 'em down a few pegs and see how they react."
"If they try SEO schemes to get back up, they probably schemed to get to #1."
Operation Smoke 'Em Out.
Not an unreasonable testing system for sites that have new or old trust issues. Almost no SEO-manipulating webmaster does nothing after a SERP drop.
Another good idea which somebody touched on earlier in this thread is SERP shuffling: shuffling the top 5 results randomly (when their value is perceived to be very close) to get webmasters to give up fiddling and scheming.
That would be a fun paradigm shift!
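The shuffling idea is simple enough to sketch. This is purely hypothetical, of course - the epsilon threshold and the scores are invented:

```python
import random

def shuffle_close_results(results, epsilon=0.05, top_n=5):
    """If every one of the top-N results scores within epsilon of the
    leader, randomise their order; leave the rest of the SERP untouched.
    results: list of (url, score) tuples, sorted by score descending."""
    head, tail = results[:top_n], results[top_n:]
    leader_score = head[0][1]
    if all(leader_score - score <= epsilon for _, score in head):
        random.shuffle(head)  # in-place shuffle of the close cluster
    return head + tail

serp = [("a.example", 0.91), ("b.example", 0.90), ("c.example", 0.90),
        ("d.example", 0.89), ("e.example", 0.89), ("f.example", 0.70)]
print(shuffle_close_results(serp))
```

The top five would come back in a different order from query to query while #6 onward stayed put - which fits the "give up fiddling" motive.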
Of course, one story does not mean that cause and effect are pinned down, but I thought it was worth sharing.
Ok, I admit that I am visiting WebmasterWorld after some really long months (I was working on community requirements at a more detailed level than search engines), so sorry if things are getting repeated.
A different thought about the #6 penalty - some food for thought:
Is it about Weighted performance? Let's take an example (Clicks for 100 visitors for a search term):
[edited by: tedster at 4:22 pm (utc) on Jan. 10, 2008]
I'm not sure why this would always result in a #6 position, but I'll keep it in mind.
If these sites were bad, then a -5 penalty (not exactly -5, but a position #6 penalty) doesn't solve anything. It looks like an experiment to improve user experience. We are far better than our competitors in all respects and have enjoyed the #1 position for over 3 years now. It looks like nobody can beat us except when Google favors them.
Can we share the user experience analytics to check it up?
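Since the thread keeps circling back to "weighted performance", here's one hypothetical way such a score could be computed. The blend of CTR and bounce behaviour, and the weights, are pure guesswork on my part:

```python
def weighted_performance(clicks, visitors, bounces, w_ctr=0.7, w_stick=0.3):
    """Hypothetical quality score: a weighted blend of click-through rate
    and the share of clickers who did NOT bounce. Weights are made up."""
    ctr = clicks / visitors
    stickiness = 1 - bounces / clicks if clicks else 0
    return w_ctr * ctr + w_stick * stickiness

# out of 100 searchers: site A gets 40 clicks of which 10 bounce;
# site B gets 55 clicks of which 40 bounce
print(round(weighted_performance(40, 100, 10), 3))  # 0.505
print(round(weighted_performance(55, 100, 40), 3))  # 0.467
```

Note that under this kind of blend, site B "wins" on raw CTR but loses overall because so many of its visitors bounce - which is exactly the sort of signal people here suspect is in play.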
If I estimate and project with round numbers: at the end of this month, being at position #6, I will have 500 hits for one competitive keyword. At position #2 we had 900 for Nov and 700 for Dec for this same keyword.
My bounce rate is considered low too.
do people have multiple sites seo'd for the same keyword?
If so are those sites interlinked?
I only say this because I know a webmaster who optimised 3 or 4 sites for one particular 'big money' keyword... one of those sites was top 3 for a year... then suddenly it was hit by a -950 penalty. A month later my site dropped to #6. I only interlinked 2 sites, but they were both targeting the same keyword phrase.
Anyone else seen the same?
In response to the network theory: yes, I have multiple sites going after similar and some identical keywords. They are on different IPs but in the same GA account. Yes, they were reciprocally interlinked - I have since dropped the recips and have one-way links now (affected site to untouched sites). I'm considering dropping all links now; I have been told by others that it is not a big deal, but it's not worth taking a chance over.
Do these #6 results show as #6 on all regional sites? Like Google.com, .ca, .co.uk, .wherever, and/or checking Google.com in NY, LA, London or Vancouver?
How about querying Google.com with other languages as the default?
I mean, links with the given competitive phrase that are otherwise OK, but seem to originate from a different region, may be viewed as an irregularity. The entire *site* could have a lot of inbounds with *other phrases* from its own region, thus holding its original positions for those; but for stuff that it was linked with only (or mostly) from another region/country (even if in the same English language)... it could be it was demoted on a previously unseen scale (?)
Since it hits #1 (top and established) sites, it could be an experiment to see the effect on user experience.
I can tell you that our site is certainly hit with the #6 "penalty" yet we were no longer top 5 for the main keyword. We were #1 for about 4+ years, right up until Jan 2007. Since then we've been not any higher than #3 for a short period.
When the #6 issue started we were at #7.
Ok, so now how do we test this?!
It's effectively a NON-theory for the purposes of this discussion.
It can't be tested. It can't be verified in comparison to other sites "performance"/bounce rates/whatever.
And it's actually counter-intuitive to some of the data we're hearing.
ie. how does Goog ever evaluate when to "RAISE" the weighted performance of a page stuck at #6?
Why aren't we seeing sites "rotated" for ALL the pages of the specific SERPS?
When, how, and why is/was Goog testing this new "user-data"?
There would certainly be more cookie-crumb hints - testing, beta roll-outs, etc. - pointing to such a huge, radical shift in how they present SERPs.
This takes huge number-crunching, analysis, disk space (and most importantly, public relations marketing), etc. that Google simply hasn't shown they are engaged in. Let alone implementing it in a matter of a few weeks?
Heck, they're still having issues rolling out PR uniformly and getting Universal Search implemented and suddenly they are doing this?!
Doesn't sound probable, nor likely.
As for click or other usage data being a non-theory because we can't prove it, well, I'm comfortable enough with that. There's much we can't prove in what we think we know about how the SEs operate. I initially brought up click data in the first thread as a throw-away possibility. The more I read and think about it, the more I *feel* it might be close to the right track.