Google SEO News and Discussion Forum
Is Google Using a Position #6 "Penalty"? - part 2
forcer
msg:3539360 - 10:57 am on Jan 3, 2008 (gmt 0)

< continued from: [webmasterworld.com...] >

Hello guys,

One of my sites got hit.

Background information

1. One year old website
2. Niche terms with low competition; has been #1 for 2 terms for more than 6 months

In mid-December my #1 ranking dropped to around #6, fluctuating for a while between the old position and #6, and now it is stuck at #6.

Conditions

* I have the keyword in the domain - e.g. www.keyword.net - and that term got hit (plus some deep pages optimized for other terms).
* The site is a misspelling site - it ranks on misspellings of very competitive words. On these misspellings there is very low competition, mostly forums and old sites which are not optimized for the misspelling at all.
* The site ranked entirely through SEO. No PPC budget and no brand recognition.
* The site was still getting some backlinks, but the quality could be questionable - paid links, though relevant.
* All 3 terms I was ranking for had lots of links with the same anchor text; only small variations were present (see the sketch after this post).
* All the traffic went down, not only for these 3 terms. My brand name - which is a generic term - also ranks at #6.
* I am using Google Analytics and other Google products heavily. The site was interlinked with others of my sites, but these have not been penalized.
* The homepage was changing constantly in recent months, and there have been relevant outgoing links to my other sites, which have not been hit.
* One of the deep pages that got hit had been redesigned about 2-3 weeks before, with new content and a new template.

[edited by: tedster at 6:05 pm (utc) on Jan. 5, 2008]
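
Since heavily repeated anchor text is one of the main suspects in this thread, here is a minimal sketch of how one might measure that concentration - the anchor data and the function name are invented for illustration, not taken from any poster's site:

```python
from collections import Counter

def anchor_concentration(anchors):
    """Return the share of backlinks held by the single most common anchor text."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, top_count / total

# Hypothetical backlink anchors for an affected site
anchors = ["blue widgets"] * 80 + ["cheap blue widgets"] * 15 + ["example.net"] * 5
anchor, share = anchor_concentration(anchors)
print(f"{share:.0%} of links use the anchor '{anchor}'")  # 80% of links use 'blue widgets'
```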

 

cheesy snacks
msg:3544203 - 6:01 pm on Jan 9, 2008 (gmt 0)

some excellent points.

my site is over 5 years old and now ranks #6 also.

I like the thinking on here, some really good points made.

I'm trying to figure out why we have dropped and have read through a lot of theories.

In my situation, when I got my one-way backlinks (sometimes paid), I didn't just get links to my homepage, but also to a specific 'product' page.

I'm just thinking that perhaps I got too many links to this page.

Perhaps Google is thinking, 'Well, now his homepage may not be the most important page on the site... let's put more emphasis on this product page, i.e. shift that page up and give less weight to his homepage' - which results in a drop to #6.

Finally, yes, I also rank #6 on the allin: searches. Another line already mentioned here is that Google is giving much less weight to old links, i.e. 3+ years old... so perhaps it's just a case of us going down the old route of getting more inbound links than our competitors?

ChiefBottleWasher
msg:3544212 - 6:06 pm on Jan 9, 2008 (gmt 0)

My home page is badly affected by the position six penalty.

It has a single external link, to the SSL certificate provider, with rel=nofollow added to it, since there was some speculation that outlinks on the affected page may be the issue.

Some of my back pages with content on them also seem to be showing up at position six for their targeted terms. There are far too many examples of this for it to be a coincidence. There are no external links at all on these back pages.

The overall traffic impact on our site is about 150 lost Google referrals per day, from a previous total of about 450. Some SEO work that I have done has mitigated this, with about 100 of those lost visits compensated by improved traffic from MSN and Yahoo, but this is still hurting us. My plan was to be up 200 visits a day in January versus November, but we are actually down about fifty since November.

cheesy snacks
msg:3544217 - 6:16 pm on Jan 9, 2008 (gmt 0)

Actually, I just checked my keyword density after taking a look at phrasing, and my site has the lowest keyword density for my keywords in the top 6.

My site has 1.19%, while others have between 2-4%.

Can somebody please just reassure me that their site which sits at #6 has a higher keyword density than some of the sites above it?

I'm going to increase my keywords on my homepage anyway and get it above 2%.
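
As a point of reference for these percentages: there is no standard definition of keyword density (tedster makes this point below), so any two tools will disagree. A minimal sketch of one naive version - the page text is made up, and counting every word of each phrase occurrence against the total is an assumption, not any tool's official formula:

```python
import re

def keyword_density(text, phrase):
    """Naive keyword density: words belonging to phrase occurrences / total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(words[i:i + n] == phrase_words for i in range(len(words) - n + 1))
    return hits * n / len(words)

page = "Blue widgets for sale. Our blue widgets are the best blue widgets around."
print(f"{keyword_density(page, 'blue widgets'):.2%}")  # ~46% on this toy text
```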

tedster
msg:3544222 - 6:21 pm on Jan 9, 2008 (gmt 0)

Google is giving much less weight to old links, i.e. 3+ years old

True, but why are so many sites going to exactly #6 from #1 at the same time? If it were just a weighting change for aging backlinks, we'd expect to see other positions involved too, no?

ChiefBottleWasher
msg:3544223 - 6:23 pm on Jan 9, 2008 (gmt 0)

A Eureka moment, I hope/think!

As you've observed if you've read all the postings here, I'm badly affected by the issues described here.

Here's a theory.

I originally worked on a client site in a product space that was new to me. An associate of mine collected a large number of high quality original links for that site, and they were placed in a well organised and carefully edited link directory.

That site has held, and remains at, the number one spot for its targeted phrase.

Subsequently we were distressed to observe that a competing site, which had copied our link directory pretty much verbatim along with some extra link work of its own, overtook us for the phrase in May of this year. I'm sure you'll realise that this is an effect caused by something that we regarded as "spamming". We would have hoped that search engines would have done something to prevent this type of activity. The site remains out-ranked on MSN by this "me too" link directory competitor.

I have identified no fewer than three web sites targeting our original phrase, all promoted by the same SEO firm using close copies of our link directory, that all have this position six filter applied.

We re-used our own effective original link directory in-house, and three of those follow-up sites also have position six penalties. That means an authority site has kept its number one rankings, but no fewer than six web sites have been observed with a position six penalty for copy-cat link directories.

The resolution, therefore, is to build a high quality, diverse link directory using original thinking to get your links. Using other sites' link directories as a source of prospective backlinks for your linking efforts is fine, but you need to avoid taking the authority site's link directory verbatim, or very nearly so, because that can no longer be used to duplicate the same results.

I'd welcome any comments, but I've been doing this six years and this theory is, in my opinion, rock solid. From the link building observations that I've heard here, other affected sites show the same copycat link directory phenomenon.


tedster
msg:3544253 - 6:43 pm on Jan 9, 2008 (gmt 0)

I know of several affected sites that have NO outbound links at all, to say nothing of a link directory. There are similar reports in this thread from others as well. So you may be on to something you need to address for other reasons, but I don't think it accounts for the #6 phenomenon.

ChiefBottleWasher
msg:3544259 - 6:51 pm on Jan 9, 2008 (gmt 0)

I thought this would kick some dust up!

The issue is not to do with outbound links; it is the inbound linking profile. If that matches an authority site too closely, then you have trouble. You'd need to look at a specific example in depth to diagnose the problem.

[edited by: tedster at 7:06 pm (utc) on Jan. 9, 2008]

ChiefBottleWasher
msg:3544265 - 7:02 pm on Jan 9, 2008 (gmt 0)

Wikipedia has a lot of inlinks from ODP/DMOZ.ORG.

It's my opinion that those links are being devalued in recent updates. Many don't show up in backlink summaries where they did before. This may account for Wikipedia disappearances seeming to be contemporaneous with position six downgrades.

rros
msg:3544266 - 7:03 pm on Jan 9, 2008 (gmt 0)

The site I speak of was built in Jan 2000 - 8 years old.

Wikipedia became #1, replacing the tableless page with the new title I described earlier, sometime last year. The now-6'd page became #2 when that happened, and a few times I saw it at #3. Searching today, Wikipedia is still #1. Nothing was done to alter the natural results after Wikipedia took over. It was just fine.

This site does have a directory integrated within it, but the script that runs it was built from scratch in Perl to original specifications. The directory was built manually over a period of many years, and all 6k+ submissions were approved (or not) individually. For the first few years it was suggested that sites link back as a courtesy, for the purpose of exchanging traffic. Then this practice became optional; a good number of site owners never returned links, and those who did would choose how to link using their own criteria. So the site has never used anyone's backlinks or copied anyone's directory structure, and it allows for diverse linking (some linked using the description, like 2 sentences). This is in response to CBW's directory theory.

cheesy snacks
msg:3544287 - 7:28 pm on Jan 9, 2008 (gmt 0)

I was ranked #1 with NO link directory. I obtained natural one-way links due to good content.

I only have 2 recips from my homepage.

Also, I was #1 and didn't have a DMOZ/ODP listing.

Whereas the sites behind me were included in the directory.

Now I am #6. Hmmm.

This is good stuff though, people... we are getting closer! We will find the answer!

Can somebody answer my questions, just to put my mind at rest:
1) What keyword density do you have for your main keyword on your homepage - is it more than 1.9%?
2) Using Google Webmaster Tools, I have a lot of URLs 'restricted by robots', which are my affiliate links. Do other webmasters have high numbers of these restricted URLs?
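
For context on question 2: "restricted by robots" in Webmaster Tools counts URLs that robots.txt rules block from crawling. A hypothetical robots.txt of the kind being described - the /go/ and /out/ paths are invented examples of affiliate redirect directories, not anyone's actual setup:

```
User-agent: *
# Block crawl of affiliate redirect links (hypothetical paths)
Disallow: /go/
Disallow: /out/
```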

tedster
msg:3544306 - 7:49 pm on Jan 9, 2008 (gmt 0)

Keyword density is not a metric that Google uses, at least not directly. It can be a somewhat informative tool for looking at a page in some cases, but the issue is that there is not even a standard definition. The idea of kw density is pretty much a leftover from the '90s and the old WPG approach to SEO. Today's algo just doesn't submit to that kind of reverse engineering.

That's not to say that semantic factors can be ruled out - phrase-based indexing [webmasterworld.com], term co-occurrence [webmasterworld.com], and so on.

Regarding robots.txt, the examples I'm watching tend to have a very low number of rules.

ChiefBottleWasher
msg:3544418 - 10:48 pm on Jan 9, 2008 (gmt 0)

Whenever Google throws up some penalty everyone gets very defensive and self-pitying and tends to outline all the wonderful righteous things that they've done. There is some of this going on in recent posts on this thread.

If you are a really good, commercially driven SEO then you do whatever it takes to get top-ranked. Sometimes, if a shortcut works, you'll take it. I consider myself an expert in what Google likes, and back in the day I did things that incurred penalties. I've seen astonishing results from posting run-of-site cross links, I've hidden links away in places they really shouldn't have been, and as a result my clients have made money.

This is the real world, we're in business. If it works and it cuts the mustard and earns us a fee then we tend to do it.

This discussion thread is the only one on the Internet devoted to this topic. Elsewhere there's much dismissal and holier-than-thou commentary on what has occurred, denial of the expert opinions promulgated here, and aloof, condescending commentary about the causes of people's plight.

Google is busting sites down to six - fact! This discussion is *the* cutting-edge SEO debate right now, because to be a talented SEO you need to be analysing what Google's tech team is doing right now, where it affects business - and this penalty is hitting a lot of businesses, and a lot of people are scratching their heads.

The absolute bottom line here is that if you are hit by this number six position penalty then you have done something that Google doesn't like. Forget glitches and testing. This has been hitting the money sites for a month now. In the eyes of Google, if you're suffering at six then you are a wrong-doer and someone else above you has more right to be there.

The moment you become humble enough to realise that your site has jumped the queue in the eyes of Google's people - who are pretty smart - you're a step closer to fixing up and climbing the SERPs.

This is a penalty on link building techniques. On-page stuff is easy to analyse and discount. The exact nature of the link profile becomes harder and harder to fathom, but that is the beauty of this job: it's never the same one month to the next. There's a natural way to acquire links, and Google has created an algorithm that penalises sites that don't conform to that ideal.

If you're at number six and you weren't before, then you didn't do what they like.

crobb305
msg:3544434 - 11:17 pm on Jan 9, 2008 (gmt 0)

If you're at number six and you weren't before, then you didn't do what they like.

Well there ya go. The debate is over. LOL

potentialgeek
msg:3544453 - 11:32 pm on Jan 9, 2008 (gmt 0)

I wonder if Google has ever thought of testing #1 positioned sites.

"Does it deserve to be #1?"

"Did it do any tricks to get there?"

"Let's knock 'em down a few pegs and see how they react."

"If they try SEO schemes to get back up, they probably schemed to get to #1."

Operation Smoke 'Em Out.

Not an unreasonable testing system for sites that have new or old trust issues. Almost no SEO-manipulating webmaster does nothing after a SERP drop.

Another good idea, which somebody touched on earlier in this thread, is SERP shuffling: randomly shuffling the top 5 results (when their value is perceived to be very close) to get webmasters to give up fiddling and scheming.

That would be a fun paradigm shift!

p/g
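
Purely as a thought experiment, here is a minimal sketch of that shuffling idea - the scores, the closeness threshold, and the site names are all invented:

```python
import random

def shuffle_close_results(results, epsilon=0.02):
    """Randomly reorder the top results whose scores are within epsilon of the leader."""
    if not results:
        return results
    results = sorted(results, key=lambda r: r[1], reverse=True)
    top_score = results[0][1]
    close = [r for r in results if top_score - r[1] <= epsilon]
    rest = results[len(close):]
    random.shuffle(close)  # only near-ties get shuffled; clear winners keep their spot
    return close + rest

serp = [("site-a", 0.91), ("site-b", 0.90), ("site-c", 0.90), ("site-d", 0.80)]
print(shuffle_close_results(serp))  # a, b, c in random order; d stays fourth
```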

tedster
msg:3544469 - 11:58 pm on Jan 9, 2008 (gmt 0)

Here's an anecdote. One site that was under the position #6 effect for a search phrase acquired a new backlink using a different search phrase as anchor text - a solid link from a solid site. The URL bounced back to #1 within a few days.

Of course, one story does not mean that cause and effect are pinned down, but I thought it was worth sharing.

Timetraveler
msg:3544697 - 7:53 am on Jan 10, 2008 (gmt 0)

Tedster - was only the homepage under the #6 filter? Also, was the link with the different anchor text pointing to the homepage? One last thing: were there other affected pages, or just the homepage, and if so, did they come out of it?

donnajean
msg:3544846 - 1:10 pm on Jan 10, 2008 (gmt 0)

Early on I got a solid backlink with different anchor text to the homepage - no success. I have a few pages affected, not just the homepage.

AjiNIMC
msg:3544689 - 7:27 am on Jan 10, 2008 (gmt 0)

Compilation of the #6 penalty - a different thought

OK, I admit that I am visiting WebmasterWorld after some really long months (I was working on community requirements at a more detailed level than search engines), so sorry if things are getting repeated.

Expert crowd wisdom about the #6 penalty says:

  1. It certainly exists - too many people are seeing it for it to be randomness.
  2. Affected sites are well established, with a long history and long-standing good rankings for a big search term - usually #1.
  3. It is search term specific.
  4. Crowd reasoning:
    • Anchor text diversification problem.
    • Backlink signaling problem: changes in the number and quality of incoming links.
    • Please add other speculated reasons.

A different thought about the #6 penalty

  • Since it hits #1 (top and established) sites, it could be an experiment on user experience. Maybe Google wants to see the effect of promoting new players (and maybe the effect on their ads too). Is it something to do with weighted performance? (The term "weighted performance" is explained at the end of this post.)
  • Since Google is keeping these sites on the first page, we can say that the affected sites are broadly within Google's "do no evil" theme (or at least didn't get caught).
  • Since other low-ranking sites are not affected, we can say it is not a general filter. (Else why wouldn't Google penalize a low-ranked site for the same crime?)

Some food for thought on the #6 penalty:
Is it about weighted performance? Let's take an example (clicks from 100 visitors for a search term):

  • 40% of clicks to the #1 site (expected % of clicks at #1: 60%; weighted performance: -20%)
  • 27.5% of clicks to the #2 site (expected % of clicks at #2: 20%; weighted performance: +7.5%)
  • 16% of clicks to the #3 site (expected % of clicks at #3: 10%; weighted performance: +6%)
  • 12.5% of clicks to the #4 site (expected % of clicks at #4: 8%; weighted performance: +4.5%)
  • 4% of clicks to the #5 site (expected % of clicks at #5: 2%; weighted performance: +2%)

Google may want to demote top sites with low weighted performance for a considerably longer period.

[edited by: tedster at 4:22 pm (utc) on Jan. 10, 2008]
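
As a reading aid, a minimal sketch of the weighted performance arithmetic above - the expected click shares per position are the example's own assumptions, not real Google data:

```python
# Expected share of clicks by position (the example's assumed figures)
expected = {1: 60.0, 2: 20.0, 3: 10.0, 4: 8.0, 5: 2.0}

# Observed share of clicks per position from the example
observed = {1: 40.0, 2: 27.5, 3: 16.0, 4: 12.5, 5: 4.0}

# Weighted performance = observed share minus expected share
for pos in sorted(expected):
    print(f"position {pos}: {observed[pos] - expected[pos]:+.1f}%")
# prints: -20.0%, +7.5%, +6.0%, +4.5%, +2.0%
```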

tedster
msg:3545011 - 4:25 pm on Jan 10, 2008 (gmt 0)

If I get your drift, your idea is that perhaps other sites in the top 5 are performing better than expected, and the algo is giving them a boost for that reason? Interesting idea. I'm not sure why this would always result in a #6 position, but I'll keep it in mind.

AjiNIMC
msg:3545038 - 4:44 pm on Jan 10, 2008 (gmt 0)

I'm not sure why this would always result in a #6 position, but I'll keep it in mind.

Just a thought: keeping the other five above the fold at all resolutions. The first look shows the top 5 - maybe it's because of that. I am just guessing, on gut feel.

If these sites were bad, then a -5 penalty (not exactly -5, but a position #6 penalty) wouldn't solve anything. It looks like an experiment to improve user experience. We are far better than our competitors on all counts and have enjoyed the #1 position for over 3 years now. It looks like nobody can beat us except when Google favors them.

Can we share user experience analytics to check this?

donnajean
msg:3545056 - 4:59 pm on Jan 10, 2008 (gmt 0)

I was #2, and while I don't know my competitors' numbers, my site performed very well. I have a "tool" on my homepage that others do not, and it has been very popular with the type of buyer that we have.

If I estimate and project with round numbers, at the end of this month at position #6 I will have 500 hits for one competitive keyword. At position #2 we had 900 for Nov and 700 for Dec for this same keyword.

My bounce rate is considered low too.

cheesy snacks
msg:3545082 - 5:11 pm on Jan 10, 2008 (gmt 0)

just to throw another idea...

Do people have multiple sites SEO'd for the same keyword?

If so, are those sites interlinked?

I only ask because I know a webmaster who optimised 3 or 4 sites for one particular 'big money' keyword... one of those sites was top 3 for a year... then suddenly it was hit by a -950 penalty. A month later my site dropped to #6. I only interlinked 2 sites, but they were both targeting the same keyword phrase.

Anyone else seen the same?

AjiNIMC
msg:3545084 - 5:14 pm on Jan 10, 2008 (gmt 0)

My bounce rate is considered low too.

It may just be an experiment, "Kill all kings", to see the effect :).

donnajean
msg:3545112 - 5:35 pm on Jan 10, 2008 (gmt 0)

I don't get the "kill all kings" theory (at least in my case) - the "KING" that was above me in the #1 position is 10-20 times kingier than me. They are still sitting pretty.

In response to the network theory: yes, I have multiple sites going after similar, and some identical, keywords. They are on different IPs but in the same GA account. Yes, they were reciprocally interlinked - I have since dropped the recips and have one-way links now (affected site to untouched sites). I'm considering dropping all links now; I have been told by others that it is not a big deal, but it's not worth taking a chance over.

rros
msg:3545114 - 5:36 pm on Jan 10, 2008 (gmt 0)

cheesy snacks, only 1 site.

AjiNIMC has an interesting theory, but it would not quite explain the new page with the newest title, indexed/cached last week, that automatically shows up at #6 when searched for by the new title. Your theory relies on crunching numbers, and that may take some time.

Miamacs
msg:3545116 - 5:36 pm on Jan 10, 2008 (gmt 0)

How about the regionality of the backlink profile?
I only have a single 'suspect' site/phrase (can't say 100% that this is the problem), but that's about the only thing that could cause this there.

Do these #6 results show as #6 on all regional sites - Google.com, .ca, .co.uk, wherever - and/or when checking Google.com from NY, LA, London or Vancouver?
How about querying Google.com with other languages as the default?

...

I mean, links with the given competitive phrase that are otherwise OK but seem to originate from a different region may be viewed as an irregularity. The entire *site* could have a lot of inbounds with *other phrases* from its own region, holding it at its original positions for those; but for a phrase it was linked with only (or mostly) from another region/country (even if in the same English language)... it could be that it was demoted on a previously unseen scale(?)

...
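
One crude way to eyeball that regional skew, as a sketch only - the backlink URLs are invented, and a TLD is at best a rough proxy for where a link actually originates:

```python
from collections import Counter
from urllib.parse import urlparse

def tld_profile(backlink_urls):
    """Tally backlinks by top-level domain as a rough proxy for region."""
    tlds = Counter(urlparse(u).hostname.rsplit(".", 1)[-1] for u in backlink_urls)
    total = sum(tlds.values())
    return {tld: count / total for tld, count in tlds.most_common()}

# Hypothetical backlinks pointing at one phrase's landing page
links = ["http://example.co.uk/a", "http://widgets.ca/b", "http://blog.ca/c", "http://news.ca/d"]
print(tld_profile(links))  # {'ca': 0.75, 'uk': 0.25} - mostly one region for this phrase
```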

pensfan
msg:3545117 - 5:39 pm on Jan 10, 2008 (gmt 0)

Since it hits #1 (top and established) sites, it could be an experiment on user experience.

AjiNIMC -

I can tell you that our site is certainly hit with the #6 "penalty", yet we were no longer in the top 5 for the main keyword. We were #1 for about 4+ years, right up until Jan 2007. Since then we've been no higher than #3, and that only for a short period.

When the #6 issue started we were at #7.

AjiNIMC
msg:3545136 - 5:54 pm on Jan 10, 2008 (gmt 0)

What about the other sites in the SERP for the same keyphrase - did they also see a change (positive or negative)?

whitenight
msg:3545141 - 5:59 pm on Jan 10, 2008 (gmt 0)

As said above, the "weighted performance"/"click-through rate" theories may in fact be what's happening.

OK, so now how do we test this?!

It's effectively a NON-theory for the purposes of this discussion.
It can't be tested. It can't be verified in comparison to other sites' "performance"/bounce rates/whatever.
And it's actually counter-intuitive to some of the data we're hearing.

I.e., how does Goog ever evaluate when to "raise" the weighted performance of a page stuck at #6?

Why aren't we seeing sites "rotated" across ALL the pages of the specific SERPs?

When, how, and why is/was Goog testing this new "user data"?
There would certainly be more cookie-crumb hints - testing, beta roll-outs, etc. - pointing to such a huge, radical shift in how they present SERPs.

This takes huge number-crunching, analysis, disk space (and, most importantly, public relations marketing) that Google simply hasn't shown they are engaged in - let alone implementing it in a matter of a few weeks.

Heck, they're still having issues rolling out PR uniformly and getting Universal Search implemented, and suddenly they are doing this?!

Doesn't sound probable or likely.

jimbeetle
msg:3545179 - 6:40 pm on Jan 10, 2008 (gmt 0)

This takes huge number-crunching, analysis, disk space... that Google simply hasn't shown they are engaged in - let alone implementing it in a matter of a few weeks.

Google engineers are encouraged to think BIG. As far as G is concerned, new hires don't "get it" until they request 20 or 30 thousand machines for a project. Coupled with its fabled operating system, capacity is not one of its problems.

As for click or other usage data being a non-theory because we can't prove it, well, I'm comfortable enough with that. There's much we can't prove in what we think we know about how the SEs operate. I initially brought up click data in the first thread as a throw-away possibility. The more I read and think about it, the more I *feel* it might be close to the right track.

whitenight
msg:3545190 - 6:54 pm on Jan 10, 2008 (gmt 0)

You missed the most important part, i.e. public relations.

Do you realize how big a difference this is in search engine development?!

Google is a Fortune 500 company that has a lot of people to think about outside of engineers' "pet projects".

If they were implementing it - in any shape or form, live - their marketing department would have "prepped" Wall Street for any positive rush or negative press they would receive.

Not a year ago, Eric was bemoaning the lack of storage space Goog had and explaining the need for more storage space (aka gross expenses) to investors.

Yes, this is what Goog is working towards 3-5 years from now.
I repeat, in NO way have they shown they are even near implementing this NOW.

Now back to the discussion.

Just because this issue is frustrating and doesn't show "classic" symptoms doesn't mean everyone gets to blame the
"boogie man",
"aliens from space",
or "Google has jumped 3 years ahead in analytics",
"and therefore it's beyond my control and I can't do anything to fix it"...

Feel free to claim the above, but until someone shows me even a smidgen of real proof that Google has or can implement this, we would all be wise to conclude it falls under the normal way Google has been operating for the past 4 years: links, spam, links, or something to do with..... (wait for it)..... links.
