
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 11
Marcia
msg:3401658
 4:22 am on Jul 23, 2007 (gmt 0)

< continued from: [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

Just saw one 950+ and it does my heart good to see it.

User-agent: *
Disallow: /theirlinkpage.htm

No, I'm not saying that's necessarily why, but it would serve them right if it was, for playing dirty like that on a page that's supposed to have reciprocal links with people exchanging fair and square in good faith.
======================================================
Added:

And another 950+, the last site in the pack. Flash only (and not even nice Flash), with keyword text in H1 and H2 elements plus one outbound link, all marked class="visible" - which their stylesheet then hides:

<style>
.visible{
visibility:hidden;
}
</style>
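So despite the class name, anything tagged "visible" is actually invisible. Reconstructed, the markup would look something like this ( the heading text and link are made up for illustration ):

<h1 class="visible">cheap blue widgets blue widget deals</h1>
<h2 class="visible">more keywords here, plus the one outbound link:
<a href="http://www.example.com/">blue widgets</a></h2>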
=========================================================
Another one, way down at the bottom, is an interior page that the homepage 302s to, and it isn't at all relevant for the search term - it must have IBLs with the anchor text (not worth the time to check).

Yet another must also have anchor text IBLs (also not worth the time checking) and simply isn't anywhere near properly optimized for the phrase.

So that's four:

1. Sneaky
2. Spam
3. Sloppy webmastering
4. Substandard SEO

No mysteries in those 4, nothing cryptic or complicated like some of the other 950+ phenomena, but it's interesting to see that there are "ordinary" reasons for sites/pages to be 950+ that simple "good practices" and easy fixes could take care of.

The question does arise, though, whether the first two are hand penalties or whether something's been picked up algorithmically on them - in one case unnatural linking, and in the other, CSS spamming.

[edited by: Marcia at 4:46 am (utc) on July 23, 2007]

[edited by: tedster at 9:13 pm (utc) on Feb. 27, 2008]

 

wanderingmind
msg:3435090
 3:59 am on Aug 29, 2007 (gmt 0)

Just had one of the sites I mentioned above come back.

The site had been 950'd since the beginning of this month, I think.

As a first step, I removed a lot of links to deep pages from the homepage. They were links to stories inside; I removed 50% of them.

Pinged google using blogsearch.google.com/ping
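( For anyone who hasn't used it: the ping is just a GET request carrying your site's name and URL - something like the line below, with placeholder values. If I remember right there's also an optional changesURL parameter for your feed. )

blogsearch.google.com/ping?name=Example+Site&url=http://www.example.com/&changesURL=http://www.example.com/rss.xml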

Site back to earlier rankings in 24 hours flat!

This was only a first step, to see what would happen before moving on to checking internal linking and everything else. But this is all it took in my case. I'll wait for a few days and see if it stays.

This is an old site which was practically untouched for the last 2 months. I was slowly building up the content with the intention of turning it into a proper content site.

Maybe the age of the site helped tip it over to the good side easily... On another site I have - also 950'd - I have done the same and am waiting for the first crawl.

wanderingmind
msg:3437414
 8:36 am on Aug 31, 2007 (gmt 0)

And in another 48 hours, rankings go back to the end of SERPs again!

b2net
msg:3439871
 10:52 am on Sep 3, 2007 (gmt 0)

I opened a new website and, as expected, it was dropped to the very bottom within 48 hours of being initially indexed. During those 48 hours it was ranking normally. At this point I don't know if it is in the classic sandbox or if it is suffering from a 950 penalty due to overoptimizing. The site is in a competitive niche and Google has tightened its filters for these keywords.

The site ranks last for every search, including its domain name - the 950 filter many of you are familiar with. I wasn't expecting steady rankings from the beginning; I am prepared to wait for some months before getting any free G traffic. I just want to get out of this sandbox or filter as fast as possible. I am a little worried because the same thing happened to some of my new sites earlier, and after several months of waiting they are still ranking last.

There's not much I can do about the content. It is basically a one-page JavaScript application with very little text that spiders can index. I don't see this as a problem and G shouldn't either.

I am now actively marketing the site, which will bring me natural backlinks. I am also adding links to directories etc. These links should increase my TrustRank, but will they help me get out of the filter sooner? I hear horror stories of people waiting a year to get out and then dropping back after some weeks.

Miamacs
msg:3439881
 11:12 am on Sep 3, 2007 (gmt 0)

Getting relevant links from relevant sites ( authorities if possible ) is the way to go. That, and the time those links need to age beyond the "for all we know this could be spam" period. I'm still bringing sites out one by one with this.

Trust needs to be combined with relevancy signals telling Google *what* the site should be trusted for ( eg. if it's a travel site, it's not selling cars ), and by whom ( eg. if it's a travel site, a car dealership's vote won't go all the way ). Collateral damage comes when accessibility/crawling/HTML/dupe-content problems, or all too creative ( irrelevant ) anchor text, put you in a different box at Google than the one you were aiming for.

...

Waiting by itself is the biggest mistake for any new website, unless it gets its traffic ( and natural links ) flowing in at a steady pace from somewhere else. But even then, make sure you have something to at least suggest what anchor text people should use to link to you: a good title ( which is needed anyway ), a motto/slogan with your keywords, the URL they link to you with including the phrases... etc. You could have more than one motto. And a catchy slogan for every section. Trademarked ( ... )
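To make that concrete: the strings people are most likely to copy when linking - the title, the tagline - should already contain the phrases you want as anchors. A rough sketch, every name in it hypothetical:

<title>Example Widget Co. - Handmade Blue Widgets</title>
...
<h1>Example Widget Co.</h1>
<p class="slogan">Handmade blue widgets, shipped worldwide.</p>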

Well anyway.
For myself I've renamed this the "market protection filter".
Or simply "Get more links". ( and make them good this time )

ps. if you didn't have the trust, you'd be out, not -950.
What you lack is trust that's relevant... or relevancy that's trusted.

...

HoHum
msg:3439969
 1:42 pm on Sep 3, 2007 (gmt 0)

Would it be useful to start a new topic in which people disclose how they got out of the -950 problem, including in detail what issues they had with their sites?

It would assume they have successfully escaped its grasp for, say, a month or more, as many people - myself included - have thought they'd fixed it for a few days or weeks only to fall back again.

Honesty is needed here! Do you or have you had paid links? Have you bought links? Do you use syndicated content? Any old duplicate content on site? Etc etc.

Miamacs
msg:3440070
 4:25 pm on Sep 3, 2007 (gmt 0)

Do you or have you had paid links?

No...

btw, the correct question is:
Did you have any paid links on your site that were clear as daylight paid links - irrelevant, using a broker's code, and/or even labeled as sponsored, partner, or featured sites... in a box... at the right or the bottom... etc.?

Have you bought links?

No...

again: did you buy links from pages just because they were selling them, or did you seek out sites that were on topic, very informative, your dream-come-true referrals, and ask them if they'd link to you, only to be told it's gonna cost money?

Do you use syndicated content?

No...

Any old duplicate content on site?

No... not on my sites, but I assisted with some that had issues.
Classic www vs. non-www, mirror sites, etc.

Etc etc.

Etc.

1. New site had 45%+ of its otherwise very high quality backlinks using the same anchor. Great branding. ( see the sketch below )
2. Site had its strongest backlink(s) using anchor text that was *semantically* irrelevant ( for Google )
3. Site was never referred to with the exact phrase ( an ultra-competitive term, though the site was otherwise relevant )
4. Site was out of balance because of a link ( see 2. ) to an internal page ( much higher PR, not even the canonical URL; sometimes the anchor text was just the URL, which wasn't packed with keywords )
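To illustrate point 1 - both profiles below are invented, but the contrast is the point. Over-concentrated is nearly half the IBLs repeating one string; natural is a mix of brand, URL and phrasing:

<!-- over-concentrated: ~45% of IBLs repeat the same anchor -->
<a href="http://www.example.com/">blue widgets</a>

<!-- a more natural mix -->
<a href="http://www.example.com/">Example Widget Co.</a>
<a href="http://www.example.com/">www.example.com</a>
<a href="http://www.example.com/widgets/blue/">this blue widget guide</a>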

Not mine, but assisted with:

- Too many phrases targeting too many marketing-sensitive topics on a single page ( semantically unrelated )
- Unrecognized abbreviations ( the site was thought to target a typo of another word instead of the short form of a phrase )
- Inconsistent navigation ( links to dead pages, links to pages that redirected to another page )
- Erroneous redirects breaking navigation, pages dropping out, leaving some "orphaned" in the Google index
- Site navigation used a word that was monitored for an entirely different theme

...

- Some approached me asking why their spam is -950. I'd like to quote what Matt Cutts is said to have said when asked about "the sandbox effect" ( new sites, low trust, not appearing in the index ):

" Ok, it works then. "
( paraphrased )

Spammers lurking and finding this page through an SE: using off-topic, low-trust, low-relevance, low-everything sites/pages in bulk to ram your sites into the index does not work anymore. Link buyers who still don't get it: using off-topic, at-the-side, in-the-footer, irrelevant links bought from someone selling links through a broker for PR does not work anymore. That's what this filter is for. Dear everyone else - like me - watching your sites fall:
The filter is 'not perfect'.

I've posted about 100k of text in the -950 forums in the past half year, the latest 10 or so posts being what I could call not just a test, but proven.

...

First step is to analyze your site as if this was the first time you've ever seen it.

...aaand... *this* is where most people fail.

Once your site is in shape, and you definitely feel it *should* work already, well, look through your inbound links with a critical eye. But first, correct any HTML, accessibility, dupe content etc. issues.

...

HoHum
msg:3440143
 5:57 pm on Sep 3, 2007 (gmt 0)

Thank you for the summary of your findings on this long, long... oh so long topic, Miamacs. I've read a lot of it, really, but just thought a conclusion section needed at least starting.
You mention IBLs. We find that many of our completely natural IBLs just contain our domain name as the anchor text. We don't sell anything, but other sites may say "You can find widget info at www.yourdomain.com". Isn't this a common way for people to write naturally on blogs, forums and other websites? In which case, what can you do about it?
I've found many of the issues you have mentioned, and we still have some syndicated content on the site that is relevant to our users. Isn't the author bio at the bottom of the page equivalent to selling a link - trading content for a link? We've done well off the syndicated stuff, but I'm wondering if the time is ripe for removing it. Any thoughts on this?

tedster
msg:3440199
 7:18 pm on Sep 3, 2007 (gmt 0)

Linking to the author of an article is a long established and honorable practice - from the early days of the web. I haven't seen any evidence that Google is concerned about it at all, and especially not to the degree of imposing a -950 penalty. In academia, such a link is nearly required so people can deepen their research on the topic.

Granted, the meaning of the phrase "paid links" is more than a little fuzzy, but I think Miamacs (above) has clarified the phrase very well.

HoHum
msg:3440217
 7:46 pm on Sep 3, 2007 (gmt 0)

That's reassuring to hear Tedster.

I will keep scrutinising the site then and looking for any links that look like they are paid for.

I agree with Marcia and Miamacs that you need to take a long hard look at what you have in front of you and not just blame Google straight away. Do that, then blame Google...

So far I've fixed canonical issues, /index-to-/ issues, multiple old redirects, accidental paid links (put "nofollow" on them, as they were genuine adverts) and quite a lot of dupe issues that crept in as the site evolved and old pages got left behind.
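For anyone tackling the same list: the canonical and /index-to-/ fixes usually come down to a couple of 301 rules, and the advert fix to a rel attribute. A sketch for Apache mod_rewrite, with example.com standing in for the real domain ( adjust before using ):

RewriteEngine On
# fold the non-www host onto www with a 301, so the duplicate drops out
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
# collapse direct requests for /index.html onto /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]

And for the genuine adverts:

<a href="http://advertiser.example.net/" rel="nofollow">Advertiser Name</a>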

That worked for a week and life was good; then we fell again. Last year our traffic was stable all year (we couldn't budge it). This year it has been mostly up, but bouncing a lot. The only SEO we have done is based on Brett's classic, so perhaps old school.

To further blur the paid-links line, we have been approached by (presumably) SEOs wanting to show their clients' content on our site in return for cash (a few hundred per month). The content would be unique to us and contain a couple of links to their clients' sites. The clients would be blue chip.

So it's:
1) Relevant/useful to site users
2) Original content
3) Links are within content - hidden
4) We don't write it
5) They pay us for it

An interesting twist to the paid links issue?

Robert Charlton
msg:3440251
 9:24 pm on Sep 3, 2007 (gmt 0)

An interesting twist to the paid links issue?

FYI... this is not new, by any means. Brian White of Google posted about it on his blog back in May....

Paid Link Schemes Inside Original Content
[brianwhite.org...]

From Brian's point of view (and he's on Matt's spam team), this violates Google's guidelines, no matter how you justify it.

Further discussion of this particular approach to paid links, though... whether it will work, whether it can be detected, etc... is really off topic on this thread.

[edited by: Robert_Charlton at 9:25 pm (utc) on Sep. 3, 2007]

europeforvisitors
msg:3440253
 9:32 pm on Sep 3, 2007 (gmt 0)

To further blur the paid links line we have been approached by, presumably SEO's, to show their clients content on our site in return for cash (a few hundred 's per month). The content would be unique to us and contain a couple of links to their clients sites. The clients would be blue chip....An interesting twist to the paid links issue?

I've had some feelers like that, too, from a couple of blue-chip companies. The discussion never got any further than "X is interested in buying sponsored editorial content" (an offer that I politely declined), so I don't know if the reason for the "sponsored editorial" was to provide a container for purchased links.

I have trouble seeing much value in Web advertorial, which is what "sponsored editorial content" is. In a magazine, an advertorial or "special advertising section" is flipped through by readers as they browse from page to page, so the advertorial pages get exposure. On the Web, where users have to select a page (via clicking) to see it, the point of advertorial is less clear--unless there isn't any point except the buying of links.

HoHum
msg:3440274
 10:06 pm on Sep 3, 2007 (gmt 0)

Sorry to labour on an apparently off-topic subject, but the relevance as I see it is that I'm trying to understand the reasons for penalties on my site (which take us from 10-15k visitors per day to 3-5k, which is depressing) - one of which could be syndicated content...

syndicated content:
--it's duplicated elsewhere
--it's usually advertorial nowadays
--in exchange for a link from your site you are given content

this is OK?

sponsored editorial content:
--its unique
--it's advertorial
--in exchange for a link from your site you are given content AND money

this is not OK?

(BTW I'm not saying our site has taken up this offer of money, we didn't, but we looked out of curiosity)

I see the ethical difference but am struggling to see why one will probably be penalised and one won't?

europeforvisitors
msg:3440372
 1:18 am on Sep 4, 2007 (gmt 0)

I see the ethical difference but am struggling to see why one will probably be penalised and one won't?

Will either one be penalized? I think that's unlikely in most cases, since Google seems to err on the side of tolerance when pages or sites fall into grey areas. (If Google were as ruthless as some people like to think or pretend, marginal pages wouldn't show up in the SERPs and there wouldn't be any such thing as "reinclusion requests.")

wanderingmind
msg:3442825
 9:38 am on Sep 6, 2007 (gmt 0)

I would disagree with that, EFV. Google MAY or MAY NOT err on the side of tolerance. I would say it's 50:50.

Let's not try to say whether G is tolerant or not. After reading WW for years, I think there is enough reason to say both are true. But either way, that's a never-ending discussion...

And everyone,
Can we now get back to the -950 penalty?

Miamacs,

Could you tell me what to check if one doesn't have paid links on the site and has only done basic SEO with natural one-way links - what else could be the cause of a -950 penalty?

Is internal linking the next potential culprit?

kdobson99
msg:3444084
 2:56 pm on Sep 7, 2007 (gmt 0)

Ok, the day has come. I was hoping I could forever ignore the 950 penalty threads. I was posting on the first ever thread about it, then expanded the site out into the long tail and never really had any further problems until today. I've scanned the thread to catch up on recent 950 theories/solutions and will dive in. However, for the record I want to describe the problem so that when it is solved we have a reference.

Let's say the site is about doctors (it's not, but widgets won't work well enough). For several years various pages have bounced around the top results for "doctors". It used to be one of the internal pages, but that got 950'd in May. Since then the homepage has replaced the internal page for the "doctors" search, due to more external links pointing at it over time. Eventually, I guess, the internal page came out of 950, because today I notice it ranks somewhere around 200 (although I had stopped paying attention to it... it is now the only "doctors" page that ranks above 950).

The site has a lot of info about the doctors niche, but most pages are part of a geo-targeted long-tail directory of doctors. Thus, there are pages about doctors in each state that give some statistics, and those state pages link to city pages where various doctors are listed in a directory format (no links to doctors, just clinic name, specialty, address, phone etc).

In the past all of these pages ranked high for any search about doctors in various cities and states. The linking is fairly pyramidal, with "doctors" in most of the anchor text (ex: "Chicago Illinois Doctors"), in the title, and in the bolded header on each page. However, most of the traffic coming to the pages actually came from people searching for names... like "Dr. John Doe". I would rank #1 or #2 for those searches.

Today I woke up and the homepage is nowhere to be found for "doctors". Not 950'd... just gone. It still ranks for a search for the domain name. However, each and every internal page that has anything to do with doctors is 950'd. What is kind of interesting is that they no longer rank for anything... even searches without the word "doctor" in them. Meaning, I don't rank for a search for "John Doe" anymore.

In my past 950 experience, it seemed like the penalty was more keyword specific where a page would rank for certain keywords and be 950'd for others.

I actually think there is some connection between the homepage disappearing and the internals going 950. The best way I can describe the theory: Google ripped away the authority the site had for the term "doctors", so the authority in the niche no longer passes down to the internals. They go to 950 because Google thinks they should rank, based on their history as part of the result set, but can no longer find the basis of that authority to actually rank them, so it sends them to 950.

Just a theory. With that background, I'll dive in and start working through it, and let you know what works. Problem is that until I get it solved, revenue will be down about $1k/day. Oops.

kdobson99
msg:3445055
 3:39 pm on Sep 8, 2007 (gmt 0)

Part of my problem - in addition to my inbound link profile having lost quality (which is being fixed) - is that my site structure is too horizontal rather than vertical. While I originally described it as a pyramid, it's much more square than I thought. I am in the early stages of collecting a ton of data on about 20 sites (besides mine) that are hit with the penalty. One thing they seem to have in common is "topic overlap" on multiple pages, and poor backlink quality with too narrow anchor text diversity.

Using my "doctors" example from above, lets say a site has a set of pages on doctors for each city and state. Lets assume they also have a set of pages for "hospitals" and another for "nurses" Now there are three pages for google to pick from for each geographical region. Then, if you start interlinking the pages, it get even worse. It doesn't have to be geo terms, could be individual pages with small product variations.

Can anyone think of any harm, except for the loss of authority from the main domain, in creating subdomains for each vertical category? In other words, have a subdomain for nurses, another for hospitals and another for doctors. Would Google be able to keep them straight and separated after such a move?

If I take those verticals that don't generate much revenue and move them out to subdomains, and keep the vertical that pays the bills on the main domain, might I find some relief?

Again, I think my primary problem is too narrow anchor text from a recent influx of poor quality links to the homepage. However, others insist that this has a lot to do with internal linking structure and page topic overlap. Would a subdomain solve it?

SEOPTI
msg:3445062
 3:48 pm on Sep 8, 2007 (gmt 0)

I had 10 domains in -950 land and brought all 10 back by deoptimizing internal link text.

HoHum
msg:3445074
 4:12 pm on Sep 8, 2007 (gmt 0)

Is it possible to give an example of what you considered 'over optimized' internal link text?
Plus, how long were the domains 950'd, and how long have they been back?

[edited by: HoHum at 4:13 pm (utc) on Sep. 8, 2007]

SEOPTI
msg:3445091
 4:33 pm on Sep 8, 2007 (gmt 0)

It doesn't matter at all how long a domain has been in -950 land, each domain can jump to prior positions. This is not a factor.

The domain will come back when Google has crawled and indexed the URLs which are the reason for -950. At the moment Google is hardly indexing, so it might take longer.

Just to make it clear, I'm talking about the whole domain being in -950 land, not single directories or URLs.

wanderingmind
msg:3445143
 5:22 pm on Sep 8, 2007 (gmt 0)

SEOPTI, how overoptimized were the internal links? Can you give some kind of an example? I have not consciously overoptimized internal links, but I have a domain in -950. Hence the question.

kdobson99
msg:3445170
 6:06 pm on Sep 8, 2007 (gmt 0)

SEOPTI - I guess what I am trying to ask is: if I separate the pages out into multiple subdomains, do you think it's going to be easier to keep the whole set of pages from falling into 950 next time? I'm still in 950 and working my way out, but was thinking that dividing somewhat similar content and hierarchy into separate subdomains might keep the whole site from getting hit next time.

Right now I'm just trying to figure out how to deoptimize links to pages about states and cities. Currently I'm only using the state name and city names as anchor text. I'm not even including the accompanying keyword, like "Illinois Doctors" or "Find Chicago Illinois Doctors", so it's pretty deoptimized already (again, doctors is not my niche). I just think that having multiple pages where the inbounds use the same anchor text (like the links to my Chicago doctors pages and my Chicago hospitals pages both using the anchor text "Chicago") might be part of the cause. I'll deoptimize the rest of the links to other pages too... but these are the majority of the site and the most likely cause, if it is anchor related.

dibbern2
msg:3445205
 6:45 pm on Sep 8, 2007 (gmt 0)

One very common type of over-optimized linking is a nav block like this ( paths invented for illustration ):

<div id="nav">
<a href="/chicago-widgets/">Find Chicago Widgets</a>
<a href="/detroit-widgets/">Find Detroit Widgets</a>
<a href="/atlanta-widgets/">Find Atlanta Widgets</a>
</div>

-or-

<a href="/widget-repair/">Widget Repair</a>
<a href="/widget-sales/">Widget Sales</a>
<a href="/widget-magazines/">Widget Magazines</a>
<a href="/widget-news/">Widget News</a>

These are especially harmful when they are repeated over and over in a navbar or some other element common to all or most of your pages.

A safer linking approach might be:

Find Widgets in
<a href="/chicago-widgets/">Chicago</a>
<a href="/detroit-widgets/">Detroit</a>
<a href="/atlanta-widgets/">Atlanta</a>

Or, even better and much safer:
<a href="/widget-cities/">Find Widgets in other cities</a> - a link to a single page that consolidates all the Chicago, Detroit, Atlanta, etc. links. In this example there is NO repetitive link list in the nav scheme. Or, perhaps better yet, simply place all the consolidated links on the site's index page, and let your 'home' link take care of it.

A variation of this advice got me out of 950. I'm not saying to do this exactly, but instead trying to illustrate how many of us fall into dumb, spammy linking methods as our sites grow over time. When you think about it, these long lists that repeat a kw over and over are truly stupid looking.

TaLu
msg:3445210
 6:48 pm on Sep 8, 2007 (gmt 0)

My site is still at +950 (it was already at -30) after 15 days. I still think this change is on Google's side, not on my site or other sites. When I search for certain keywords I can see other ex -950 sites, so I think Google has finally decided to lift this filter on some sites.

HoHum
msg:3445211
 6:48 pm on Sep 8, 2007 (gmt 0)

Well, it may not be an independent variable in the actual 950 problem, but it gives an idea of the permanency of it on a site. Our site has previously been OK, then 950'd, then not, without us actually doing anything - but that not-950'd / 950'd / not-950'd cycle happened over about a week.

This year, since April/May, we have been up and down. I've done things that have seen our site out of penalty for a few weeks, but then it falls back again for a few weeks.

Again, if you have been out of the penalty for a month or more, then it gives confidence that it is really fixed and the site is not just being re-evaluated after a fairly substantial internal/external change.

Many/most of you are a lot, lot better at SEO than I am (OK, a bit of an arse lick... sorry), and any specific examples of problems solved (like internal link over-optimisation) would be really helpful in examining my own site - I hope!

ah! thanks dibbern2

[edited by: HoHum at 7:07 pm (utc) on Sep. 8, 2007]

trakkerguy
msg:3445227
 7:08 pm on Sep 8, 2007 (gmt 0)

dividing somewhat similar content and hierarchy into separate subdomains might keep the whole site from getting hit next time

kdobson99 - That's no guarantee, and probably more trouble than it's worth. A site of mine that was hit last December was -950 for the domain and all subdomains.

100% deoptimizing (No keywords on pages at all) didn't cure it.

I got rid of the unneeded subdomains, added a blog with lots of new content and strong links, and finally came out of 950 July 23.

kdobson99
msg:3445240
 7:34 pm on Sep 8, 2007 (gmt 0)

Thanks, trakkerguy. Nice to know that it's possible for both a domain and its subdomains to be hit at the same time. I'll get it solved and am willing to put some time into finding a solution.

Once I get a handle on it, I'm going to start finding these nice old sites that are 950'd and not making any money, buy 'em cheap, get them out of the penalty, then smile really big. Saw a blog where this strategy is suggested... sounds like a winner.

SEOPTI
msg:3445294
 9:45 pm on Sep 8, 2007 (gmt 0)

kdobson99, I would not think that far ahead. I needed 9 months and all of my energy to win the -950 game; this algorithmic penalty almost ruined my business.

kdobson99
msg:3445301
 10:05 pm on Sep 8, 2007 (gmt 0)

It just seems that if the cause can be determined to be more on-site than off, and if you could afford to completely strip a site down to a core set of pages (home, about us, privacy, contact, one or two product pages), then the site should immediately snap out of 950 on a full recache. Then you could build out again conservatively from there. Has anybody tried it?

trakkerguy
msg:3445414
 3:06 am on Sep 9, 2007 (gmt 0)

completely strip a site down to a core set of pages... Has anybody tried it?

Yes, that is essentially what I did. And I still didn't come out of the penalty for some time after.

The content from old pages that no longer existed still seemed to keep it in the penalty. Why do I think this? The site still ranked #1 for some unique keywords that only ever existed on those old, long-gone pages.
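( One way to flush ghost pages like that - a suggestion only, not something the post above describes doing - is to answer an explicit 410 for the removed URLs, so Google knows they're gone for good. In Apache, with a made-up path: )

# .htaccess: this removed page is gone, not just missing
Redirect gone /old-widgets-page.html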

sahm
msg:3445899
 11:16 pm on Sep 9, 2007 (gmt 0)

Thought I would post an update on my site...*fingers crossed* my site has finally escaped this penalty.

My site completely disappeared March 6th, came back April 10th, disappeared April 14th, then yo-yo'd in and out until July 8th. Since July 8th I have had almost record traffic, at least as good as last year. I noticed my traffic slowly dropping around December.

I don't know for sure what brought my site back, but I made two significant changes before my site came back.

I removed all interlinking between my web sites. Not sure this was a problem, but didn't want to chance it anymore.

I also changed the navigation on my site. I previously had the same navigation menu across my entire site (several thousand pages): a main menu, plus another menu broken down into sub-categories. I still have the same main menu on all pages, but I actually further optimized my site by adding more keywords! So I know I wasn't penalized for overoptimization. Different categories on my site now have different sub-menus, with keywords for that category. This is also why my rankings are better than they were before the penalty - I now have more keywords that correspond with the content on the page.

It seems like my penalty was some kind of "trust" issue. My site has been an authority in its niche for a number of years, and it has returned to its previous positions and even better. There are still some fluctuations in the results from day to day, but only by a place or two now.

Hope this helps someone!

kdobson99
msg:3446026
 4:35 am on Sep 10, 2007 (gmt 0)

In my niche I have noticed that 13 out of 15 sites that I can clearly identify as being hit by -950 for a geo (state) search term (state + keyword) all have one thing in common...

13 of the 15 have sitewide links to the pages for each state in the footer or bottom-right navigation column. Some have anchors like "State Keyword" while others have only the state name in the anchor. It affects them the same.

When looking at the top 30 results in the niche, only 1 has these links completely sitewide, and its stranglehold on the number 1 spot is so firm that a nuclear bomb couldn't move it.

3 other sites in the top 30 appear to have the same types of geo-targeted links, but theirs are different in two ways. First, they sit more in the center content section of the page; second, the sites cover multiple, slightly related niches. Although they appear to have the same kind of links, when the topic of a page changes from topic 1 to topic 2... so do the links. In other words, on the doctors pages the links point to "State Doctors" pages, while on the hospitals pages they point to "State Hospitals" pages (again, I'm not in the medical niche).
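For the record, the sitewide pattern I mean looks roughly like this ( a reconstruction - the states and paths are placeholders ), repeated in the footer of every page on the site:

<div id="footer">
<a href="/alabama-doctors/">Alabama Doctors</a>
<a href="/alaska-doctors/">Alaska Doctors</a>
<a href="/arizona-doctors/">Arizona Doctors</a>
<!-- ...and so on, through all 50 states -->
</div>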
