
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 13
potentialgeek

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3586434 posted 11:01 am on Feb 9, 2008 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I did all those types of navigation link changes, but it didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached with no keyword repetition in successive navigation links.

I'd like to know if a few "bad" apples (pages) can keep an entire site 950'd.

I've removed all footers, and no progress, either. I'm thinking the only thing left to remove are headers.

The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.

e.g.:

Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9

But for each of the directories, i.e.:

http://www.example.com/keyword1/

there is still repetition of the horizontal header nav link in the vertical menu:

e.g.:

Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9

Keyword1 Widgets
Red
White
Blue
...

I had thought or at least hoped having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"

Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!

That's just bad site structuring.

I HATE THIS 950 POS!

I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussion about the 950, there was talk about phrases.

"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]

"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster

"You can get your page included in the spam table by having too many occurances of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster

So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.

Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...

Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?

Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.

That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."

I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
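
Just to make that concrete, here is a rough sketch (in Python) of the kind of counting I have in mind. The phrase list and the threshold are pure guesses on my part for illustration; nobody outside Google knows the real list or the real cutoff:

# Purely illustrative: tally "related" phrases in a page's text and flag the
# page if the total crosses an assumed threshold. The phrase list and the
# threshold are made-up placeholders, not anything Google has published.
import re

RELATED_PHRASES = ["blue widgets", "cheap blue widgets", "buy blue widgets"]  # assumed
THRESHOLD = 25  # assumed, per the "under 25" guess above

def count_phrases(text, phrases=RELATED_PHRASES):
    text = text.lower()
    return {p: len(re.findall(r"\b" + re.escape(p) + r"\b", text)) for p in phrases}

def looks_overoptimized(text, threshold=THRESHOLD):
    counts = count_phrases(text)
    return sum(counts.values()) >= threshold, counts

flagged, counts = looks_overoptimized(open("page.html", encoding="utf-8").read())
print(counts, "flagged:", flagged)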

Has anyone here not got the 950 lifted by anchor text changes, but only by phrase changes?

p/g

"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g

[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]

 

CainIV

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3586434 posted 2:28 am on Mar 25, 2008 (gmt 0)

Write it unique to that page of your website.

Make it visitor friendly and not search engine biased. Disregard keywords and write for the 'SE snippet'.

There is no need to leave the meta description out. As steveb mentioned, it will likely do harm to remove it unless the tag itself is riddled with keywords and is a copy of other pages of your website.

OnlyToday

5+ Year Member



 
Msg#: 3586434 posted 4:35 am on Mar 25, 2008 (gmt 0)

Write it unique to that page of your website. Make it visitor friendly and not search engine biased. Disregard keywords and write for the 'SE snippet'.

Sure, if you only have a few thousand pages. But what if you have millions of pages? That's going to take, like, a long time, with no guarantee of success or even a penny of return, BTW. However, you are right, that is the proper way of doing things; I shouldn't be so lazy...

potentialgeek

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3586434 posted 11:10 am on Mar 26, 2008 (gmt 0)

I removed the meta tags that weren't "auto-generated" (copied from titles). My site has many pages now without Descriptions: about 2/3. I don't have time to write one for each; there are too many. Anyway, I concluded they're never seen (perhaps only 1%), and don't boost rankings, so it's a waste of time to write 500 unique new Descriptions.

The directories (which are landing pages) do get Descriptions, because they are seen. At least by those who aren't satisfied with just a page title.

If your keywords and/or descriptions repeat your title exactly, obviously it's going to look to Google as if you're trying to optimize those words and phrases. Repetition is a very common attempt at optimization--perhaps the oldest and most common. How many repeats are "overoptimization"?

Google seems to like using the Description for my home page sometimes even though there's plenty of text to pull for the snippet (75KB). It may even lock onto some snippets based on site history.

If you're going to remove dupe content for keywords/descriptions, don't forget how other search engines consider them. I don't know if MSN or Yahoo or others give them much weight.

For the last five years, Yahoo has only given me leads for one search phrase (top 10), and I think that's based on the Description for its own directory. So I don't worry about it.

MSN, I'm not sure about, either. Can't really see a huge basis for any SE thinking descriptions or keywords are very important, or much more important than page age or inbound links.

The 950 is an overoptimization penalty. How do meta tags like the Description look "overoptimized"? I sometimes inadvertently try to beat the 950 by thinking of what looks spammy, but overoptimization isn't necessarily the same as spam.

Overoptimization must mean either: too many facets of SEO on the same page/site, and/or too much SEO of one facet.

Examples:

Say, for the sake of argument, there are 50 ways to optimize your page. You've optimized all 50. That would be overkill, could be "overoptimization."

Say you target just one thing, a phrase. Your page has 50 different subtle variations of the phrase. That could be overoptimization, too.

I don't think the bar is that high, though. It's probably much less than 50.

p/g

(Two sites de-950d; one to go.)

bava_seo

5+ Year Member



 
Msg#: 3586434 posted 11:39 am on Mar 26, 2008 (gmt 0)

Many of us, myself included, don't have a definitive understanding of these penalties. From reading hundreds of posts, I've gathered the points below for getting our rankings back to the first page:
1) de-optimising the site
2) not going for paid links
3) taking off the links from sites where you purchased links
4) not over-optimising keywords on the site (decrease keyword density)
5) not using anchor text links to the same domain (example.com) for various keywords on a single web page
6) checking for duplicate content issues
7) back links from bad neighborhoods

Can we regain our rankings for particular keywords by clearing up these issues? Nothing has worked for me.

SEOPTI

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3586434 posted 11:53 am on Mar 26, 2008 (gmt 0)

-950 has nothing to do with purchased links and with duplicates/supplementals. This is a different issue.

[edited by: SEOPTI at 12:43 pm (utc) on Mar. 26, 2008]

OnlyToday

5+ Year Member



 
Msg#: 3586434 posted 3:02 pm on Mar 26, 2008 (gmt 0)

I am absolutely certain that I removed all my description and keyword tags on one day, March 16.

Yet now, even though the cache header claims the page was retrieved on the 17th or 18th or later, the page that is actually in the cache is the old version with meta keywords and descriptions. And the -950 remains, obviously, or this post wouldn't be a complaint.

Something very screwy going on here...

arubicus

10+ Year Member



 
Msg#: 3586434 posted 5:27 pm on Mar 26, 2008 (gmt 0)

1) de-optimising the site
2) not going for paid links
3) taking off the links from sites where you purchased links
4) not over-optimising keywords on the site (decrease keyword density)
5) not using anchor text links to the same domain (example.com) for various keywords on a single web page
6) checking for duplicate content issues
7) back links from bad neighborhoods

2) not going for paid links
3) taking off the links from sites where you purchased links

I don't think this has to do with paid links or purchasing links. We have never sold or purchased a single link, EVER. Unless something looks like we did... if that is the case, then anyone can bring down a site with this method.

6) checking for duplicate content issues

Although we have some redistributed articles, I can assure you we carry the penalty for our unique articles as well. Now, redistributed articles may not carry weight or pass PR through the site, which could disrupt the flow of the internal structure. Long shot here, but it could be something to consider.

7) back links from bad neighborhoods

Ummm, hope this isn't the case. Links FROM bad neighborhoods spell trouble to me, as ANYONE could take out a newly emerging competitor with this method. Links TO bad neighborhoods... I tested a nofollow on external links a while back, and it's still in place; that should eliminate this possibility, as no improvement came from it.

4) not over-optimising keywords on the site

I have been watching this for some time. I can find sites with a heck of a lot more repetition and density than I could possibly think of doing: sites returned for keywords we formerly ranked for, yet we are now -950. I even find results for pages with few links coming in from external sites. But INTERNAL PR (not necessarily tons of internal links, just enough to pass good PR) could allow such heavy repetition and density.

1) de-optimising the site

Having too many keywords in too many tags/HTML elements at once just might be a possibility... then again, I can look at the top 200 sites returned and find just as heavily optimized sites that are not -950'd. Although some of them may be knocked down to a lower rank because of it, which is something to consider. PR could allow such optimization; then again, I can find really low PR pages that use about every HTML element for optimization and they are not -950'd.

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3586434 posted 10:57 pm on Mar 26, 2008 (gmt 0)

"-950 has nothing to do with... duplicates/supplementals."

I agree, but 950'd pages will often (always?) get thrown in with supplementals when you do a site search for supplementals. Perhaps a site search for supplementals could now be called a site search for supplementals and penalized pages.

outland88

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3586434 posted 1:33 am on Mar 27, 2008 (gmt 0)

Many people want to attach exotic theories to what the 950 penalty is. To me it really just comes back to matching theory. Pages that routinely escape the filter aren't saying enough about the search term to be considered overly optimized. On the other hand, many pages escape because 25-75% of the page has nothing related to the keyword searched for. I take a different tack, because if your domain deals heavily with the subject, you've likely over-optimized regardless of intent. SEO is not bad in many cases but a natural development of an interest in the subject. In some areas, writing about a subject rarely produces repetition, while in others you can't help it. In other words, I'm not going to change what sells to the public and encourage further damage in other engines.

So in some cases I just add things like "We welcome your visit and hope you come again in the future." Or on a few pages I put something to this effect: "Please excuse our text, but this is the ridiculous nature of Google's so-called Universal Search." Bingo, some pages pop right out of the 950 filter. This achieves two things. One, it dilutes the level of optimization (which is basically Google's theory behind a bad, bad page), and two, the theming is reduced (you're no longer talking about terms related to the subject).

In simplified terms, my interpretation was that Universal Search lowered the tolerated optimization level, except for what Google grandfathered in. This allowed fresh content to be seen that ordinarily used the search term less often. Google knew the fresh content couldn't be seen, though, unless it targeted or penalized some sites that had ranked well for some reason. Google employees may use the term OOP, but in reality this is peculiar only to Google. I tend to think the term has a looser, more universal interpretation that is about the same across all search engines. With Google's interpretation, you could look at ten penalized pages they claim are over-optimized and go "huh, what, where?" In my way of thinking, an over-optimized page would be fairly obvious in any search engine.

This is just a very general appraisal. Google may now count the changes made to 950-penalized pages among many other things, so the penalty may become inescapable for periods of time.

bava_seo

5+ Year Member



 
Msg#: 3586434 posted 7:32 am on Mar 27, 2008 (gmt 0)

Going through almost all the posts on the web, I came up with the above-mentioned points regarding penalties. If those are not the points to take into consideration, then what might be the points, reasons, or issues that get a -30, -350, or -950 penalty for particular keywords?
If you have any other experiences, your opinions are most welcome; drop a few lines here.

arubicus

10+ Year Member



 
Msg#: 3586434 posted 8:36 am on Mar 27, 2008 (gmt 0)

going through almost all the posts on the web

That is a lot of posts. Just playing.

what might be the points, reasons, or issues that get a -30, -350, or -950 penalty for particular keywords?

That is what we are trying to figure out... It may be a certain aspect or a certain combination... It may be multiple filters that, when combined, produce a level of downgrade. We just don't know, and we are working on it.

On another note of over optimization and keyword density/repetition...

I just ran across a site. I wish I could show you all. This site is #8 in Google with a PR of 3. It had the same phrase virtually everywhere on the page. I counted occurrences of each word, the 2-word combinations, and the complete 3-word phrase.

Visible Text
Word 1: 92
Word 2: 96 including plurals
Word 3: 69

3 Word Combination: 39
2 Word Combination Word 1 and 2: 89
2 Word Combination Word 2 and 3: 44

That does not include url, head title, description, keywords, alt, and title attributes.
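
For what it's worth, here's a rough sketch of how that kind of tally could be run automatically. This is just an illustration in Python; BeautifulSoup is assumed, and the filename and example words are placeholders:

# Rough sketch: pull the visible text from a saved page and tally single
# words plus 2- and 3-word combinations, like the counts above.
import re
from collections import Counter
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = open("example.html", encoding="utf-8").read()   # placeholder filename
soup = BeautifulSoup(html, "html.parser")
for tag in soup(["script", "style"]):
    tag.decompose()                                    # keep only visible text
words = re.findall(r"[a-z0-9']+", soup.get_text().lower())

unigrams = Counter(words)
bigrams = Counter(zip(words, words[1:]))
trigrams = Counter(zip(words, words[1:], words[2:]))

print(unigrams["widget"])                        # single word
print(bigrams[("red", "widget")])                # 2-word combination
print(trigrams[("cheap", "red", "widget")])      # 3-word combination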

It also has keywords in the onmouseover events for the bulk of the links, which duplicates them again. The title attribute is the same way in the <a> tags.

The keywords can be found in the anchor text at least 30 times each, and the full combination about 20-plus times. There are well over 100 links on the page.

The main body has 1,500 words. 480 of them are an actual writeup that contains 10 links. The remaining 1,000 or so words are in navigation links.

This site does NOT use <h?> tags at all. But all other elements are filled.

Most of the links are in <b> bold, and in much of the text the keywords are bold.

One occurrence of <i> combined with <b> on keywords.

Can't get any link info, as Yahoo has no records (the site ranks in Y!, though for a different page with the same words; the page Y! shows has no link info either). Google reports a crappy 20 links, from pages on and off topic.

This site still remains day after day...

Now, would you consider that a bit optimized? When I looked at it I laughed. My wife laughed and said she couldn't even read it because it was so bad. (She doesn't know anything about SEO stuff.) You just read the same phrase(s) over and over and over. Anyway, if that is considered quality in a 1,000,000 result set... and my site is well designed, light, readable, as natural as I can make it... I just give up, because I don't know what to think. I won't report it, just because I wanna see what happens.

potentialgeek

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3586434 posted 5:57 am on Mar 29, 2008 (gmt 0)

When my main site was first hit with a 950 penalty last year, it had three effects:

1. Some keywords were knocked out;
2. Others were knocked down;
3. Others stood strong.

In the last few days, I'm seeing the second group making a comeback. The keywords that were demoted to the top ten are back at #1. Those that were #1 are still #1.

Can't say for sure if the progress is from partial beating of the 950, or changes in the general Google algo. The keywords that were knocked out don't seem to have moved out of 950 hell yet; hopefully, however, they will start moving like the others, too.

I did something that should help Google identify the theme of the site, and perhaps "anchor" the keywords and phrases. I suspect that sometimes you might have to do "more" SEO instead of less to beat penalties.

Also, the keywords/pages that survived the initial 950 attack had solid internal linking, where the anchor text in links on many pages was also in their titles.

For example

<title>The quick brown fox jumps over the lazy dog</title>

yada, yada, yada,

Related Pages

Lazy Dog

-

And then there's a site section:

dog/

So there are many pages on the site with "Lazy Dog" links that link back to dog/

The common denominator is that links to the dog directory only come from pages that contain "lazy dog" in the title. So the basic claim that the page is in fact related content is substantiated, as opposed to the abuse of anchor text links to pages that aren't really related, but are only linked to try to pick up SEO points from anchor text/internal links.

I think the basis of the 950 penalty against sites for improper use of anchor text is that the internal links are unjustified.

If Google is trying to identify which sites are using anchor text internally to juice their rankings, and which aren't, the title is a very solid and very natural tag to check.
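
If you want to audit your own site for this, here's a rough sketch of the check I'm describing. It's only an illustration of my theory, not how Google actually works; the folder path and the /dog/ section are placeholders:

# Sketch: for every internal link pointing at a target section, check whether
# the linking page's <title> contains the link's anchor text.
import os
from bs4 import BeautifulSoup  # pip install beautifulsoup4

SITE_ROOT = "site"     # local copy of the site (placeholder path)
TARGET = "/dog/"       # the section being linked to (placeholder)

for dirpath, _, files in os.walk(SITE_ROOT):
    for name in files:
        if not name.endswith((".html", ".htm")):
            continue
        path = os.path.join(dirpath, name)
        soup = BeautifulSoup(open(path, encoding="utf-8").read(), "html.parser")
        title = (soup.title.string or "").lower() if soup.title else ""
        for a in soup.find_all("a", href=True):
            if TARGET in a["href"]:
                anchor = a.get_text(strip=True).lower()
                if anchor and anchor not in title:
                    print(f"{path}: links to {TARGET} with anchor '{anchor}', "
                          f"but the title doesn't contain it")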

When my site was first hit, I was at a loss to know not only why it was hit, but also why some keywords and key phrases weren't hit. The above theory is the best explanation I have come up with so far for why certain keywords survived and thrived.

I don't believe many inbound links were the saving grace of the keywords that survived. But I also noticed that the sites linking to mine, although they didn't use the anchor text of the top surviving phrases in their links, did actually have those important phrases in the title tags of the pages carrying the outbound links.

So I think the Google algo checks the title tags behind internal and external links, not just the anchor text, which is the most common SEO advice: 'Get inbound links with the anchor text of the phrase you're targeting.'

You really don't anchor a concept unless it's in the page title. Other "anchoring" is superficial. The title tag is the anchor--not some link!

Really, this is just breaking down the old Google concepts of themes. You know Google doesn't like inbound links from unrelated sites. Okay, so what happens if Google takes that reasonable inter-website linking concept, and applies it within your site?

I'm testing titles now to discover to what extent you anchor a concept with title tags, but establish its ranking by frequency of specific key phrases in titles, and the justified frequency of internal links to the main site directory on the topic.

The 950 at its heart is probably imposed against webmasters who strayed from the fundamentals of natural site and page design.

p/g

Robert Charlton

WebmasterWorld Administrator robert_charlton us a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



 
Msg#: 3586434 posted 7:52 am on Mar 29, 2008 (gmt 0)

I think the basis of the 950 penalty against sites for improper use of anchor text is that the internal links are unjustified.

Many news sites (not to mention Wikipedia) are using very widespread "horizontal" linking, almost like footnotes. These extremely frequent links are definitely not from "on-theme" pages, although the links themselves are quite relevant. And over the years, many sites have routinely linked to glossary entries and pages related to terms in articles. I don't see that this kind of linking has been hurting the sites that are using it.

I'm sure there's a point of excess in this kind of linking... and I don't know what it is... but I think that limiting your link anchors to terms partially contained in the titles is too restrictive.

OnlyToday

5+ Year Member



 
Msg#: 3586434 posted 5:05 pm on Mar 30, 2008 (gmt 0)

My site never was optimized much, and two weeks ago I removed whatever residual items might be considered "optimization," which were only attempts in the first place to help search engines deliver the right page. Yet the penalty persists. Many pages in the cache are more than two weeks old, and that remains my only hope.

Through all this I am developing a strong and deep-seated animosity toward Google and have an increasing desire to find and promote an alternative.

[edited by: OnlyToday at 5:08 pm (utc) on Mar. 30, 2008]

[edited by: Robert_Charlton at 6:19 pm (utc) on Mar. 30, 2008]

potentialgeek

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3586434 posted 1:45 pm on Apr 19, 2008 (gmt 0)

Well, I'm not seeing any deliverance yet for the last site of mine that is still bound by the 950 Penalty, even after removing most Description and Keyword tags, so I've got a new idea.

Based on the Cutts comment that it's a penalty for overoptimization, and one of my old habits was using alt tags on images to help ranking (a little), I'm going to strip my entire website of every single alt tag!

Now I've been putting this off because the site is over 1,000 pages. Looking for and removing each tag is tedious and boring. Unless it's automated.

I was hoping there was a way to batch process an entire site to remove every alt tag, and lo and behold, Dreamweaver to the rescue!

How to Remove All Alt Tags from an Entire Website Using Dreamweaver Automation

Find and Replace

Find in: Entire Current Local Site

Search: Specific Tag: img

+ - With Attribute: With alt = [any value]

Action: Remove Attribute alt

Warning: Backup Site Before Automated Changes
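
If you don't use Dreamweaver, the same batch job can be approximated with a small script. This is just a sketch under my own assumptions (Python with BeautifulSoup, and a placeholder folder path); note that it re-serializes the markup, so, again, back up first:

# Sketch: strip the alt attribute from every <img> across a local copy of the
# site. SITE_ROOT is a placeholder; back up before running anything like this.
import os
from bs4 import BeautifulSoup  # pip install beautifulsoup4

SITE_ROOT = "site"  # local site folder (placeholder)

for dirpath, _, files in os.walk(SITE_ROOT):
    for name in files:
        if not name.endswith((".html", ".htm")):
            continue
        path = os.path.join(dirpath, name)
        soup = BeautifulSoup(open(path, encoding="utf-8").read(), "html.parser")
        changed = False
        for img in soup.find_all("img", alt=True):
            del img["alt"]          # remove the attribute, keep the tag
            changed = True
        if changed:
            with open(path, "w", encoding="utf-8") as f:
                f.write(str(soup))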

I had long ago set my robots.txt to block my images from being crawled, so the continued use of alt tags is not justified before the Unholy Google Algorithm.

p/g

OnlyToday

5+ Year Member



 
Msg#: 3586434 posted 6:10 pm on Apr 24, 2008 (gmt 0)

>> Based on the Cutts comment that it's a penalty for overoptimization...

There must be more to it than that--if anything I've been underoptimized for the last 30+ days and the penalty remains even across the board. I'm getting 10% of previous traffic with all the pages ranking remarkably the same against each other.

I can't believe that alt tags count as optimization unless they are stuffed with keywords--mine are all just one or two word descriptions of the image.

SEOPTI

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3586434 posted 9:04 pm on Apr 24, 2008 (gmt 0)

I think his comment was made to irritate webmasters. Sometimes it's better to just say nothing.

Example why it irritates people: if you change a lot of titles within your site you will also land in -950 nirvana, so the comment is pure nonsense.

[edited by: SEOPTI at 9:07 pm (utc) on April 24, 2008]

arubicus

10+ Year Member



 
Msg#: 3586434 posted 9:13 pm on Apr 24, 2008 (gmt 0)

Over-optimization, I believe, could be a part of it, but not the complete deal. I tell ya, I see far worse optimization than anything I have ever done, that is for sure. There is something else... or a combination of things.

brinked

5+ Year Member



 
Msg#: 3586434 posted 9:38 pm on Apr 24, 2008 (gmt 0)

Here is my take on "The End of Serps Penalty" or what you guys call the -950 penalty.

First of all, calling it the -950 penalty really does not make much sense, because Google doesn't always show 1,000 results for a given search phrase. If you are suffering from the EOS penalty, your site will be pushed to the end of the SERPs. The reason you aren't always at the very last spot is that there are most likely OTHER sites suffering from the very same penalty, and not everyone can hold the last position.

Now here is the most important thing. The EOS penalty is not triggered by one specific thing. There are probably MANY different things that can trigger this filter. Everyone seems to be looking for that one common denominator, and there isn't one. What Joe was penalized for isn't necessarily the same thing Tim is penalized for.

The EOS/-950 penalty simply means google saw something about your site that it does not like.

After much trial and error in experiencing this problem with MANY of my sites, I learned that it's of no use to look for an answer on forums, because Google is much more complex than that. What I did was read Google's guidelines and found that I was doing things they caution you not to do. These things may seem minor, but Google will apply a penalty for them, and fortunately, by correcting them I have successfully brought all 3 of my sites out of the black hole.

There is no quick fix remedy for the EOS penalty. Carefully review your site, read the google webmaster guidelines and make the necessary adjustments.

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3586434 posted 10:54 pm on Apr 24, 2008 (gmt 0)

What you say is all true. The name -950 came from the early days, before Google started cutting off results earlier on so many searches. Way back then, people were even noticing "middle of the results" penalties. Many of your points are also echoed in the -950 summary thread [webmasterworld.com].

So please don't be a tease - what kinds of violations did you fix?

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3586434 posted 1:28 am on Apr 25, 2008 (gmt 0)

"The EOS/-950 penalty simply means google saw something about your site that it does not like."

That's simply not true. Normally the penalty is applied to a PAGE, not a SITE. Google's algorithm at one time found something it decided to give a 950 penalty to, but that is not normally a sitewide judgement (although it is sometimes).

Additionally, you can change the entire content of a page, and every link to it, and the URL can still keep the 950 penalty. So "saw something at one time" has to be part of the sentence, because NOT seeing the thing it didn't like before does not necessarily lead to un-penalization.

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3586434 posted 2:23 am on Apr 25, 2008 (gmt 0)

Also, there are cases where a page falls to the End of Results for one search phrase but it still ranks normally for a different search phrase.

The patent about using phrase-based indexing to detect spam (over-optimization if you want) talks about maintaining a separate "spam table". Resetting the threshold for spam, as well as "cleaning out" the spam table, are not clearly explained from what I can see, but there's no indication that this would be a continual process. More likely it's a once-in-a-while thing.

So I'd suggest making your changes as best you can, all at one time, and then waiting at least 3-4 weeks before doing anything further.

andylc0714

5+ Year Member



 
Msg#: 3586434 posted 3:50 am on Apr 25, 2008 (gmt 0)

My site has been up about one year, but there is a page that is still out of the first ten pages of the SERPs. It ranked around 40 for a short period and then fell out. I increase links regularly and control the keyword density carefully, but this page never came back, even to the first ten pages.

potentialgeek

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3586434 posted 12:30 pm on Apr 25, 2008 (gmt 0)

>> Based on the Cutts comment that it's a penalty for overoptimization...

> There must be more to it than that--if anything I've been underoptimized for the last 30+ days and the penalty remains even across the board.

If you overdo it, of course you can kill your site. Tedster said that a while back. That's why it's been suggested gradual de-optimization is a good idea. I don't know who came up with the 30-day reference point for the -950 recovery. Some webmasters claim they've seen recovery within a day or two; others say it takes at least one Google cache update.

> I'm getting 10% of previous traffic with all the pages ranking remarkably the same against each other.

Then there's another issue.

> I can't believe that alt tags count as optimization unless they are stuffed with keywords--mine are all just one or two word descriptions of the image.

If you go back and read earlier in this thread, Tedster posted the list Google has in its patent. It's far reaching. It basically says anything and everything on the page is subject to Google's review for filtering.

I removed my alt tags (over 9,000) on one large -950d site because they looked overoptimized. In some cases I had a few words; in many cases, though, the alt tag was the same as the page title. (The titles fill the available space.)

If the 950 is a cumulative penalty you need to look at how many parts of your site are being optimized and how many should be deoptimized.

For this large site, the last of three that needs to be freed from the 950, my approach is gradual and successive. The following steps were already taken based on comments in this forum but without much if any effect:

1) 99% of anchor-text keyword duplication removed.

2) all footers removed.

3) all keywords removed in home page link (now it's "Home").

4) all links from other 950d sites (I own) removed.

5) all spammy alt tags removed*

*recent

There are a bunch of other steps lined up if the latest round doesn't pan out, including adding more content on pages which are thin.

I already started building another similar site to get the 950 lifted off it. It's ranking really well now and yesterday had the best traffic.

If the 950 doesn't lift off the main site I'm trying to fix, I'll move its content onto the site that got its 950 lifted.

The Dreamweaver trick worked like a charm. Over 9,000 tags removed in about a minute. You can't do a simple Find and Replace on the site because each alt tag was different. If I had searched every page for every alt tag across 2,000 pages and manually removed them, it would have taken days.

If you don't have keyword-stuffed alt tags, I wouldn't put their removal at the top of your -950 removal plan. Prioritize according to what looks most spammy or against Google Webmaster Guidelines.

I don't use alt tags any more, including on the revamped site that's doing well, for a few reasons. First, I have blocked Google from crawling my images (so any alt tags that would help images get found in Image Search are irrelevant). Second, I don't think they have much, if any, value for page ranking.

Absolute alt tag removal from a website sounds extreme, so let's review a few basics from w3 about alt tags.

w3 says:

What are alt attributes useful for?

The alt attribute is defined in a set of tags (namely, img, area and optionally for input and applet) to allow you to provide a text equivalent for the object.

A text equivalent brings the following benefits to your web site and its visitors in the following common situations:

nowadays, Web browsers are available in a very wide variety of platforms with very different capacities; some cannot display images at all or only a restricted set of type of images; some can be configured to not load images.

[I think this is out of date. Apparently it was written in 1994. Well, obviously today all browsers show images. The only people I knew who ever turned images off were those with dial-up, back in the day, who wanted pages to load fast (text only). High speed internet is so common now, it's not an issue, and even those who don't have high speed generally don't turn images off (most don't even know they can!)]

If your code has the alt attribute set in its images, most of these browsers will display the description you gave instead of the images.

Some of your visitors cannot see images, be they blind, color-blind, or low-sighted; the alt attribute is of great help for those people, who can rely on it to have a good idea of what's on your page.

Search engine bots belong to the two above categories: if you want your website to be indexed as well as it deserves, use the alt attribute to make sure that they won't miss important sections of your pages.

[No offence to blind people, but if they can't see the image, I don't know that they can read very small alt tag text.]

What should I put in my alt attribute?

The generic rule for the content of the alt attribute is: use text that fulfills the same function as the image.

[That's exactly what I was doing.]

p/g

[edited by: potentialgeek at 1:05 pm (utc) on April 25, 2008]

Tomzen

5+ Year Member



 
Msg#: 3586434 posted 12:51 pm on Apr 25, 2008 (gmt 0)

First time reader and poster. A few comments:

P.S. The Dreamweaver trick worked like a charm. Over 9,000 tags removed in about a minute. You can't do a simple Find and Replace on the site because each alt tag was different. If I had searched every page for every alt tag across 2,000 pages and manually removed them, it would have taken days.

You can remove sitewide content in most editors that take a site-wide, project-based approach, though obviously some do it more intelligently than others.

I already started building another similar site to get the 950 lifted off it. It's ranking really well now and yesterday had the best traffic.

If the 950 doesn't lift off the main site I'm trying to fix, I'll move its content onto the site that got its 950 lifted.


Are you sure the new site isn't just currently enjoying the "honeymoon period" I read about elsewhere on this forum? I have experienced the same with a number of sites that were enjoying good traffic, only to suddenly drop to next to nothing after a period of time. Even moving the same content onto a new domain seems to exhibit the same behaviour.

About the -950/EOS penalty: so far the most plausible observation to me is that multiple factors are the cause, with several of them dependent on other factors, which makes a universal fix impossible.

brinked

5+ Year Member



 
Msg#: 3586434 posted 2:50 pm on Apr 25, 2008 (gmt 0)

"The EOS/-950 penalty simply means google saw something about your site that it does not like."

That's simply not true. Normally the penalty is applied to a PAGE, not a SITE. Google's algorithm at one time found something it decided to give a 950 penalty to, but that is not normally a sitewide judgement (although it is sometimes).

SteveB, 2 of the three times I experienced the EOS penalty it was my ENTIRE website.

On to Tedster, who asked which problems I fixed. I had 2 sites that were EOS SITEWIDE. For the first one, after making MANY small adjustments to no avail, I read up on Google's guidelines and noticed the advice about keeping the total links on a page under 100. My navigation exceeded 100 links, so what I did was break my navigation into 4 main categories; when you click a category, its sub-items are expanded. A few short days after doing this, the penalty was lifted. I remember it was a Sunday; I noticed a huge spike in my StatCounter stats and saw I was ranking for a crapload of terms. I did a fist-pump and shouted some obscene remarks, and my girlfriend thought I was nuts... one of my best memories at the computer.

The 2nd time, which was also SITEWIDE, took much longer; the site had been under the penalty for over a year. Then I thought of something... every page has very little content, and it runs the Drupal script, which puts a login box and a create account/forgot password box on every page... so I thought to myself, maybe Google thinks this is a "need to be logged in to view content" kind of page. So I removed it... about 2 weeks later... I regained ALL my rankings.

Call it a coincidence or whatever you will, but I am truly convinced that there is a crapload of triggers that can set off Google's penalties.

[edited by: tedster at 4:01 pm (utc) on April 25, 2008]

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3586434 posted 3:59 pm on Apr 25, 2008 (gmt 0)

when you click a category its sub items will be expanded

Aren't those links still in the source code for the page, then?

brinked

5+ Year Member



 
Msg#: 3586434 posted 4:07 pm on Apr 25, 2008 (gmt 0)

No, the way you're thinking of is another method, which simply hides the child links, usually with JavaScript. The way I am talking about shows only the 4 main links. So, for example, if you visit www.example.com you will see 4 links in the left column, such as:

red widgets
blue widgets
green widgets
yellow widgets

when you click red widgets you will be taken to www.example.com/red-widgets.html and the navigation will look something like:

red widgets
-red sub 1
-red sub 2
-red sub 3
-red sub 4
-red sub 5
blue widgets
green widgets
yellow widgets
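
In other words, the sub-links only get written into the page when you are in that section. Something like this sketch, just to illustrate the idea (the category names, URLs, and the code itself are placeholders, not what my site actually runs):

# Sketch: render the left-column navigation so only the current category's
# sub-items appear, keeping the total link count on any one page low.
NAV = {
    "red widgets":    ["red sub 1", "red sub 2", "red sub 3"],
    "blue widgets":   ["blue sub 1", "blue sub 2"],
    "green widgets":  ["green sub 1"],
    "yellow widgets": ["yellow sub 1"],
}

def slug(text):
    return text.replace(" ", "-")

def render_nav(current=None):
    lines = []
    for category, subs in NAV.items():
        lines.append(f'<a href="/{slug(category)}.html">{category}</a>')
        if category == current:                  # expand only the current branch
            for sub in subs:
                lines.append(f'  <a href="/{slug(sub)}.html">- {sub}</a>')
    return "\n".join(lines)

print(render_nav("red widgets"))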

Also, getting back to my previous post: it would probably be more common for a sitewide EOS penalty to be in place than for the penalty to hit just one page, because the scripts most sites use make every page pretty much uniform. If 1 page is hit with the penalty, it would probably be much easier to diagnose, simply because it would most likely mean that Google saw a problem with the content on that page rather than with the global aspects of the site.

[edited by: tedster at 2:42 am (utc) on April 26, 2008]

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3586434 posted 1:55 am on Apr 26, 2008 (gmt 0)

"SteveB, 2 of the three times I experienced the EOS penalty it was my ENTIRE website."

That just reflects more on your website creation than on the penalty itself.

In any case, sometimes the penalty is sitewide and sometimes it is only page-specific, so it is not correct to state it has to be a sitewide phenomenon.

brinked

5+ Year Member



 
Msg#: 3586434 posted 7:57 am on Apr 26, 2008 (gmt 0)

I never stated it can only be sitewide...in fact that is the complete opposite of what I am trying to get across. What I am simply trying to get at is that it can be applied to an individual page OR the entire site OR a certain number of pages etc etc etc.

SEOPTI

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3586434 posted 2:52 pm on Apr 26, 2008 (gmt 0)

For me this nonsense has always been applied sitewide.
