Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 212-message thread spans 8 pages.
Google's 950 Penalty - Part 13
potentialgeek




msg:3570326
 11:01 am on Feb 9, 2008 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I did all those types of navigation link changes, but it didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached with no keyword repetition in successive navigation links.

I'd like to know if a few "bad" apples (pages) keep an entire site 950'd.

I've removed all footers, and no progress, either. I'm thinking the only thing left to remove are headers.

The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.

e.g:

Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9

But for each of the directories, i.e:

http://www.example.com/keyword1/

there is still repetition of the horizontal header nav link in the vertical menu:

e.g:

Keyword1 ¦ Keyword2 ¦ Keyword3 ¦ . . . ¦ Keyword9

Keyword1 Widgets
Red
White
Blue
...
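For what it's worth, this kind of duplicated link target is easy to audit with a short script. This is just a sketch with placeholder URLs, not anything from Google:

```python
from collections import Counter

def repeated_targets(links):
    """links: list of (href, anchor_text) pairs found on one page.
    Return link targets that appear more than once."""
    counts = Counter(href for href, _ in links)
    return {href: n for href, n in counts.items() if n > 1}

# Placeholder nav: the header row and the vertical menu both link to /keyword1/
nav = [("/keyword1/", "Keyword1"), ("/keyword2/", "Keyword2"),
       ("/keyword1/", "Keyword1 Widgets"), ("/keyword1/red/", "Red")]
# repeated_targets(nav) reports /keyword1/ appearing twice
```

Run against a crawl of your own pages, it would at least tell you where the header and vertical menu double up.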

I had thought or at least hoped having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"

Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!

That's just bad site structuring.

I HATE THIS 950 POS!

I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussion about the 950, there was talk about phrases.

"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]

"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster

"You can get your page included in the spam table by having too many occurances of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster

So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.

Re: too many occurances of semantically related phrases. This certainly suggests some modifications for both body copy...

Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?

Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.

That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurances of semantically related phrases."

I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
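If the phrase-based theory is right, a rough self-check would be counting how often phrases from a related-phrase list occur on a page. Both the phrase list and the threshold below are pure guesswork (the "under 25" figure is my speculation above, not a known Google value):

```python
import re

def related_phrase_hits(page_text, related_phrases):
    """Count how often each (hypothetical) related phrase occurs in the page text."""
    text = page_text.lower()
    return {p: len(re.findall(re.escape(p.lower()), text)) for p in related_phrases}

def over_threshold(page_text, related_phrases, threshold=25):
    """Flag a page whose total related-phrase count exceeds a guessed threshold."""
    return sum(related_phrase_hits(page_text, related_phrases).values()) > threshold
```

At best this tells you which pages lean hardest on the suggested-searches phrases you targeted.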

Has anyone here not got the 950 lifted by anchor text changes, but only by phrase changes?

p/g

"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g

[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]

 

JeffOstroff




msg:3576858
 10:16 pm on Feb 16, 2008 (gmt 0)

Our site is very typical of many other websites that enjoyed top 10 rankings for years in Google, then lost them without reason. Most of our pages are toolbar ranked PR 4 or PR 5, though I don't give much credence to PR; it has nothing to do with your rank. Here are our incoming links:

Total links from unique domains: 306
Total incoming links: 3,610
Total incoming links to homepage: 2,549

We created our domain name in 1998, so we definitely have longevity. We enjoyed top 10 rankings for nearly all our prized search phrases. Then, like many of you, we lost our rank in late 2005, dropping to 300-400. I did notice hundreds of scraper sites that duplicated our content and appeared higher than us with our own content. Could not figure out how Google could be so boneheaded as to rank a site 6 weeks old over our site, 8 years old. We sent out 150 DMCA notices to Google and web hosts to get hundreds of sites shut down and out of Google's index. Then our rank came back on June 27, 2006 on a major Google update; then we lost our rank again in March 2007 and never got it back, sort of, until now. Keep in mind we NEVER used any black hat SEO techniques, ever. Our pages are content rich, and reader friendly first, search engine friendly second.

Now with 950 penalty, when we did a search on Google, our home page would come up, often ranked 200 or below, but our intended keyword related page, call it xyz.htm, would be a no show, or it would show up in the 900 range on Google’s SERP (Search Engine Results Page). Sound familiar?

Here are 3 major steps we took 2 weeks ago that seemed to help.

1) I went onto one of my pages, let's call it 123.htm, which was loaded with < widget > photos, as it is a < widget > gallery. This used to be #1 on Google! When I checked it out, all the photos were slightly keyword stuffed in the "alt tag". Now we were not "black hat" SEOs just stuffing them endlessly; each photo alt tag had only 5 or so keywords, a reasonable amount. But I can't help but wonder if Google frowns on seeing similar search phrases repeated often in a bunch of photo alt tags. It does not look very natural, especially since most normal alt tags on web sites might say something like "photo of new house" or "graduation picture".
Alt tags are really intended to be truly descriptive placeholders for users who have images turned off in their browser. So imagine someone seeing "widget, widgets, blue widgets, red widgets, green widgets, photos of widgets" as an alt tag. You want it to simply read "photo of blue widget", and little more. So we removed alt tags from many of the photos, and on several others we simply reduced them to 2 or 3 keywords.

2) I saw some people commenting how footer tags at the bottom of their page may have hurt them. I noticed we had a footer tag at the bottom of all our pages that links back to the home page, it looked something like this:

“Example.com Home page for Widget Keyword Advice”

Could be construed by Google more as an “optimization” rather than a link back home. So I revised the entire link to read simply “Home”. End of story, nothing else fancy needed there.

3) We also have affiliate program links on our site, and each page might link to the same affiliate store several times; one lengthy page had maybe 10 links to the online retailer affiliate program that we belonged to. I cut these links in half.
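Point 1's alt-text observation can be turned into a rough self-check. A sketch only; the cutoff below is an arbitrary illustrative number, not anything Google has published:

```python
def suspicious_alts(alt_texts, max_terms=3):
    """Flag alt text that reads like a comma-separated keyword list.
    max_terms is an arbitrary illustrative cutoff, not a known Google value."""
    flagged = []
    for alt in alt_texts:
        terms = [t.strip() for t in alt.split(",") if t.strip()]
        if len(terms) > max_terms:
            flagged.append(alt)
    return flagged

stuffed = "widget, widgets, blue widgets, red widgets, green widgets, photos of widgets"
natural = "photo of blue widget"
# suspicious_alts([stuffed, natural]) flags only the stuffed one
```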

After all that, we had to be patient, waiting until Google updated our pages in the cache, which took about 7-10 days depending on the page.

Last week I re-ran our ranking of one of my pages, and here’s what we got for a tremendous improvement:
Rank Before   Rank After
       68        46
      457        34
       59        44
      134       230
      142        69

Another page I tried this on fared much better, emerging out of the 950 slump:

Rank Before   Rank After
      957       368
       27        24
      871       871
      140       116
      450       194
      450       275
      828       302

You can see significant improvement after just 2 weeks. The ranking appears to be floating upward by the day also. Lastly, a week ago I went into our Google Webmaster tools account and submitted a reinclusion request, telling Google everything I had done, and asked them to take a look at our page.

I still don’t know if the fixes I implemented above were the solution to my 950 woes, but they seem to be helping.

[edited by: tedster at 12:11 am (utc) on Feb. 17, 2008]
[edit reason] moved from another location [/edit]

petehall




msg:3583321
 12:10 pm on Feb 24, 2008 (gmt 0)

I have finally completely redesigned my troubled 950'd site and it is having an immediate positive effect - even on competitive terms.

This is excellent news. I will post a more detailed report later in the week once things have settled.

potentialgeek




msg:3585420
 8:54 pm on Feb 26, 2008 (gmt 0)

The alt-tag changes sound interesting. I've been wondering about them. I just don't see how Google would pay too much attention to them in the 950 context unless they're thumbnails (with links; therefore the same as stuffed anchor text).

I removed all my thumbnail alt-tags but didn't see any difference. But I still have images with alt-tags the same as the page titles (H1 tags).

The other day I was trying to think outside the box on the 950 penalty. Forget everything posted here. Reset.

I thought of the one and only comment from Matt Cutts that it's an "overoptimization" penalty.

If you start with only that information, consider everything you've done on your site to boost SEO. Even the stuff you do without thinking to help SERPs. Come up with the list of everything and then dial down one or two SEO things at a time and see if it works. It could be one thing that freaks out Google, or a combination.

My list includes:

* Each page with a unique title tag

* Description the same as the title tag

* Keywords the same as the title tag

* Alt-tag the same as the title tag

* H1 the same as the title tag

* File name the same as first one or two words of title tag

I know this looks spammy. The site was started years ago. But Google says Keyword tags are very insignificant and Descriptions don't help SERPs. I don't see how alt-tags could be much more significant than either Keywords or Descriptions.

The unique title tags don't get flagged by Google for being too long; they all fit. And there's nothing unusual about title tags matching H1 tags. Alt-tags that have the same text as descriptive page titles are natural for images.

But... when you stop and step back... you can see how many uses of the same text could look auto-generated, e.g., spam.
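That across-the-board duplication is easy to audit mechanically. A sketch (the field names are hypothetical, and this is not any real tool):

```python
def duplicated_fields(page):
    """page: dict of on-page text fields. Return the fields whose text
    exactly matches the title (case-insensitive)."""
    title = page.get("title", "").strip().lower()
    return [field for field, text in page.items()
            if field != "title" and text.strip().lower() == title]

page = {
    "title": "Blue Widgets",
    "description": "Blue Widgets",
    "keywords": "Blue Widgets",
    "h1": "Blue Widgets",
    "alt": "blue widgets",
}
# duplicated_fields(page) -> ['description', 'keywords', 'h1', 'alt']
```

Running something like this over a whole site would show how machine-generated the pattern looks from the outside.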

Google, however, can compare the TrustRank and IBLs, plus the age of the site, before concluding a page is spam. I just don't know if it knows how to do that.

Has anyone seen any benefits to Google from its 950 Penalty? Complete waste of time and money from my point of view. I'm losing money from it but so is Google.

p/g

gyppo




msg:3585493
 9:57 pm on Feb 26, 2008 (gmt 0)

Has anyone seen any benefits to Google from its 950 Penalty? Complete waste of time and money from my point of view. I'm losing money from it but so is Google.

Not if you're spending more on Adwords ;)

CainIV




msg:3585505
 10:04 pm on Feb 26, 2008 (gmt 0)

potentialgeek - I got any and all of my older sites out of this penalty last year....

[edited by: Robert_Charlton at 1:18 am (utc) on Feb. 27, 2008]

JeffOstroff




msg:3585520
 10:17 pm on Feb 26, 2008 (gmt 0)

I don't think the 950 penalty affects Google either way; they think they are doing a good thing by sending your site down to 950. Their algorithm probably works for the most part, but once in a while a good site gets sucked down into it.

Now potentialgeek raised an interesting point here. Maybe if Google sees, for example, "red widgets" in the title, in a few H1 tags, in several places in the content, and in an alt tag, that does look very "planned", very optimized, and they might spank you for it. I would think this optimizing would be good as long as it is indeed related to the target keyword and relevant. But then again, my name's not Sergey.

But Google's algorithms now look at other "related" words to determine relevance and produce search results. For example, look at the 2 fictional pages below.

Let's say you have an article that talks about PCs, and you want to show up for "IBM".

1) On the first page, Google might expect to see the words Dell and HP, so you add those keywords as well.

2) Your second page has IBM, Colgate toothpaste, and Charmin toilet paper mentioned on it.

Which of the 2 pages above would you rank as more relevant for "IBM" if you were Google?

Think about optimizing for your keywords, plus working "allied industry" related themes into your content.

[edited by: JeffOstroff at 10:21 pm (utc) on Feb. 26, 2008]

JeffOstroff




msg:3585532
 10:27 pm on Feb 26, 2008 (gmt 0)

CainIV what did you do to recover your sites last year?

CainIV




msg:3585592
 11:37 pm on Feb 26, 2008 (gmt 0)

Hi guys. I had the same issue as Jeff in 2005 - two of my websites got hammered when the update came in September.

I fell to about position 400 for my own domain name at that time. In 2006 both websites made small comebacks but then were hammered with the -950 (both websites were at the last possible position in each of their respective main and secondary searches, across many pages of both websites).

Sometime around the spring of last year, I got sick and tired of seeing this issue with both websites. Basically, what I did was start from the ground up, revisiting anything that could cause indexing issues, server issues, or SEO issues on the websites.

Here is a list of what I did. Note that some of this made sense for me and might not for your website. Basically I think for some websites, to get out of this penalty might require a change in thinking.

1. Removed many of my outbound links, and checked that each link out went to only quality websites. When in doubt, I removed the link

2. Changed the navigation entirely to not include keyword variants. For Red Widgets, I simply had a section and used the link 'red' like this:

Widgets:

Red
Blue
Yellow

Removed all footer navigation links to product pages and simply added a top anchor so that the user could get to the top. I left the privacy, terms, Copyright statement, sitemap links at the bottom but removed all of the rest.

Removed the footer "home" link from all pages to the index page, as the domain name included the keyword. For some websites this works like a charm; for -950'ed sites it might hinder more than help.

3. Setup my hierarchy as a modified pyramid. Main index links in to main categories, those all do link across to each other and then the product pages only link up to their parent, and across to related products in text on that page. I will say that for many websites this is not an issue, it may have been for me.

4. Either edited entirely or removed meta keywords from the website. The tag is useless nowadays for rankings. If you want to rewrite it, use 4 - 5 misspellings of keywords for each page and make the tag unique. My take is remove it altogether.

5. Enough has been said re meta description tag where everyone knows to make it unique. I often wrote the tag to be about 240 long instead of 152, but wrote the first 152 unique for the snippet in the search engines. I then wrote more unique text after that hoping Google would see it (but would obviously not show it)

6. Made sure each title was unique on the website and described the product or service in a logical and meaningful way (not a list of keywords) Simply made sure my keyword was close to the front of the tag once, and some variations were after.

7. Had a friend help me rewrite pages for the visitor, then sprinkled in keywords sparingly. Stayed away from a ton of variations as those variations are already there naturally when someone writes in a non-SEO context

8. Rewrote all alt image tags to only describe what the image is, nothing more. For alt images to home, I simply used 'Home'. For alt images of specific products, I wrote the exact product name in. I wrote the alt images as if I knew nothing at all about SEO.

9. Had a friend check the server, the header returned, 404 pages. Also did a thorough check for duplicate content, especially the index and next level pages. I had to rewrite some pages because of content theft.

I cannot emphasize enough the impact of server issues on rankings.

10. Took a hard look at the way pages linked to each other in content. I made a pact with myself :) to not link from any one page in text to more than 4 other pages by default. Every once in a while I broke the rule, but very rarely.

11. Ran Xenu Link Checker and was thorough about clearing up any broken links and images.
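Points 10 and 11 can be roughly approximated with the standard library. This sketch just extracts anchor targets and applies the "no more than 4 in-text links" rule from point 10; it counts every anchor on the page, not only in-content ones, so treat it as a rough check rather than a real crawler like Xenu:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def content_links(html):
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

def breaks_four_link_rule(html, limit=4):
    # CainIV's self-imposed rule: no more than 4 in-text links per page
    return len(content_links(html)) > limit
```

Checking which of the collected hrefs actually resolve (what Xenu does) would be the next step on top of this.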

Basically, I had to look at both websites in a new way. I knew that it if I simply left the sites as is, they would not recover enough to give me any kind of success.

What I found was that for one website, the badly hit one, it began recovering and showing signs of life within 2-3 weeks. It bounced to about position 450 initially, something it had not done in a long time.

After that first push, it began moving up little bits each day. I helped by trying to only get links I knew would help trust.

The second website started recovering about one month after the changes. It made a bigger bounce, to about position 200. I proceeded to do the same thing with that one - focused on getting links from highly trustable sources only to push it up further, which it did.

Once both sites made it back to 50 land, I started adding more trusted links. Both are now ranking well again.

What worked for me may not necessarily work for everyone. I will say, though, that it is obviously better to make logical, well-thought-out changes than it is to sit and wait.

steveb




msg:3585689
 1:06 am on Feb 27, 2008 (gmt 0)

If you look at the 950 results for any significant term, you can see how Google "benefits". Aside from the collateral-damage sites that should not be there, there are hacked .edus, sites being penalized for adult keywords, subdomain crap, etc. The 950 does accurately hit a lot of things that should be hit. The problem is its batting average is godawful, since it hits way too many things it should not.

CainIV




msg:3585698
 1:17 am on Feb 27, 2008 (gmt 0)

You are dead on steveb. The biggest problem is diagnosing the beast, which may never really happen.

Kinda like getting lice. Throw out the towels, clothes and start again :)

petehall




msg:3586244
 5:22 pm on Feb 27, 2008 (gmt 0)

I can recommend a complete re-design as a fix for the problem !

I think if your site gets caught, there are so many factors to consider that it's easier to start again.

The site home page came back after just 24 hours and now all the inner pages are returning, one by one as they are being re-indexed.

A lot of effort... and who says don't design (or re-design) for Google? ;-)

arubicus




msg:3586456
 9:13 pm on Feb 27, 2008 (gmt 0)

Wow I freaked. I was replying but when I submitted it the thread was closed. I see they split it once again.

Ok so here is what I wrote:

A complete redesign and restructuring is what we are doing. About 85% complete.

What CainIV has described is essentially what we are doing. Guess we had the same line of thought.

Removed all footer navigation links to product pages and simply added a top anchor so that the user could get to the top. I left the privacy, terms, Copyright statement, sitemap links at the bottom but removed all of the rest.

We only have and always had just home, about us, copyright, tos in footer. We did nofollow those links as we didn't need any PR flow to those pages. We have a link back to the home page in our navigation that is NOT a nofollow. Gotta have a link back home somewhere.

2. Changed the navigation entirely to not include keyword variants. For Red Widgets, I simply had a section and used the link 'red' like this:

Widgets:

Red
Blue
Yellow

If anything may have appeared spammy ("I said may"), this was it. Our home page wasn't a keyword but a proper name, so no problem there. But our sections and sub-sections were. And yes, there were repeats (not even close to some of the junk you find in the SERPs, mind you). But we did cut down on needless repetition and went to a natural type of navigation, as CainIV has pointed out above.

3. Setup my hierarchy as a modified pyramid. Main index links in to main categories, those all do link across to each other and then the product pages only link up to their parent, and across to related products in text on that page. I will say that for many websites this is not an issue, it may have been for me.

We have a modified pyramid also.

Example would be:

Home Page

Food <- Main Category
Recipes <- Sub-category
Cookies <- Sub-Sub-category

Another Topic <- Main Category
Something <- Sub-category
Something about Something <- Sub-Sub-category

The reason I am using "Food" here is for understanding's sake.

We have a multi-topic site and our categories aren't completely related, so I don't cross-link the categories. So we don't link between "Food" and "Another Topic" in the example above. We have not linked them for about 5 years.

So our home page links to "Food" and "Another Topic" but not their sub-categories. The home page features articles and discussion from the various different categories we have. (think About-dot-com here)

The main "Food" section features latest articles and discussions in the main body of that page for it's topic coverage.

Our old navigation structure used to be "Food", then links to "Something Recipes", "Something Else Recipes", "Something Again Recipes" in the navigation, featuring articles in the main body. Each of those sections is where our articles were categorized and listed. Those sections cross-linked also, as did the articles. But we did try to avoid going overboard and not to repeat too much, such as the word "Recipes".

Now our navigation structure for "Food" would be "Recipes", "Another Food Topic", "Yet Another Food Topic". And there is NO repetition if possible.

The "Recipe Section" features sub-categories, such as "Cookies" in the main body with descriptions of what the user would find in each category. The left navigation is bread crumbed style which does not cross link other "Food" Categories but links back up the structure.

The "Cookies Section" features articles and discussions for that topic and the left navigation links back up the structure. If there are alot of articles/discussions there are pages that list "more articles" and "more discussion". Those "more" sections are a complete list and remains most static. The "Cookies Section" will change as it features the latest articles and discussions so the links will rotate.

Articles again have navigation linking back up the structure. The old style of cross-linked sections (with keyword repetition in the navigation) may well have thrown those pages overboard. Articles will link to related articles and discussions - why wouldn't I present that to my visitors? We could eliminate this, as keywords tend to build up there; it is natural for that to happen. Those links do rotate to give exposure to different articles and discussions. I could easily choose just the latest, but as we are database-driven and a VAST site, it's difficult to make them completely static. It's just a consistently evolving site.

Hard to explain, but when viewed it's a very, very simple and easy-to-navigate structure. I mean, you can't get any more simplistic than our new structure.

4. Either edited entirely or removed meta keywords from the website. The tag is useless nowadays for rankings. If you want to rewrite it, use 4 - 5 misspellings of keywords for each page and make the tag unique. My take is remove it altogether.

Got tired of them so I got rid of them.

5. Enough has been said re meta description tag where everyone knows to make it unique. I often wrote the tag to be about 240 long instead of 152, but wrote the first 152 unique for the snippet in the search engines. I then wrote more unique text after that hoping Google would see it (but would obviously not show it)

Always had ours unique but redid...errr...redoing some of ours anyway. Using a more natural yet promotional feel rather than completely keyword focused. Not redoing articles as we will be filtering out the old with new and revised copy.

6. Made sure each title was unique on the website and described the product or service in a logical and meaningful way (not a list of keywords) Simply made sure my keyword was close to the front of the tag once, and some variations were after.

Titles for sections will of course have a keyword or phrase; can't help it, really. But we will use more promotional language, a bit more snappy, that stands out from the rest. Article pages will use the article title. Articles will be replaced, so titles will change, and those too will be a bit snappier. We will be talking about SUBJECTS, not KEYWORDS: not focusing too much on keywords, and letting that happen naturally.

7. Had a friend help me rewrite pages for the visitor, then sprinkled in keywords sparingly. Stayed away from a ton of variations as those variations are already there naturally when someone writes in a non-SEO context

8. Rewrote all alt image tags to only describe what the image is, nothing more. For alt images to home, I simply used 'Home'. For alt images of specific products, I wrote the exact product name in. I wrote the alt images as if I knew nothing at all about SEO.

9. Had a friend check the server, the header returned, 404 pages. Also did a thorough check for duplicate content, especially the index and next level pages. I had to rewrite some pages because of content theft.

Yeah rewriting and replacing content but that work will be done when we get the structure completed. Yes use natural language and talk about SUBJECTS not KEYWORDS.

Not much of our image use would warrant an alt tag. I just used the tag but put nothing in it: alt="".

Headers are fine...Always have been.

11. Ran Xenu Link Checker and was thorough about clearing up any broken links and images.

Found a mess of broken external links. Working those out. Nofollowed external links for the time being, since I shouldn't vouch for any external links until I make it through all of them. Ummmm, that is A LOT to go through. I wish now we had 100 pages for our site. Would be sooooo much easier.

What I found was that for one website, the badly hit one, it began recovering and showing signs of life within 2-3 weeks. It bounced to about position 450 initially, something it had not done in a long time.

Yes this is EXACTLY what I noticed also. Many 950'd pages made a jump to mid 200-500's for sections that we already converted.

Our only downfall is that soon after, we eliminated old 301 redirects from long-ago URL changes. This cut our links to our articles way down. But I wanted a re-do, as we were scraped, copied, and bamboozled out of enjoying such high rankings for years. Just imagine how bad that was, because we are a general multiple-topic site: EVERYONE and their pet moose was after our stuff. Our homepage, according to Yahoo, has about 1,030 external links into it and 3,000 site-wide. Those are natural; we never pursued links beyond an occasional promotion or directory.

Links are going to be last on our list. Will be working on getting some new links as we redo and create content. Hey, not for rankings but we gotta find extra traffic somewhere :) We don't have much choice now.

[edited by: arubicus at 9:44 pm (utc) on Feb. 27, 2008]

arubicus




msg:3586476
 9:32 pm on Feb 27, 2008 (gmt 0)

To add to my post.

Google Sitemaps shows only 24 pages, out of thousands, that have internal links.

It shows 9 pages with external links. The home page has about 600. Those 600 are mainly NEW links that have been acquired naturally, and also from articles we wrote for distribution. All of our OLD links are just not there: from quality sites or even scraped junk sites, none show up. Also, the 600 contains links from a list that has been passed around naming our site as a place to submit articles to. That really fudges up our link profile.

Our home page is a PR 5 and has been for years, but many internal pages, even category pages, are now not showing PR. Cutting out our 301 redirects had an effect on article links and some sub-sections, but our categories have remained static (no URL changes) and external links coming in are unchanged. PR internally just does not flow. Robots.txt is fine; all links are fine.

[edited by: arubicus at 9:50 pm (utc) on Feb. 27, 2008]

JeffOstroff




msg:3586496
 9:46 pm on Feb 27, 2008 (gmt 0)

Remember, Google only shows maybe 10% or so of the total links you have coming into your pages. They don't want you to know everything. Yahoo shows a lot more.

arubicus




msg:3586497
 9:51 pm on Feb 27, 2008 (gmt 0)

Yeah, I realize that. Just find it odd that only new links show up. Could it be a time-frame thing? No in-link older than 1 to 1.5 years shows up.

bwnbwn




msg:3586545
 10:29 pm on Feb 27, 2008 (gmt 0)

I feel that for the ones who came out of the 950 without much trouble, it was due to an algo filter, unlike those such as myself who have done about everything possible:
1 - Reduced repeated keywords
2 - Nofollowed bottom links
3 - Complete site overhaul, reduced duplicate content (repeated product pages), and much much more

Went from 900 to 800 for a single keyword we had been ranking on for years.

I feel this isn't an algo problem but a "Human Added Filter" on the site to keep it from coming up for a search term, where nothing you do can bring you back.

The site still comes up for competitive keywords, but only for double keywords, not single ones.

At Pubcon I brought this up in a circle of us talking about various issues, and I stated my theory about possible human-added filters.

Well, to my surprise, one of the people in the group said his wife was a Google employee who got a list of sites to go over each day. I assume they judge each site on a number basis; the lists are then compiled, the numbers added, and presto, you have a human filter.

As for the Google human editors, there are what, 10 thousand of them? If each goes over 100 sites a day, that is what, a million sites a day?

Just my thought on this 950 monster, where some pop right out and others do everything possible and get nothing, maybe a 100-place move...

And really as you see most of the ones that are in the most trouble are old sites that did nothing to get put in the 950 other than rank for a search term for a number of years.

Ranking = review = human edit possible filter = 950 = (:<)

arubicus




msg:3586576
 10:55 pm on Feb 27, 2008 (gmt 0)

And really as you see most of the ones that are in the most trouble are old sites that did nothing to get put in the 950 other than rank for a search term for a number of years.

Ranking = review = human edit possible filter = 950 = (:<)

Yep, beginning to think this is it. The site has been around for 7-8 years. Enjoyed great rankings for targeted and non-targeted keywords and phrases. Anything we put on our pages we would rank for. We are about-dot-com-like in a way, and we ranked just below them and many times a hair above.

Many years of hard work down the ol' crapper. Still make money, but nothing like having Google to help out with traffic and, above all, exposure.

Seems as if we gain ground and get knocked back. Gain ground and get knocked back. For two years now. If it is human reviewed, I don't know what they saw that was sooo bad to have this happen. -30, quite possibly; -60, pushing it; -950, you've got to be kidding; de-indexed or 86'd PR flow/supplemental is a bit ridiculous. I believe there is NOTHING we can do to fix it other than a cleanup (what to do for a cleanup is beyond me) and a reinclusion request. Which we will do. If that doesn't work...

Does my site suck arse? Judging by some of the junk in the SERPs now? Geez, we are saints compared to that, which makes it hard to figure out what to clean up. For everything I can think of, I find sites that do that very thing and don't budge. Just don't know.

steveb




msg:3586890
 9:46 am on Feb 28, 2008 (gmt 0)

One thing is sure about the 950: human review is not involved. No way would many of the things hit with a 950 get penalized if seen by a human. It's strictly an algo penalty, which is why it is often right, but sometimes hopelessly wrong.

arubicus




msg:3586955
 11:28 am on Feb 28, 2008 (gmt 0)

OK, one question: algo-based or human-reviewed, would there be variations that require a review to have the penalty lifted?

For us the penalty was/is for most everything we searched for. It could be a unique string of text lifted out of the body of an article, with no real keywords the page was targeting. We have seen pages LINKING to that page (using random text from the article) sitting in the 400 area while the exact page would be -950'd. Like one article linking to another, and not necessarily a main page with more PR flow.

steveb




msg:3587626
 12:16 am on Feb 29, 2008 (gmt 0)

It's common for a page with a 950 penalty not even to be seen in the results, because the site has two other pages that naturally rank 253 and 562 for the term.

JeffOstroff




msg:3587640
 12:35 am on Feb 29, 2008 (gmt 0)

arubicus, that's the magic question right there: do you need human intervention by way of a reinclusion request?

Google's Sitemaps interface tells you it may be weeks or months for them to get to your request. They don't respond to you either, so who knows if the site floated up on its own or if it was the reinclusion request.

CainIV




msg:3587650
 12:45 am on Feb 29, 2008 (gmt 0)

There is no human review to 950, Steveb is absolutely correct. Pages that were affected by 950 changed and moved up in rank once the methodology I wrote about was applied and the new page was cached in the index.

The problem is knowing "what part" of those changes tipped the scales. I doubt I will ever conclusively know, but the point was change, enough to tip the scales.

To say that human evaluation was part of the process would be to say that Google engineers actually monitored and cared how my little piece of property at the edge of the world was doing. Highly unlikely.

Since I don't think anyone can nail it down, here are descriptive phrases you might want to think about. :)

-Automatic algorithmic evaluation of factors
-Not based on human review whatsoever
-Correct or "incorrect" tipping of scales on Google's part due to x number of factors

Re-evaluate again the parts of your website that Google can and does "see".

For the most part, my reasoning at the time was that if there are websites that have recovered without changing anything whatsoever, then it could be that some off-site factors, which have since resolved themselves, were the cause. Otherwise, by simple deductive reasoning, since this is a filter applied to something, obviously something had to trigger it. (Hope that makes sense.)

tedster




msg:3587661
 12:54 am on Feb 29, 2008 (gmt 0)

Here are a few of the factors that the spam detection patent mentions:

"...grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks."

"...whether the occurrence is a title, bold, a heading, in a URL, in the body, in a sidebar, in a footer, in an advertisement, capitalized, or in some other type of HTML markup."

I would suggest looking to see if your pages are using the penalized term and its naturally co-occurring phrase in too many of these ways.
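To make that audit concrete, here is a rough Python sketch of my own (the class name, the tag list, and the sample page are all made up for illustration; the patent only describes the kinds of markers, not any code) that tallies which markup contexts a term appears in:

```python
from html.parser import HTMLParser

# Contexts loosely matching the markers mentioned above
# (title, headings, bold, underline, anchor text). Illustrative only.
TRACKED = {"title", "h1", "h2", "h3", "b", "strong", "u", "a"}

class TermContextCounter(HTMLParser):
    """Count occurrences of a term per markup context on one page."""
    def __init__(self, term):
        super().__init__()
        self.term = term.lower()
        self.stack = []   # currently open tracked tags
        self.hits = {}    # context tag -> occurrence count

    def handle_starttag(self, tag, attrs):
        if tag in TRACKED:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        n = data.lower().count(self.term)  # substring match, so "widgets" counts too
        if n:
            ctx = self.stack[-1] if self.stack else "body"
            self.hits[ctx] = self.hits.get(ctx, 0) + n

page = """<html><head><title>Widget Recipes</title></head>
<body><h1>Widget Recipes</h1>
<p>Try these <b>widget</b> recipes, or browse
<a href="/widgets/">more widget ideas</a>.</p></body></html>"""

counter = TermContextCounter("widget")
counter.feed(page)
print(counter.hits)  # -> {'title': 1, 'h1': 1, 'b': 1, 'a': 1}
```

The more distinct contexts light up for the same term, the more the page looks like the over-marked-up pattern the patent language describes. A real audit would also want to check URLs and alt text, which this sketch skips.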

arubicus




msg:3587855
 9:27 am on Feb 29, 2008 (gmt 0)

Here are a few of the factors that the spam detection patent mentions:

"...grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks."

"...whether the occurrence is a title, bold, a heading, in a URL, in the body, in a sidebar, in a footer, in an advertisement, capitalized, or in some other type of HTML markup."

I would suggest looking to see if your pages are using the penalized term and its naturally co-occurring phrase in too many of these ways.

Hard to know, because it isn't necessarily a specific penalized target term being 950'd, as in a targeted term 950'd but other terms still ranking. Pretty much nothing ranks, and we've seen many instances where both targeted and non-targeted searches bring a page up 950'd. It has been ever-changing for us, so it is hard to pinpoint, and it acts somewhat differently from page to page, more than likely due to other factors. I do want to mention that this is site wide, across many thousands of pages and different styles of writing and structure. In other words, this is pretty vast.

Here is a format that we use for our site and the common elements of each page:

Meta Titles: All unique in that they have different title text. Some words and keywords can be found in others: when you have a section on widgets, you will have multiple titles with widgets in them, but not all. They sometimes do and sometimes don't contain targeted phrases.

Some titles contain all the targeted keywords and some don't.

The title is usually the same as the h1 tag. We have the ability to vary it if we want, but it makes no sense for articles to do so, and even then it is very common practice for the two to match. If the titles are long we shorten them and leave the longer version for the h1 headings.

Example of our titles would be:

Do You Know the 4 Tips That Will Help Your Child Read Sooner?

Keywords: Gone so no worries there.

Descriptions: They vary and don't necessarily contain exact phrases. Sometimes split, sometimes not; sometimes just a fraction of the phrase. Some contain a more complete version of our target if the title/h1 does not contain all that we wanted. No two descriptions are the same, and Webmaster Tools reports only one short description.

h1 tags: Generally the same as the title. We are experimenting with different variations between the h1 and title. One thing we could test is to drop a keyword out of a phrase completely; our other tests have used the same key phrases but mixed around a bit.

Navigation: Before the redesign it was keyword-heavy, but really not that bad; I see far worse now, from better-known sites to junk sites listed in the top 50. We are completing our changeover to a new navigation system where we won't repeat if possible, though it does deepen the site a bit. The link text in the navigation is different from the titles of the pages being linked to, but yes, they may contain the same keyword; many times how can you not? Let's say I have a section called Recipes. I will probably use that as the navigation link text and use a more descriptive title like "Recipes For All Occasions" (just an example off the top of my head). In the h1 tag of the Recipes page, well, you know you are going to use the term again. Again, experimenting with variations.

h2/h3 tags: Use them when necessary. Some contain the keywords and some don't. Sometimes intentionally, sometimes not.

b/i/u: Never really use any of these to highlight keywords.

Related Discussions and Articles: Those are at the bottom of articles/discussions, links only, with no descriptions of the articles. Yes, some may contain keywords; if you are in a widget recipe section you will have titles from related pages with widget or recipe in them. You just can't help it much. To vary such titles would require separate link text, designed specifically for each page, for every related article shown. Again, this is common practice, especially for database-driven sites, and it helps keep visitors. We can't help the discussion link text, as it is user generated; article links are the title of the article.

Footer links: just home/about us/privacy/copyright nofollowed.

We also, but not always, use keywords in URLs. Now we are looking hard at this, because if you STRIP all the HTML out of a page you can see repetition you normally wouldn't see viewing the page through a browser. Articles, however, may or may not contain enough keywords in URLs (found in links on those pages) to really make a difference. Comparing keywords in URLs to the title, the h1, and the on-page text, yes, they can be similar. On the other hand, our discussions do NOT have keywords in the URL. I could change this for articles, which would reduce keyword repetition in an instant (just change our scripts a bit), but then we end up with NEW URLs across the entire site and have to go back to 301 redirects to compensate for all the links to our site. If it is a penalty or filter, how much would pass through? This would be a last resort after the slow-roll design changeover and re-bulking of new original content.
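On the strip-the-HTML point, here is a quick illustrative Python sketch (the function names and the sample snippet are hypothetical, not from anyone's actual site) that collects visible text plus the words hidden in href paths, which is roughly the repetition a crawler sees but a browser hides:

```python
import re
from collections import Counter
from html.parser import HTMLParser

class TextAndUrlExtractor(HTMLParser):
    """Collect visible text plus href path words from a page."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Treat URL separators as spaces: /widget-recipes/ -> widget recipes
                    self.chunks.append(re.sub(r"[/\-_.]", " ", value))

    def handle_data(self, data):
        self.chunks.append(data)

def keyword_counts(html, top=5):
    p = TextAndUrlExtractor()
    p.feed(html)
    words = re.findall(r"[a-z]+", " ".join(p.chunks).lower())
    return Counter(words).most_common(top)

page = ('<h1>Widget Recipes</h1>'
        '<a href="/widget-recipes/red-widget">Red Widget Recipe</a>'
        '<a href="/widget-recipes/blue-widget">Blue Widget Recipe</a>')
print(keyword_counts(page))
```

In this made-up snippet, "widget" shows up 7 times once the URL words are included, even though the rendered page only shows it 5 times. That gap is exactly the kind of thing worth checking before deciding whether keyword-bearing URLs are adding to the repetition.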

Links to our site: Natural over the years. Varies from crap to quality. Multi-topic site so links and anchors vary to the home page but more related links to sub-pages.

-----------------------------------------------------

Do a search for a 3 word keyphrase. Count how many times in the top 200 you will find a site that has on pages factors of Keywords in url + title + Description + keywords + h tags + various link text + urls. You will find quite a lot. Thats just the top 200.

-----------------------------------------------------

To make matters worse, the caches of our pages are now dropped. Google knows about the pages, but no caches. No caches = no listing, which = can't really test anything. At first I thought it was because of thinning out our content, but this is site wide, not just the new design. To point out: Google crawls our site every day.

-----------------------------------------------------

We don't cloak, don't use sneaky redirects, don't buy links, sell links, trade links, or use link schemes. We have a mixture of submitted content and our own, no cookie-cutter affiliate pages, no duplicate content on other sites we own (there are none), no hidden text, no viruses or malware, no subdomains, no hidden links, no nothing. Headers are fine. The www/non-www redirect has been in place for years. A redesign years ago had URL changes (we removed the 301s last year as they were no longer needed). We use nofollow on submitted articles, but not on articles we write, for fear of being seen as "gaming" by letting PR flow through links in articles that may or may not be submitted elsewhere.

Really can't think of anything else. Oh yeah, and virtually all of our site validates as CSS/HTML 4.01 Strict, both the old and new designs (the old has a few warnings but nothing bad and still passes). Just can't think of anything THAT BAD that we have done to get a spanking like this with a big paddle having 950+ nails in it.

bwnbwn




msg:3588019
 2:48 pm on Feb 29, 2008 (gmt 0)

"There is no human review to 950, Steveb is absolutely correct"

Believe what you want, but you have no proof of this, and if you read Google's patent it adds human review as part of the overall process.
Be it 950, be it 30, be it 60, who knows. All I know is there are so many variables to this that it is impossible for it to be all algo, no way.

It has got to be human as well. I am not saying it is totally based on the human review, but I am saying there is a factor in the algo that uses the review as part of the ranking and/or filter.

[webmasterworld.com...]

It supports my statement, and I feel without any doubt it helps explain why the 950 can't be explained.

10,000 people can review a bunch of sites per day, enter a score, and go on.
I know there are millions of sites, but if the system only pulls the top 20, then guess what, the number to review is cut to a usable number. And Google knows 99.99999% of users very seldom get past the 2nd page of results, so why worry about them? Focus on the top 20, per se.

arubicus




msg:3588351
 9:21 pm on Feb 29, 2008 (gmt 0)

Anyone have any ideas?

steveb




msg:3588474
 12:09 am on Mar 1, 2008 (gmt 0)

"why the 950 can't be explained."

Since when can't it be explained? The randomly inappropriate application is a problem more than a mystery.

petehall




msg:3589525
 11:15 pm on Mar 2, 2008 (gmt 0)

The 950 penalty will vanish as soon as the new pages are indexed.

No manual intervention required - other than to come up with a new, non-problematic website!

Our home page returned within 24 hours, the others are appearing on a daily basis as deeper crawls are made.

GF_Diablos




msg:3589749
 10:28 am on Mar 3, 2008 (gmt 0)

Can anyone help me out here, I have read so much conflicting stuff about this 950 penalty that I don't know what's true anymore.

Our site (which has ranked badly for a year now) used to rank well for its targeted terms; now it ranks #1 with sitelinks for the brand and isn't in the top 1000 for ANYTHING else.

I have tried so many things to make this work and I am ready to give up. I have dialed SEO up and down, done a redesign, and slowed link building, but nothing makes a difference.

[edited by: tedster at 4:48 pm (utc) on Mar. 3, 2008]
