
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 13
potentialgeek
posted 11:01 am on Feb 9, 2008 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

I did all those types of navigation link changes, but it didn't improve anything for my site. I've waited a few weeks for Google to crawl it completely, and the cache today shows about 99% of pages cached, with no keyword repetition in successive navigation links.

I'd like to know if a few "bad" apples (pages) can keep an entire site 950d.

I've removed all footers, and no progress either. I'm thinking the only thing left to remove is headers.

The old navigation structure of my site has headers which are one-word keywords. There are about nine in all.

e.g.:

Keyword1 Keyword2 Keyword3 . . . Keyword9

But for each of the directories, i.e.:

http://www.example.com/keyword1/

there is still repetition of the horizontal header nav link in the vertical menu:

e.g.:

Keyword1 Keyword2 Keyword3 . . . Keyword9

Keyword1 Widgets
Red
White
Blue
...

I had thought, or at least hoped, that having the same link with the same anchor text on the same page wouldn't get penalized. But the 950 is so a-retentive, it could say, "It's SPAM!"

Obviously it's going to look silly if I remove the header link that is used in the vertical menu, so do I have to remove the vertical menu link instead?!

That's just bad site structuring.

I HATE THIS 950 POS!
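In case it helps anyone audit this mechanically, here is a minimal sketch (Python, stdlib only; the HTML snippet is a made-up stand-in for your own page source) that counts repeated href/anchor-text pairs like the duplicated menu links described above:

# Count (href, anchor text) pairs that appear more than once on a page.
# The sample HTML below is hypothetical; feed in your own page source.
from collections import Counter
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self._href = None   # href of the <a> tag we are currently inside
        self.pairs = []     # collected (href, anchor text) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href and data.strip():
            self.pairs.append((self._href, data.strip()))

    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None

page_source = """
<a href="/keyword1/">Keyword1</a> <a href="/keyword2/">Keyword2</a>
<a href="/keyword1/">Keyword1</a>
"""
collector = AnchorCollector()
collector.feed(page_source)
for pair, count in Counter(collector.pairs).items():
    if count > 1:
        print(f"repeated {count}x: {pair}")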

I know that many people have got the 950 lifted by doing what you said, removing the spammy links, but in early discussion about the 950, there was talk about phrases.

"Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking?"
[webmasterworld.com...]

"I'd say the new Spam detection patent is a good candidate. It certainly does re-sort the results and it also has the potential to generate drastic changes for just some search phrases and not others."--Tedster

"You can get your page included in the spam table by having too many occurances of semantically related phrases. This certainly suggests some modifications for both body copy and anchor text that might be worth an experiment. The threshold for "too many" can also be fine-tuned in this algorithm, which would create the "popping in and out of the penalty" quality that some people report."--Tedster

So it's possible multiple things could assassinate your site. And #*$! are you supposed to do if the way you write naturally triggers the 950? Re-write every frickin' page? Get somebody else to?! Look at every competitive phrase on every page and remove/change it? My site has over 1,000 pages. I could fiddle around with the text on all 1,000 pages and still be 950d. At least with anchor text you can search and replace an entire site fairly quickly.

Re: too many occurrences of semantically related phrases. This certainly suggests some modifications for both body copy...

Just one other comment. Google, to apply its phrase-based spam detection filter, needs a list of semantically related phrases. It has one, obviously, or the filter would be useless. But that list of phrases is a secret, right?

Well, not exactly. It's only partially hidden. Google reveals it at the bottom of SERPs, those sets of 'suggested' related searches. I confess when they first came out, I targeted them.

That could have resulted in Google interpreting my site to have, as Tedster put it, "too many occurrences of semantically related phrases."

I didn't go crazy by targeting 1,000 different popular search phrases, but if the threshold was quite low (under 25), it could have poisoned my site.
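To make that threshold idea concrete, here is a toy sketch. The phrase list stands in for Google's "related searches" suggestions, and the threshold of 25 echoes the guess above; both are invented, since the real lists and thresholds are not public:

# Toy phrase-based spam score: count occurrences of semantically
# related phrases on a page. List and threshold are hypothetical.
RELATED_PHRASES = [
    "blue widgets", "cheap blue widgets", "buy blue widgets",
    "blue widget reviews", "best blue widgets",
]
THRESHOLD = 25  # made up; per this thread it likely varies by query

def related_phrase_hits(page_text):
    text = page_text.lower()
    return sum(text.count(phrase) for phrase in RELATED_PHRASES)

page = "Buy blue widgets here. Best blue widgets, cheap blue widgets. " * 10
hits = related_phrase_hits(page)
print(hits, "over threshold" if hits > THRESHOLD else "under threshold")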

Has anyone here not got the 950 lifted by anchor text changes, but only by phrase changes?

p/g

"The good thing about being at the bottom of SERPs is you don't worry about being overtaken."--p/g

[edited by: tedster at 9:06 pm (utc) on Feb. 27, 2008]

 

CainIV
posted 10:07 pm on Mar 4, 2008 (gmt 0)

arubicus:

Descriptions: They vary and don't necessarily contain exact phrases. Sometimes split, sometimes not. Sometimes just a fraction of the phrase. Some contain a more complete version of our target if the title/h1 doesn't contain all that we wanted. No description is the same, and Webmaster Tools reports one short description.

Does each meta desc. tag contain at least 155 characters?

You might try taking the first 155 characters right from the article body.
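Something like this would do it (a minimal sketch; article_body is a placeholder for your own content, and the cut lands on a word boundary):

# Build a meta description from roughly the first 155 characters of
# the article body, trimmed at a word boundary.
def meta_description(article_body, limit=155):
    text = " ".join(article_body.split())   # collapse runs of whitespace
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)         # last space before the limit
    return text[:cut] if cut > 0 else text[:limit]

article_body = "Widgets come in many colors. Red, white and blue widgets ..."
print(meta_description(article_body))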

Before the redesign it was keyword-heavy, but really not that bad. I see far worse now, from better-known sites to junk sites, listed in the top 50.

The 950 situation in Google is not dependent on how spammy other websites look. As many have pointed out here, it appears to be a trigger of some type that is often falsely applied.

Comparing what you have to what is out there and ranking will not help you solve your issue.

You may consider using no keywords at all in the link to those pages:

Recipes:

All Occasions
Christmas
Thanksgiving

Remember that your goal is obviously to pull out, which might mean de-optimizing now, building trust and testing against the tipping point.

Do a search for a three-word keyphrase. Count how many times in the top 200 you will find a site that has on-page factors of keywords in URL + title + description + meta keywords + H tags + various link text + URLs. You will find quite a lot. That's just the top 200.

Again, we are not talking about other websites, we are looking at yours :)

I have a website in the top ten that does all of those things, but has trust and did not trip this filter. What works for one website in terms of rankings and penalties does not always work for another.

we do have a mixture of submitted content and our own

Even though they are nofollowed, is the submitted content duplicated elsewhere?

CainIV
posted 10:13 pm on Mar 4, 2008 (gmt 0)

Believe what you want; you have no proof of this, but if you read Google's patent, it adds human review as part of the overall process.

No one is saying that Google doesn't have reviewers, as far as I can tell.

The 950 can be explained insomuch as it is a tipping scale of various phenomena on any given website that can cause a re-ranking.

Trying to explain the inner details would be like telling you exactly how Google ranks websites verbatim, which is impossible without intimate daily knowledge of the system.

The facts are that websites that fix one of the factors generally bounce back in a methodical way, to higher positions first and then back to real-life positions in time. Those changes happen when elements that are 'breaking the back' are changed in a way where the total score fits, and those movements happen when the pages are re-cached with those changes.

If those websites pulled out of 950 by manual intervention, then the movements would not correspond with cache dates.

The fact that the 950 doesn't appear to apply correctly to every website is a non-point - this is like saying that Google doesn't rank every website in its index correctly.

arubicus
posted 11:26 pm on Mar 4, 2008 (gmt 0)

First of all, thanks for the reply.

Does each meta desc. tag contain at least 155 characters?

Yes. We have a limit of 255 if we need it, but a description usually rests around the 200 mark. Yes, some do have fewer than 155. All descriptions are unique. Depending on the page, some contain keywords and some don't. I could pull the first "so many" characters out of the first paragraph; I don't know if there is a need just yet.

I have a website in the top ten that does all of those things, but has trust and did not trip this filter. What works for one website in terms of rankings and penalties does not always work for another.

What I am getting at is that, combined, those HTML elements can be PART of the factor, since these ON-PAGE elements seem to be similar and fairly static to compare against, BUT there then has to be something else ADDED to them that can trip this filter: TRUST/LINKS and off-page factors, for example, or phrase-based filtering. We tested our unique pages, new and old, for on-page factors through most of last year and saw little if any improvement. New articles would be 950'd right off the bat regardless of on-page factors.

Now, links aren't much of a problem. Site-wide we have a few thousand links, from crap to quality. You can't help the crap and scrapers when you've ranked well for years for a broad spectrum of keywords.

Of the pages we have been monitoring, I can tell you that many have incoming links that far outdo other sites that rank in the top 100. Yes, quality, on-topic, natural links. Rather than a 950, this should land us at LEAST in the top 300. This is true even for the least popular of keywords.

Even though they are nofollowed, are the submitted content duplicated elsewhere?

Yes and no. We aren't as concerned about submitted articles as we are about our own unique articles. We are going to filter these out but will not cut them completely as of yet. When content was submitted to us, we made sure we were the first to display it and have it indexed. We always ranked above any duplicates; most duplicates were omitted. But again, we ARE continuing to filter much of those articles out, and we are no longer taking external articles at the moment.

Robert Charlton
posted 11:53 pm on Mar 4, 2008 (gmt 0)

I'm not sure that a site's having "trust" prevents the filter from getting tripped. A thought I've posted previously is that having a certain level of trust is what keeps the page of a site in the visible index, at minus 950, rather than having it dropped.

I do think that enough keyword-relevant "trust" for a page is one of the things that might keep a page from tripping the filter. The problems I've seen are very much page and query-specific.

Miamacs posts some very intriguing thoughts on this current thread...

Can a single link take you down? Went from #7 to #1 to #98
[webmasterworld.com...]

typical when the inbound link anchor text and the navigation link text say something different, both use competitive phrases, and they probably don't even fall within the same theme (category).

if you go back half a year, I warned everyone that getting high-trust links with different anchor text (especially of different themes) will send their pages to -950 for those phrases they chose on their own for the nav links.


CainIV
posted 1:07 am on Mar 5, 2008 (gmt 0)

What I am getting at is that, combined, those HTML elements can be PART of the factor, since these ON-PAGE elements seem to be similar and fairly static to compare against, BUT there then has to be something else ADDED to them that can trip this filter.

Absolutely, and even the way you interlink the articles, or the sheer number of closely related articles with such similar titles, OR too-similar use of the SAME type of factors, could do it.

Have you seen any improvement at all since pages were cached in Google?

Have you taken a look at the content on the website and how it is phrased and written?

arubicus
posted 1:59 am on Mar 5, 2008 (gmt 0)

When the site is cached, I have seen some improvements to certain pages, yes, with the navigation structure changed and most elements the same, plus a few re-wordings of some elements.

Miamacs has mentioned (thanks for the link, Robert):

[webmasterworld.com...]

For more severe problems, -950 is more like -450, burying you in the middle. For less severe ones you get -20/-60/-120. Every single link will have a visible, measurable effect, making the site come back in huge leaps.

We've seen pages jump from -950 to the 400 range. We've seen pages that were linking to articles in the 400 range, where the exact article was 950'd, change so that the exact article is at 400 with no 950. Something in that post just seems to click for me.

Another quote from that post here:

Your page has a trust score; you get an incoming link that increases that score. If for some reason Google later labels that site as a link scheme, that page stops passing PR and your trust score is lowered. If the amount of trust your page loses in a certain time period crosses some predefined threshold, you get slapped with a penalty.

almost correct.

except your trust and PR are never lowered.
it's more like... 'put on hold'; it's overridden, but not lost.

It seems as if I have NO internal PR flow. PR hasn't been lost from our home page for years, but for anything beyond it, it has been a hellish ride.

Have you taken a look at the content on the website and how it is phrased and written?

Any suggestions on what to look for here?

potentialgeek
posted 2:40 am on Mar 5, 2008 (gmt 0)

Does anyone 950'd have comments about the average webpage length? Are sites with long pages getting penalized?

p/g

potentialgeek
posted 2:56 am on Mar 5, 2008 (gmt 0)

Even the way you interlink the articles, or the sheer number of closely related articles with such similar titles, OR too-similar use of the SAME type of factors, could do it.

In all these penalties you have to step away from what has happened to your site and ask, "What is Google attempting to do?"

Think like a spam detector. What would you look for?

Spammers try to shotgun blast to get as many different related words and phrases as possible. That, in my opinion, is the basis of the 950/phrase-spam penalties.

You may naturally write web pages that use many similar phrases, either because you have a long article and it's inevitable you'll use most of the similar phrases to avoid sounding repetitive, or because, like spammers, you're trying to get more traffic. You actually have a decent page, and you're not a spammer, but like me, you write pages trying to "pick up" more related search phrases, because that's a service to the public, helping people find what they want.

The problem: Google cannot differentiate between the spammer who uses all related search phrases and the webmaster whose writing style or SEO methods "blanket" many similar phrases.

There really isn't anything inherently wrong or unethical about "going after" similar search phrases. But now Google's phrase penalty causes lots of collateral damage.

There is nothing wrong with a website with depth and breadth that goes into a lot of similar, related content. But now Google can kill it because there are too many similar phrases.

The bottom line is you have to know what spam sites look like so you don't look like one.

p/g

steveb
posted 3:29 am on Mar 5, 2008 (gmt 0)

"The bottom line is you have to know what spam sites look like so you don't look like one."

But that's not a good solution. There are different types of spammers, and the 950 obviously is not attacking every type of spam. The type of spammer the 950 is attacking, imo, is the one trying to look like an authority site but isn't... most obviously, again, the pages on hacked .edu domains.

So the goal of trying not to look like a spammer is actually a goal of not looking like a high-quality, authority page. That's not a good goal.

950 is a tough penalty in part because it is an ambitious one. It penalizes pages that score very well. Pages that would normally rank #67 seldom have 950 problems, while at the same time, ranking #67 is more or less the same as ranking #950. If making your pages mediocre were the solution, you would not gain much anyway.

CainIV
posted 6:24 am on Mar 5, 2008 (gmt 0)

almost correct.

except your trust and PR are never lowered.
it's more like... 'put on hold'; it's overridden, but not lost.

Absolutely both are lowered; they 'can' be completely dumped in some cases, to the point where you have to rebuild them from scratch.

In terms of page content, sometimes it can be a matter of taking one -950 page and trying to write it completely naturally, as if you are not trying to optimize it.

Then point a few fresh links at it and watch it on the next cache.

arubicus
posted 3:23 pm on Mar 6, 2008 (gmt 0)

OK, now we've got caches returning for many of our pages. March caches, so that is good. Even though they are cached, putting the exact URL in the search box yields no results. Going to sit back for a few days and see if some of those pages return to the 950 range.

brinked
posted 4:52 pm on Mar 9, 2008 (gmt 0)

I am happy to report that my site is finally out of the -950 penalty. More specifically, my site's inner pages were all dead last for their respective terms. After much research I determined this had to be an "over optimization penalty," and it turns out I was right. I had the same phrase in my page title and H1 tag... all I simply had to do was remove the <h1> </h1> wrappers, and now, less than a week later... Freedom.
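For anyone who wants to check their own pages for that title/H1 duplication, here is a rough sketch (regex-based, which is fine for a quick audit; the sample markup is hypothetical):

# Flag pages whose <title> text exactly matches their <h1> text.
import re

def _text(fragment):
    # strip tags, collapse whitespace, normalize case
    return re.sub(r"\s+", " ", re.sub(r"<[^>]+>", "", fragment)).strip().lower()

def title_matches_h1(page_html):
    title = re.search(r"<title[^>]*>(.*?)</title>", page_html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", page_html, re.I | re.S)
    return bool(title and h1) and _text(title.group(1)) == _text(h1.group(1))

print(title_matches_h1("<title>Blue Widgets</title><h1>Blue Widgets</h1>"))  # True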

Advice to everyone here... if you are suffering from this penalty or something similar, try not to resort to starting your site from scratch, as I have seen mentioned here. You don't have to resort to drastic measures to get out of a penalty. Here are some pointers that will hopefully help some people.

- What was the last change you made to your site before you were penalized?
- Thoroughly go over your site to make sure you aren't overusing the same phrases.
- Don't panic and start over; start by changing one thing at a time.
- If you are in some sort of penalty, it is most likely an on-site problem; don't try to rectify it by messing with backlinks... focus on the site itself.
- Google is not stupid. Google wants to give rankings to natural sites. Don't go too crazy with on-site SEO; rather, focus on quality content.

steveb
posted 9:34 pm on Mar 9, 2008 (gmt 0)

Work release isn't freedom. You can fairly safely say you are out of the penalty about six months from now. It's normal for people to go in and out of the penalty, especially after tweaking any page in any way.

potentialgeek
posted 1:28 pm on Mar 10, 2008 (gmt 0)

I determined this had to be an "over optimization penalty," and it turns out I was right. I had the same phrase in my page title and H1 tag... all I simply had to do was remove the <h1></h1> wrappers, and now, less than a week later... Freedom.

I agree with the idea that it's an OO penalty (confirmed by Matt Cutts), but isn't it fairly common and natural in basic web design to match the title tag with the H1?

The "H" in H1 stands for Heading, of course. You could argue it's technically different from a "Title," even if it seems like splitting hairs.

I just checked one news site, and it puts the category of "Politics" as its H1 tag for an article on politics. Another big news site, however, uses the article title as the H1 tag.

Brett has this section of Webmaster World set up so a thread page has:

<h1>Google Search News</h1>

The thread title is the title tag. And the thread title itself doesn't even appear in any H tags (H2, H3, etc.)

e.g.

<b><font size="2" face="arial" color="#ffffff">Google's 950 Penalty - Part 13</font></b>

Has anyone else got the 950 penalty lifted by removing H1 tags?

I'm concerned Google could see repetition of an H1 category name in many pages as SPAM. But I wouldn't be surprised if matching tags make Google suspicious. They look auto-generated.

Removing matching tags sounds reasonable, but on your pages you should replace your old H1s with something, right? Esp. if the pages already have H2 tags.

p/g

Alex70
posted 2:23 pm on Mar 10, 2008 (gmt 0)

I hardly believe that the same title and H1 alone can cause this penalty, but if you also have the majority of IBLs with the same anchor text, it's more than a possibility.

almir
posted 4:07 pm on Mar 10, 2008 (gmt 0)

For those who think that the 950 or any other penalty doesn't have a link with manual bans (sometimes, not always), read the widgetsdirectory blog, and you'll see in his post an example showing that widgetsdirectory was banned manually.

[edited by: Robert_Charlton at 6:16 pm (utc) on Mar. 10, 2008]
[edit reason] removed specifics [/edit]

tedster
posted 4:10 pm on Mar 10, 2008 (gmt 0)

Yes, there definitely are manual bans and penalties at Google - but I don't think that the -950 or "end of results" re-ranking is one of those.

arubicus
posted 6:41 pm on Mar 10, 2008 (gmt 0)

Has anyone else got the 950 penalty lifted by removing H1 tags?

This is one thing I thought about testing, but I found too many examples that contradict the idea. It is common in most blogging platforms and CMSs for the H1 to match the title of the page.

It could be that too many elements using a specific phrase or keyword(s) is the trigger. Thus removing one of those elements, based on the "weight" of the element, could cause a bounce-back. In other words, the title and H1 elements carry huge weight, so removing the H1 element can reduce that weight.

Now, on the other hand, I have a few pages being tested with a different title and H1. No go so far. The H1 is still there, just with different wording. A few tests now also have completely different titles and H1s. Still no go.

tedster
posted 7:05 pm on Mar 10, 2008 (gmt 0)

I mentioned this in the -950 Quick Summary thread [webmasterworld.com], but it's probably worth repeating, as I think the idea is holding up quite well. Also it springs from study of Google's patents combined with some practical examples of fixes.

Key points that can make this -950 difficult to "fix":

1. The re-ranking is triggered by crossing a threshold.

2. The threshold can be different for different search terms.

3. The threshold can be different for different markets or website taxonomies.

4. The threshold is set by measuring and combining many different types of mark-up and grammatical factors, and not by absolutely measuring any one factor.

5. The threshold is NOT set absolutely across all web documents. So phrases in the travel space can be held to a different measure than, say, phrases in jewelry e-commerce.

The patents suggest scoring all kinds of areas, for example:

"[0042] ...grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks."

"[0133] ...whether the occurrence is a title, bold, a heading, in a URL, in the body, in a sidebar, in a footer, in an advertisement, capitalized, or in some other type of HTML markup." Note that measurements are suggested here for position on the page.

Going over the top with a "de-optimization" effort could deflate your pages to the point where they NATURALLY should rank at 950! So use a gentle touch, record your changes - and know that if you are just barely over some threshold then it might not take much to move you back.
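As a toy model of points 1-5 above (every weight, factor name, and threshold here is invented for illustration; the patents don't publish numbers):

# Combine many small per-factor scores and compare the total against a
# per-market threshold. All values below are hypothetical.
FACTOR_WEIGHTS = {
    "title": 3.0, "h1": 2.5, "bold": 1.0, "anchor": 2.0,
    "url": 1.5, "footer": 0.5,
}
MARKET_THRESHOLDS = {"travel": 20.0, "jewelry": 14.0}  # invented values

def spam_score(factor_counts):
    return sum(FACTOR_WEIGHTS.get(f, 0.0) * n for f, n in factor_counts.items())

page_counts = {"title": 1, "h1": 1, "bold": 6, "anchor": 4, "url": 1}
score = spam_score(page_counts)
print(score, "re-ranked" if score > MARKET_THRESHOLDS["jewelry"] else "ok")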


JeffOstroff
posted 9:25 pm on Mar 10, 2008 (gmt 0)

Tedster:

"When the patent states [0042]: grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks."

Are you saying that this means it COULD be considered spam if you put a search phrase as bold or in quotes, or as an H1 tag?

[edited by: JeffOstroff at 9:25 pm (utc) on Mar. 10, 2008]

tedster
posted 9:38 pm on Mar 10, 2008 (gmt 0)

Not exactly. The patent is discussing all kinds of factors that, when present beyond some measured threshold level, would be a sign of spam - or we could say "too much SEO." Clearly, featuring keywords and related phrases in some of these ways is only natural, unless it's done in an excessive fashion.

potentialgeek
posted 10:30 am on Mar 11, 2008 (gmt 0)

I think the 950 algorithm might add a combination of SEO issues into a final "Likely Spam" score. When your Overoptimization Score surpasses the threshold, i.e., the maximum amount which Google arbitrarily considers "not spam," you're 950d.

So the 950 algo looks at each sub-issue of the known spam issue: all the different tags (H1, H2, bold, etc.), i.e., all the small things you do to optimize your page, and then adds them all together for your "quality score."

The problem is that it uses so many SEO variables, and it's so specific to so many search phrases, that you can't easily, if at all, isolate the variables to analyze them and make changes. It's like thick soup--utter chaos! Extremely complicated.

I think one of the problems is webmasters get into the habit of optimizing many small things, because it's so easy to do, but the cumulative effect of optimizing everything is "overoptimization."

Everything adds up... into a 950 smackdown.

p/g

potentialgeek
posted 1:02 pm on Mar 11, 2008 (gmt 0)

> Going over the top with a "de-optimization" effort could deflate your pages to the point where they NATURALLY should rank at 950! So use a gentle touch, record your changes - and know that if you are just barely over some threshold then it might not take much to move you back.

Not so sure. The risk of getting 950d far outweighs the supposed benefits of many small SEO ideas (tags, bold, italics, etc.). I don't know that I'm going to bother with the little stuff anymore. Even before the 950, I had serious doubts about how much value there was in using italics, or H4 tags, or whatever.

Google has already virtually dismissed the keywords tag and apparently doesn't care much for the description tag, so why should we assume, given that attitude, that H1, H2, bold, italics, etc., have any value? I can't say that I've ever seen a site that bothered with this low-end SEO stuff doing much better than one that didn't. It's all about inbound links and title tags.

Choose a good title tag, get the IBLs, and fahgettabout the other stuff. More trouble than it's worth IME and IMO.

I'd bet you that one site with great IBLs, a reasonable amount of content per page, and a title--but no H tags, underlined text, bold text, or keyword-stuffed anchor text--would get better SERPs than a site with all this on-page over-optimization but no good inbound links.

If you stop and think about it, there's no good argument why Google should weight "perfectly designed" pages with "great" on-page optimization more heavily than sites with great inbound links. Or, for that matter, why it should give those on-page factors much value at all.

p/g

potentialgeek
posted 2:34 pm on Mar 11, 2008 (gmt 0)

Forgot to add:

I recently got one site's 950 penalty lifted. At least, it's been about a month, and I don't see it at the back of the line, and it's getting Google traffic for some good phrases.

Why? Not sure. I made the following changes, any of which, or any combination of which, could have lifted the penalty:

1) More content (incl. more pages and more text on those pages)--also a new section with 30 pages;

2) Removed all footer links (did this a while back, but it didn't help);

3) Changed navigation structure to standard (left vertical menu, Home at the top);

4) Removed Description and Keywords tags.

p/g

ALbino
posted 3:30 pm on Mar 11, 2008 (gmt 0)

My site has been -950'd for over a year now and we've never used H tags, so it definitely doesn't directly correlate to that. Maybe we should add some and see what happens? :)

potentialgeek
posted 5:19 pm on Mar 11, 2008 (gmt 0)

What about Keywords/Descriptions?

p/g

tedster
posted 5:27 pm on Mar 11, 2008 (gmt 0)

Keywords/description are meta tags - not visible on the page - so it makes some sense not to score them for ranking purposes. They were both so heavily spammed in the '90s that search engines could not hear any dependable relevance signal there. In order to improve the quality of results, they needed to move on.

There was a period where H1 was also heavily abused - it may not be scored as heavily as it once was, but it certainly is one place to look for "spam signals".

ALbino
posted 6:23 pm on Mar 11, 2008 (gmt 0)

I have keywords/descriptions, but they're not what I would consider spammy. The description is identical to the product description located on the pages; I would think changing it to something different would be more spammy, but maybe I'm wrong.

arubicus
posted 6:41 pm on Mar 11, 2008 (gmt 0)

ALbino

We used combinations of descriptions found on the page and completely unique ones. It makes no difference as far as I can tell.

arubicus
posted 8:24 pm on Mar 11, 2008 (gmt 0)

Can anyone tell me why this is...

If I use the site: command for a specific URL, the page is found (site:http://www.?.com)

If I just put the URL in there, the page isn't found

If I search the title (no quotes), it brings up a page linking to the page - -950, of course

If I search the title of the page linking to the page I was searching for, I get a -950 version of the page I was originally searching for.

topplacement
posted 11:09 pm on Mar 11, 2008 (gmt 0)

I am new to these forums, but I am an SEO!
I can see a pattern emerging here, and I would just like to say this: if you think you have optimised your site perfectly and you are still being penalised, then you are probably over-optimising. Over-optimisation will have the same effect as any other penalty. My advice is to tone it down and decrease the weight, but not the prominence.
