
h1, h2, h3 penalty revisited

Does using a kw in h1, h2, h3 cause a penalty?


steve

1:40 pm on Jun 25, 2004 (gmt 0)

10+ Year Member



A while ago in [webmasterworld.com...] I asked whether using a kw in H1, H2, and H3 could cause a penalty. The general agreement was that it would not.

However, I decided to experiment a bit. Using the kw once, and only in the H1, the page, which was nowhere in the SERPs, has now reappeared.

From my *very limited* experiment it does appear overuse of kw's in <Hx> can trip a filter.

This is backed up by other pages on my site where I previously used CSS to create titles, with classes called 'maintitle', 'sectiontitle', etc. They vanished from the SERPs when changed to <Hx>.

jo1ene

1:56 pm on Jun 25, 2004 (gmt 0)

10+ Year Member



Using header tags is a natural and appropriate way to organize the textual content of your page. If I have an article about SEO, wouldn't a user expect to see "Search Engine Optimization", or some such thing, in big, bold letters at the top of the page? And what's the accepted standard for big, bold letters? <HX></HX> Right?

Why on God's green earth would Google penalize what the W3C recommends, anyway? There are soooo many on-page and off-page factors in the SERPs that it's impossible to test for them all.

Your experiment falls into the "post hoc, ergo propter hoc" fallacy. It happened after this, therefore it was on account of this.

Patrick Taylor

2:12 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From my *very limited* experiment it does appear overuse of kw's in <Hx> can trip a filter.

There's surely no harm in experimentation as long as no false conclusions are drawn. I don't know what Google does, but it would make sense not to reward H tags repeated many times with the same keywords; otherwise overuse would be encouraged and the tag itself would cease to have as much structural meaning (which would presumably go against the W3C's aims).

DerekH

2:19 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I go with jo1ene, but I go further.

One of my pages is number one for the keywords I want, and I have used H1, H2 and H3 to divide the page up into a hierarchical description of its content...

For example, it's a bit like this...
<H1>My widgets page</H1>
<H2>Widgets - a special case of widgettes</H2>
<H3>All about my Widgets</H3>

steve's question is phrased in such a way that it gives the impression that Google "knows" that he's put keywords into the headers. Keywords are simply ordinary words that you want people to use in their search terms; they aren't special in any other way.

I see no reason why steve's argument should hold water, and I can certainly give you a counter-example on my site which shows that using the H constructs properly is an excellent thing to do...
DerekH

steve

2:37 pm on Jun 25, 2004 (gmt 0)

10+ Year Member



impression that Google "knows" that he's put keywords into the headers

Perhaps Google looks at the title, Hx, and body text for matching uncommon words?

Certainly my pages that use CSS to format headings instead of Hx do better in the SERPs.

Interestingly, one page that uses <Hx> ranks higher for a phrase it is not SEO'd for than for the one it is.

Patrick Taylor

2:38 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



DerekH, I'm not actually going with anyone in particular. However, I've seen sites (as I suppose many others have) with maybe 20 instances of H1 stacked up together with exactly the same phrase, and I would hope that isn't being rewarded more than if the tag with that phrase was used only once.

Of course I can see why you might want variants on several H tags in the normal course of things, and wouldn't argue against it.

Also, I believe personal experimentation and further enquiry are good things in principle.

steve

2:42 pm on Jun 25, 2004 (gmt 0)

10+ Year Member



Your experiment falls into the "post hoc, ergo propter hoc" fallacy

Possibly, though I changed nothing else on the page. It could have been affected by off-page factors.

I agree <Hx> is the best and proper way to structure pages, but if Google thinks it's too SEO'd it might throw the page out.

g1smd

3:43 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I use headings just as they are supposed to be used... for headings.

I have never had a problem with that. In fact, I have had some very good results.

steve

3:51 pm on Jun 25, 2004 (gmt 0)

10+ Year Member



I use headings just as they are supposed to be used... for headings.
I have never had a problem with that. In fact, I have had some very good results.

Do you have your kw in every heading, or do you mix things up a bit?

i.e. <H1>Red Widgets</H1> <H2>Feeding red widgets</H2> <H2>Caring for red widgets</H2>

or

<H1>Red Widgets</H1> <H2>How to feed 'em</H2> <H3>Caring for them</H3>

MHes

4:32 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The issue is a title+Hx match, on a competitive keyword phrase, along with a possible exact anchor-text match. Too much can be bad.

W3C has nothing to do with good spider food.

ogletree

4:48 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



MHes, that is not true and has never been proven. I rank for things doing just that. I have a site map that ranks really well for a term because the page has the keyword in 80 links. I also rank for things that do exactly what you say. There has never been an over-optimization filter. There is just an "I suck at SEO" filter.

pleeker

4:55 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is just an "I suck at SEO" filter

LOL! :)

BigDave

5:01 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In the thread subject you mention a penalty, yet in your first comment you mention a filter. The first thing you should do is realize that there is a big difference between a penalty, a filter and general algorithmic weighting.

A penalty is what happens when there is overwhelming evidence that you are trying to cheat the algo. These are almost always manually applied or applied to groups of sites that use a specific trick du jour.

A filter will look for certain things and only let those that pass certain requirements through. I could definitely see them having a filter to reduce the ranking of a page due to what seems to be unnatural overuse of the keywords in the header tags, but that would require a lot more than just putting the keyword in one H1 tag.

Algorithmic weighting is just what google considers important in a page at any given time. This can change several times a month. Unless you can be sure that your two pages are completely identical in every way (on and off page, at the same time, with data from the same crawl), other than the factor you are testing, you cannot possibly be sure that the factor you are looking at is the reason for a difference in ranking. You can start making an educated guess when your sample size is thousands of pages, but it is still a guess.

It might be the case that just adding that one header, with just the keyword in it, tipped the scales and made the page appear a bit unnatural. But you could just as easily blame any other factor that contributed to that situation. What if you removed the keyword from the title, or mixed up your anchor text a little?

MHes

5:26 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"I suck at SEO filter" = too much hx+title+anchor etc.

IMHO moderation is the key for SEO

pleeker

6:05 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



IMHO moderation is the key for SEO

I would clarify that to say moderation is a key for SEO where Google is concerned. But moderation doesn't work so well with Yahoo and other SEs.

MHes

6:44 pm on Jun 25, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm sure Steve is the first to accept that 'tests' on SERPs and algos that are forever shifting are never conclusive.

Site map style pages do often rank well, mostly because of the repetition of a keyword in the anchor text. But these pages may not be tripping the filter we are talking about. If the filter is triggered, other factors may push the page up the SERPs. I believe the filter, if triggered, just means the phrase or keyword is ignored or the H1 factor is dampened. If Hilltop is being applied, the page may still do well without the benefit of the Hx which has been ignored or dampened.

Steve may well have demonstrated that by reducing the repetition of the keyword in commonly used SEO tactics such as Hx, title, anchor etc., he has avoided the filter, and the Hx is now providing him with the boost required to get back a good position. Other factors are of course important, but nevertheless it is interesting that the change he made may have had an effect. It is as conclusive as any other potential test, and I don't think it should be dismissed.

Many ranking theories can be trashed as illogical or unreasonable, but Google cannot rank millions of pages by being logical and reasonable to everyone. They are restricted by their technology, so I believe they keep things simple. This results in perfectly reasonable and relevant pages sometimes not ranking well... life's a ****.

The position I think they have taken is that if a page has not got the relevant text in either the title, anchor or headings, then it is probably not relevant. In addition, if the phrase is in all these places, it may be the result of SEO awareness. Experience has dictated that they can achieve good results by taking the middle ground. They then produce a set of results. Next, they apply Hilltop and other factors to reshuffle these results; thus the initial poor ranking due to the filter may improve, or may continue to wallow low down.

Nothing is easy to prove, but I had a similar experience where I reduced the matching phrases throughout the page and improved my position. After Florida, many sites dropped and then returned a few months later without making these types of changes, but I believe it was Hilltop that helped regain their positions. Google improved the Hilltop filter, and thus despite sites initially dropping because of too much Hx/title/anchor match, they regained their rankings because of 'on theme' links in from authority sites and other factors.

In short, an Hx/title/anchor match does you no favours. If you are strong on other factors you may still rank well, but sites that don't trigger this filter can rank well despite not having good links in and high relevant 'local rank'. Pure on-page optimisation can still work wonders, but as with everything, there is no definitive tactic that will guarantee you better rankings.

steve

8:37 am on Jun 26, 2004 (gmt 0)

10+ Year Member



In the thread subject you mention a penalty, yet in your first comment you mention a filter. The first thing you should do is realize that there is a big difference between a penalty, a filter and general algorithmic weighting

Sorry, lazy writing! I meant to say filter.

The issue is a title+Hx match, on a competitive keyword phrase, along with a possible exact anchor-text match. Too much can be bad

This could be my problem. I have followed the formula (a rough sketch of such a page follows the list):

KW - once in the title
- once in the description
- once in <h1>, <h2>, <h3>
- once in the first paragraph, the last paragraph, and throughout the body text
- once in the URL
- once in the alt tags
- in inbound and outbound anchor text
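For illustration, a bare-bones sketch of a page built to that formula (the red-widgets wording and the example.com URL are placeholders, not my real site):

<html>
<head>
<title>Red Widgets</title>
<meta name="description" content="Red widgets: choosing, feeding and caring for them.">
</head>
<body>
<h1>Red Widgets</h1>
<p>Red widgets are... (kw in the first paragraph)</p>
<h2>Feeding red widgets</h2>
<h3>Caring for red widgets</h3>
<img src="red-widgets.jpg" alt="red widgets">
<p>More about <a href="http://www.example.com/red-widgets.html">red widgets</a> (kw in the URL, the anchor text, and the last paragraph)</p>
</body>
</html>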

The lesson is probably to wind back the use of the kw, get listed in the SERPs, then turn it back up slowly to get the position you want.

This being the case, where should I remove or reduce the kw first?

"I suck at SEO filter" = too much hx+title+anchor etc.

Yep, that sums me up. But with the help of the people here I'm trying to learn!

DerekH

12:23 pm on Jun 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Steve wrote
Do you have your kw in every heading, or do you mix things up a bit?

i.e. <H1>Red Widgets</H1> <H2>Feeding red widgets</H2> <H2>Caring for red widgets</H2>

or

<H1>Red Widgets</H1> <H2>How to feed 'em</H2> <H3>Caring for them</H3>

Sorry for the delay - I can't get this site to load properly at the moment - it keeps hanging up on me.

If you treat the Hs as outline levels, then the recommendation for making a good overview of your page is to use a single <H1> and then any mix you like of H2, H3 and H4, except that you mustn't skip a level.
So H2 H3 H4 is fine, but H2 followed directly by H4 isn't.
H2 H2 is fine - it simply means that the first sub-heading doesn't have sub-headings of its own.
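For instance (made-up widget headings, purely to show the pattern), this outline is fine:

<H1>Widgets</H1>
<H2>Types of widget</H2>
<H3>Red widgets</H3>
<H2>Caring for widgets</H2>

but this one skips a level:

<H1>Widgets</H1>
<H3>Red widgets</H3> <!-- H2 missed out -->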

Your comment about using CSS instead of Hx is interesting, because I actually use both - I use CSS to scale down the size jumps between heading levels, so my H2 is only a little larger than my H3.
I regard the H construct as nothing more devious than a straightforward way of labelling sections of my page.
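Roughly this sort of thing in the stylesheet (the exact percentages are only illustrative):

<style type="text/css">
h1 { font-size: 140%; } /* tamer than the default browser jumps */
h2 { font-size: 120%; }
h3 { font-size: 110%; } /* only a little smaller than h2 */
</style>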

Sorry for the delay in replying.
DerekH

caveman

5:39 pm on Jun 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What BigDave said.

We've proved to our own satisfaction the existence of filters that screen out pages with too much 'optimization'. We've been able to move pages in and out of the SERPs by modifying elements. The thing is, there are many elements that, WRT this discussion, are co-dependent.

So, while it may appear to some (who focus too much on one particular element, e.g., Hx) that the element is problematic, it is in fact the whole picture that must be considered, and changes to any one element may be enough to send a page to the depths, or bring it back again... which leads to all sorts of false conclusions.

The reason some get blasted for kw implementations in HX elements is likely that they've also got too many other elements pushing the SEO envelope at the same time.

HitProf

9:52 am on Jun 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From my *very limited* experiment it does appear overuse of kw's in <Hx> can trip a filter.

Isn't the word "overuse" the trigger?

ogletree

4:25 am on Jun 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I still don't believe there is an over op filter. I have sites doing fine and they break all these rules. As a matter of fact, that is why they are doing well. The key is getting a site flagged as an authority. An authority site can do whatever it wants. There are no rules. You need an old site with lots of old links.

mann

5:53 am on Jun 28, 2004 (gmt 0)

10+ Year Member



<<I still don't believe there is an over op filter>>

Believe it or not, there is: if you over-optimise your site, it gets a penalty in the form of reduced PR or no PR.

<<You need an old site with lots of old links>>

I personally agree with you, as Google doesn't index new sites quickly or well.

ogletree

2:34 pm on Jun 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I said I don't believe it because I have seen it and done it. You can keep wearing a red shoe when you put up new pages because you once ranked badly when you forgot to wear one red shoe. Keep looking: there is another reason for your problems. Google does not update everything at once, so there is no way to really do an effective test of this stupid theory. If it were true, then why do over-optimized sites still do very well? I do what works, I don't follow a bunch of foolish amateur advice, and I make a lot of money off of Google.

MHes

2:41 pm on Jun 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"If it were true, then why do over-optimized sites still do very well?"

Hilltop

"..and make a lot of money off of Google."

Good for you, but that doesn't win the argument. Perhaps a lot of money to you is peanuts to someone else :)

nuevojefe

3:50 am on Jun 29, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



An authority site can do whatever it wants.

- Definitely some elements of truth in that.