Forum Moderators: open
However, I decided to experiment a bit. Using the kw once, and only in the H1, the page which was nowhere in the serps has now reappeared.
From my *very limited* experiment it does appear overuse of kw's in <Hx> can trip a filter.
This is backed up by other pages on my site. Where I had previously used CSS classes to style titles ('maintitle', 'sectiontitle', etc.), the pages vanished from the serps when those titles were changed to <Hx>.
Why on God's Green Earth would Google penalize what the W3C recommends, anyway? There are soooo many on-page and off-page factors in the SERPs, it's impossible to test for them all.
Your experiment falls into the "post hoc, ergo propter hoc" fallacy: it happened after this, therefore it was on account of this.
From my *very limited* experiment it does appear overuse of kw's in <Hx> can trip a filter.
There's surely no harm in experimentation as long as no false conclusions are drawn. I don't know what Google does but it would make sense not to reward H tags that are repeated many times with the same keywords, otherwise its overuse might be encouraged and the tag itself would cease to have as much structural meaning (which would presumably be going against W3C's aims).
One of my pages is number one for the keywords I want, and I have used H1, H2 and H3 to divide up the page into a hierarchical description of the page...
For example, it's a bit like this...
<H1>My widgets page</H1>
<H2>Widgets - a special case of widgettes</H2>
<H3>All about my Widgets</H3>
steve's question is phrased in such a way that it gives the impression that Google "knows" that he's put keywords into the headers. Keywords are simply ordinary words that you want people to use in their search terms, they aren't special in any other way.
I see no reason why steve's argument should hold water, and I can certainly give you a counter-example from my site which shows that using the H constructs properly is an excellent thing to do...
DerekH
impression that Google "knows" that he's put keywords into the headers
Perhaps Google looks at the title, Hx, and body text for matching uncommon words?
Certainly my pages that use CSS to format headings instead of Hx do better in the serps.
Interestingly, one that uses <Hx> ranks higher for a phrase it is not seo'd for than for the one it is.
Of course I can see why you might want variants on several H tags in the normal course of things, and wouldn't argue against it.
Also I believe personal experimentation and further enquiry is a good thing in principle.
I use headings just as they are supposed to be used... for headings.
I have never had a problem with that. In fact, I have had some very good results.
Do you have your kw in every heading, or do you mix things up a bit?
i.e. <H1>Red Widgets</H1> <H2>Feeding red widgets</H2> <H2>Caring for red widgets</H2>
or
<H1>Red Widgets</H1> <H2>How to feed 'em</H2> <H3>Caring for them</H3>
A penalty is what happens when there is overwhelming evidence that you are trying to cheat the algo. These are almost always manually applied or applied to groups of sites that use a specific trick du jour.
A filter will look for certain things and only let those that pass certain requirements through. I could definitely see them having a filter to reduce the ranking of a page due to what seems to be unnatural overuse of the keywords in the header tags, but that would require a lot more than just putting it in one H1 tag.
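To make the idea concrete, here is a sketch of what such a filter *might* look like. This is pure guesswork for illustration, not anything Google has confirmed; the threshold and the regex-based heading extraction are my own assumptions.

```python
import re

def heading_keyword_ratio(html: str, keyword: str) -> float:
    """Fraction of Hx headings that contain the keyword (hypothetical heuristic)."""
    headings = re.findall(r"<h[1-6][^>]*>(.*?)</h[1-6]>", html, re.I | re.S)
    if not headings:
        return 0.0
    hits = sum(1 for h in headings if keyword.lower() in h.lower())
    return hits / len(headings)

def looks_overoptimized(html: str, keyword: str, threshold: float = 0.8) -> bool:
    """Trip the (imaginary) filter only when nearly every heading repeats the keyword."""
    return heading_keyword_ratio(html, keyword) >= threshold
```

Under this toy model, a page whose every heading repeats "red widgets" would trip the filter, while a page that mixes its heading wording would sail through — which is consistent with the observation that one H1 alone isn't enough to cause trouble.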
Algorithmic weighting is just what google considers important in a page at any given time. This can change several times a month. Unless you can be sure that your two pages are completely identical in every way (on and off page, at the same time, with data from the same crawl), other than the factor you are testing, you cannot possibly be sure that the factor you are looking at is the reason for a difference in ranking. You can start making an educated guess when your sample size is thousands of pages, but it is still a guess.
Now what might be the case is that just adding that one header with just the keyword in it tipped the scales and made that page appear to be a bit unnatural. But you could just as easily blame any other factor that also contributed to that situation. What if you removed the keyword from the title, or mixed up your anchor text a little bit?
Site map style pages do often rank well, mostly because of the repetition of a keyword in the anchor text. But these pages may not be tripping the filter we are talking about. If the filter is triggered, other factors may push the page up the serps. I believe the filter, if triggered, just means the phrase or keyword is ignored or the h1 factor is dampened. If Hilltop is being applied, the page may still do well, without the benefit of the Hx which has been ignored or dampened.
Steve may well have demonstrated that by reducing the repetition of the keyword in commonly used seo tactics such as hx, title, anchor etc., he has avoided the filter, and the hx is now providing him with the boost required to get back a good position. Other factors are of course important, but nevertheless it is interesting that the change he made may have had an effect. It is as conclusive as any other potential test and I don't think it should be dismissed.

Many ranking theories can be trashed as illogical or unreasonable, but Google cannot rank millions of pages by being logical and reasonable to everyone. They are restricted by their technology, so I believe they keep things simple. This results in perfectly reasonable and relevant pages sometimes not ranking well... life's a ****.

The position I think they have taken is that if a page has not got the relevant text in either the title, anchor or headings, then it is probably not relevant. In addition, if the phrase is in all these places, it may be the result of seo awareness. Experience has dictated that they can achieve good results by taking the middle ground. They then produce a set of results. Next, they apply Hilltop and other factors to reshuffle these results; thus the initial poor ranking due to the filter may improve, or may continue to wallow low down.
Nothing is easy to prove, but I had a similar experience where I reduced the matching phrases throughout the page and improved my position. After Florida, many sites dropped and then returned a few months later without making these types of changes, but I believe it was Hilltop that helped them regain their positions. Google improved the Hilltop filter, and thus, despite sites initially dropping because of too much hx/title/anchor match, they regained their rankings because of 'on theme' links in from authority sites and other factors.
In short, an hx/title/anchor match does you no favours. If you are strong on other factors you may still rank well, but sites that don't trigger this filter can rank well despite not having good links in and high relevant 'local rank'. Pure on-page optimisation can still work wonders, but as with everything, there is no definitive tactic that will guarantee you better rankings.
In the thread subject you mention a penalty, yet in your first comment you mention a filter. The first thing you should do is realize that there is a big difference between a penalty, a filter and general algorithmic weighting.
Sorry, lazy writing! I meant to say filter.
The issue is a title+hx match, on a competitive keyword phrase, along with a possible exact anchor text match. Too much can be bad.
This could be my problem. I have followed the formula:-
KW - once in title
- once in description
- once in <h1>, <h2>, <h3>
- once in first paragraph, last paragraph and throughout the body text
- once in url
- once in alt tags
- in inbound and outbound anchor text
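The checklist above can be checked mechanically. The following is my own illustrative sketch (not a tool anyone in this thread uses) of how you might audit a page's HTML against part of the formula; real pages would deserve a proper HTML parser rather than these rough regexes.

```python
import re

def audit_keyword_placement(html: str, kw: str) -> dict:
    """Count occurrences of kw in a few of the formula's slots (rough sketch)."""
    kw_re = re.escape(kw)
    flags = re.I | re.S

    def count(pattern: str) -> int:
        return len(re.findall(pattern, html, flags))

    return {
        "title":    count(rf"<title[^>]*>[^<]*{kw_re}"),
        "headings": count(rf"<h[1-6][^>]*>[^<]*{kw_re}"),
        "alt":      count(rf'alt="[^"]*{kw_re}'),
        "anchors":  count(rf"<a [^>]*>[^<]*{kw_re}"),
    }
```

A page following the formula to the letter would show a hit in every slot at once — which, per the discussion above, is exactly the pattern suspected of tripping the filter.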
The lesson is probably to wind back the use of the kw, get listed in the serps, then turn it back up slowly to get the serps position you want.
This being the case, where should I remove or reduce the kw from initially?
"I suck at SEO filter" = too much hx+title+anchor etc.
Yep, that sums me up. But with the help of the people here I'm trying to learn!
Do you have your kw in every heading, or do you mix things up a bit?
i.e. <H1>Red Widgets</H1> <H2>Feeding red widgets</H2> <H2>Caring for red widgets</H2>
or
<H1>Red Widgets</H1> <H2>How to feed 'em</H2> <H3>Caring for them</H3>
Sorry for the delay - I can't get this site to load properly at the moment - it keeps hanging up on me.
If you take the Hx tags as outline levels, then the recommendation for making a good overview of your page is to use a single <H1> and then any mix you like of H2, H3 and H4, except that you mustn't skip a level.
So H2 H3 H4 is fine, but H2 H4 isn't.
H2 H2 is fine - it simply means that the first sub-heading doesn't have sub-headings of its own.
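DerekH's rule can be captured in a tiny checker. This is my own illustration of the rule as stated, not any standard API: a heading sequence is valid if it starts at H1 and never jumps more than one level deeper at a time, though repeating a level or jumping back up is fine.

```python
def valid_outline(levels: list[int]) -> bool:
    """Return True if a sequence of heading levels never skips a level downward.

    [1, 2, 3, 4] is fine; [1, 2, 4] skips H3 and is not.
    Repeating a level (H2 H2) or moving back up (H3 then H2) is allowed.
    """
    prev = 0  # "level zero" so the first heading must be an H1
    for level in levels:
        if level > prev + 1:  # jumped more than one level deeper
            return False
        prev = level
    return True
```

So `valid_outline([1, 2, 3, 4])` and `valid_outline([1, 2, 2])` pass, while `valid_outline([1, 2, 4])` fails, matching the H2-H4 example above.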
Your comment about using CSS instead of Hx is interesting, because I actually use both - I use CSS to scale down the changes in magnification as one goes up the headings, so my H2 is only a little larger than my H3.
I regard the H construct as nothing more devious, and nothing more complicated, than a way of letting me label sections of my page.
Sorry for the delay in replying.
DerekH
We've proved to our own satisfaction the existence of filters that filter out pages with too much 'optimization'. We've been able to move pages in and out of the SERPs by modifying elements. The thing is, there are many elements that, with respect to this discussion, are co-dependent.
So, while it may appear to some (who focus too much on one particular element, e.g., Hx) that the element is problematic, it is in fact the whole picture that must be considered, and a change to any one element may be enough to send a page to the depths, or bring it back again... which leads to all sorts of false conclusions.
The reason some get blasted for kw implementations in HX elements is likely that they've also got too many other elements pushing the SEO envelope at the same time.