Forum Moderators: open
Thanks
All hypothetically speaking of course.
nobody would use such nasty tricks.
although I am tempted to say the least... ;-)
I use a page title containing 2 or 3 keywords. Then a page heading with 2 or 3 keywords only. Coupled with a 2 or 3 keyword filename I think you have a good start on an optimised page with a consistent theme.
Surely it wouldn't be too hard for the Google people to build into the algorithm:
if number of words in <h1></h1> > number x then ... penalty y :).
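Hypothetically, that sort of check is trivial to code up. A minimal sketch in Python, where the threshold and penalty values are pure invention for illustration (nobody outside Google knows the real algorithm):

```python
import re

# Purely hypothetical values -- stand-ins for the "x" and "y" above.
MAX_H1_WORDS = 10   # threshold x
PENALTY = 0.5       # penalty y

def h1_penalty(html: str) -> float:
    """Return a penalty if any <h1> holds more words than the threshold."""
    headings = re.findall(r"<h1[^>]*>(.*?)</h1>", html, re.IGNORECASE | re.DOTALL)
    for heading in headings:
        text = re.sub(r"<[^>]+>", " ", heading)  # strip any nested tags
        if len(text.split()) > MAX_H1_WORDS:
            return PENALTY
    return 0.0

print(h1_penalty("<h1>Short keyword heading</h1>"))        # 0.0
print(h1_penalty("<h1>" + "keyword " * 20 + "</h1>"))      # 0.5
```

Whether any engine actually does something like this is anyone's guess, which is rather the point of the thread.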
I just use HTML for what it is meant for and sleep a lot easier.
An open question... What's the max number of <hx> words in your well ranked site tags?
Thanks, I love this site.
I could have used a more viable topic, but I think it proves my point. If Google's algo started penalizing for too many words in a header, they would lose a significant amount of viable information, and lose credibility.
If Google's algo started penalizing for too many words in a header, they would lose a significant amount of viable information, and lose credibility.
The word "penalizing" gives the wrong impression. You are unlikely to be dropped from the index for overuse of the tag unless it is blatant spam. However, there may well be an optimal range for the number of words, or something similar. If your headers fall outside that range, you just may not get the ranking boost. You may therefore see a ranking slip, but you haven't been penalised in any way; you just haven't hit the best combination.
soapystar, I don't know if Googlebot would penalise for /robots.txt protected CSS files, but I imagine that a human reviewer would be unable to resist scrutinising such an arrangement.
By the way, I'm looking at bold and H1 in the new index and I still don't see an effect.
I think the H1 tags being discussed here are the ones that have been 'modified' to look like normal text or 1 point bigger, so the visitor won't notice.
It sounds like that may apply to some, but another angle is that pages with genuinely long headers may need those headers shortening to perform better (even if the discussion is now irrelevant from a Google optimising perspective).
The above is obviously based on the assumption that length of headers is a consideration in SE algorithms, which it may or may not be.
Is it OK to hyperlink H1 tags or does that perplex Googlebot?
Can't give a definitive answer but if I look at it from a user perspective...
It would seem unusual to have your main page header as a hyperlink as surely the objective of an <h1> is to tell the reader what the page subject is.
However, I have used hyperlinked sub headings (<h2> etc) in index navigation e.g look at the front page of webmasterworld. The link to "Private Forums" could have been rendered as an H2 if Brett had wanted to do that.
As usual, if it works for your users, it will almost certainly work for Google.
I am locking it out because I'm afraid some stupid SE could index 'em, although I've never really seen this.
But basically: as CSS and JavaScript files are linked in the HTML, bots might want to check 'em out. But there's nothing in them for a bot to index, so I exclude those files to spare them the unnecessary crawling and save bandwidth. I've been doing this for years now and have faced no problem at all with any SE.
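For reference, the sort of robots.txt exclusion being described might look something like this (the /css/ and /js/ paths are assumptions; adjust them to wherever your stylesheets and scripts actually live):

```
User-agent: *
Disallow: /css/
Disallow: /js/
```

Note that a blanket Disallow like this applies to all well-behaved crawlers; whether blocking your CSS looks suspicious to a human reviewer is the open question raised earlier in the thread.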
A good point I have raised elsewhere Dr Oliver.
Penalising sites for legitimate activities that CAN be abused is unworkable and can only damage Google's index. Spiders do not need my stylesheet because they cannot see; surely this is obvious?
Then I looked at the robots.txt: they allow looking at the CSS, and it says he got the robots.txt from searchengineworld.com... He/She is one of us...
At least now I know why my site got bumped to number 2/3 in the serps.
I haven't kept up too much on this thread, but I have read theories that go like this:
1 - You should think of the header tags and p tags as 'ranking' your text within your site, i.e. H1 text is more important than H2, etc.
2 - Too many words contained within a tag can dilute its effect.
If your pages are otherwise 'identical', the difference between you and your competition is that maybe your competitor is better at getting his/her keywords into 'important' places in their code.
Just a thought.
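The 'ranking within your page' theory above can be sketched as a simple weighted keyword score. The tag weights here are pure guesswork, chosen only to illustrate the idea that an H1 might count for more than an H2 or a paragraph:

```python
# Weights per tag are invented for illustration -- not real algorithm values.
TAG_WEIGHTS = {"h1": 3.0, "h2": 2.0, "p": 1.0}

def keyword_score(sections: list[tuple[str, str]], keyword: str) -> float:
    """Sum occurrences of a keyword, weighted by the tag it appears in."""
    score = 0.0
    for tag, text in sections:
        weight = TAG_WEIGHTS.get(tag, 1.0)
        score += weight * text.lower().split().count(keyword.lower())
    return score

page = [
    ("h1", "blue widgets"),
    ("h2", "cheap widgets"),
    ("p", "we sell widgets and more widgets"),
]
print(keyword_score(page, "widgets"))  # 1*3 + 1*2 + 2*1 = 7.0
```

On this (entirely speculative) model, stuffing extra words into the H1 wouldn't raise the score for any one keyword, which is consistent with the dilution theory in point 2.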
this is true, but I have seen increasing numbers of sites doing well with techniques like this in the SERPs. Although you have to keep in mind that people using CSS trickery probably have some other things going on too...
Yes, some of us have validated markup. Google likes that too. Before using good markup and CSS, our company site was up around 35kb of HTML markup (prior to going live). Using CSS and proper markup (whichever *ml you choose) we're down to about 5k.
We got really lucky with our listing. At first crack we showed up #1 for our words on google. Luck was involved, but it should be noted that we are in the web design business, and we outranked other SEO & design companies on the first try.
Things are now bouncing around in the Everflux (especially with today's little tremor), but what I learned from our luck was consistency, not tricks.