
Google News Archive Forum

Does Google Remove Pages with H1 Tags?
bekyed




msg:202425
 6:32 pm on Nov 23, 2003 (gmt 0)

I think the algorithm is excluding pages with heading 1 tags, as none of the sites in the top 10 for my industry have them.
What are your views on this? And yes, all our competitors had H1 tags before plummeting.

What do you think?

Bek.

 

Herenvardo




msg:202486
 9:48 am on Nov 25, 2003 (gmt 0)

I have a personal website that I SEO in my free time. So far it has one inbound link (the site is relatively new, and the link is very targeted) and PR2. Searching for my main keyphrase, I rank 10th of about 11,400 results. Above me are sites with PR 6, 7 and even 8; below me are many sites with PR 4 and 5 that are closely related to the search terms.
These, I would guess, are the keys to my success:

1. Correct use of Hx tags
Each page has a very short title tag, like "myname - links" for the links page. At the top of each page there is an H1 tag with a more descriptive title, like "Links to related pages" (these examples are not exact; they are translations). After that, H2 tags mark each section of the file and, where needed, subsections are marked with H3 or even H4. This lets both spiders and users get a fast overview of each file before reading it (see the sketch at the end of this list).

2. Correct use of alternative tags
Many people believe that frames are an SEO's enemy, but that is not true. Google does not penalize pages for using frames; it is simply unable to find the files that would be shown in each frame. So a <noframes> tag containing a link to a sitemap or index gets a framed site fully spidered by Google. Together with <noframes>, I use (and abuse!) <noscript> and even alt text for images!

3. Think of users, not of money
This might seem the most doubtful point, but it is the one that works best. Google is a service for the web's users, not for business owners. Make a page useful for visitors, thinking of your visitors, and its rankings will improve.

4. Links and PR
Not much to say about this. Simply remember that it is less important than a good page. Even a #1 ranking only makes more people visit the page; if you are selling a product or service, you still have to convince each visitor to buy something.
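
For illustration, here is a minimal sketch of the heading structure from point 1 (the file name and heading texts are made-up examples, not my real pages):

<html>
<head>
<title>myname - links</title>
</head>
<body>
<h1>Links to related pages</h1>
<h2>Search engines</h2>
<p>...</p>
<h2>Web design resources</h2>
<h3>HTML references</h3>
<p>...</p>
</body>
</html>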

With this, I hope I have settled the question of the H1 tags (if anybody was still doubtful) and some other questions on the same topic.
It has been a long post about web design, and I'm not sure it's completely on-topic, so without further delay I give my greetings to all of you.

Regards,
Herenvardö

Hissingsid




msg:202487
 10:17 am on Nov 25, 2003 (gmt 0)

I have a personal website that I SEO in my free time.

Hi,

I think it is very clear that the effect we are all harping on about hits only commercial search terms, which is one piece of evidence for the existence of a "hit list" of two-word phrases that are filtered.

Whilst your post is no doubt correct, it adds nothing to the debate about the effect we are seeing following the last update. Yours is a personal website, so I'm assuming the search term you refer to is not on the filter list and therefore has not been affected. Try searching for your term with -fufufu appended. If other sites move into the SERPs, take a look at their source and try to figure out why they were previously dropped.
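
For example, with placeholder keywords (the -fufufu exclusion only works because no real page contains that nonsense word, so excluding it should change nothing - which is why any difference in results suggests a filter):

widget insurance
widget insurance -fufufu

The first query supposedly returns the filtered SERPs, the second the unfiltered ones.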

I have a site for "widget clubs". It is as over-optimized as one of my commercial sites, but it is not affected at all.

I don't think we really need a lecture on how to produce a personal web page. I would like to see more analysis of two-word terms that have been affected, and of what is seen when the filter is switched on and off using -fufufu or some other -string in searches. Perhaps then we can narrow down what is and is not now acceptable to Google.

I don't mean to have a go at you, but this is too important for off-message noise to deflect us from finding some real answers.

Best wishes

Sid


swones




msg:202489
 10:24 am on Nov 25, 2003 (gmt 0)

>Together with <noframes>, I use (and abuse!) <noscript> and even alt text for images!

Are you saying that you DO endorse abusing the noframes and alt tags? Can you clarify?

Thanks.

Simon.

Dave_Hawley




msg:202490
 10:38 am on Nov 25, 2003 (gmt 0)

Whilst your post is no doubt correct, it adds nothing to the debate about the effect we are seeing following the last update

Sid, I think it might be you who has the wrong thread. The subject of this one is "Does Google Remove Pages with H1 Tags?"

Dave

Marval




msg:202491
 10:48 am on Nov 25, 2003 (gmt 0)

Hissingsid - first off, you might have a go at making pages the way Herenvardo recommended - you might have better success in Google.

That said, I don't believe anyone has asked this question: I see a lot of people using this double-negative filter checker - did anyone happen to check whether it worked before this update?

There are hundreds of factors that Google uses in ranking pages - and hundreds of ways each of us ranks extremely well for high-dollar money terms, all within the basic guidelines of HTML, the W3C and Google's webmaster guidelines. Many of us are sitting on the sidelines watching all this talk of dictionary terms and conspiracy plots, knowing that none of it can possibly be in effect - if it were, we wouldn't still be ranking so high.
You might take a more basic look at your sites - your linking patterns, your linking partners and the overall quality of each page that was affected - to determine whether the small things, like validation and a good surfer experience, are true of your pages.

When I see questions coming up in threads like "why is Google botting images?", it's obvious that some people need to do more homework before they start posting conspiracy theories - especially those who have been in this for a short time and are calling themselves SEOs.
Just my take.

Hissingsid




msg:202492
 11:18 am on Nov 25, 2003 (gmt 0)

Sid, I think it might be you who has the wrong thread. The subject of this one is "Does Google Remove Pages with H1 Tags?"

Hi,

You can only answer this question by looking at other factors and at whether sites with <h> tags have been removed from the SERPs.

The answer, as I've said earlier, is that one of my sites has been ditched and it does have properly formed <h1> and <h2> tags. The new top ten in the SERPs for the two-word term in question either have no <h> tags at all or have badly formed ones.

My point is this: it is true that sites with properly formed <h> tags have been dropped in favour of sites without them, but we need to do a lot more analysis to establish what has changed to make such a dramatic difference in results.

My page is still as useful, readable and full of useful content as it previously was. It is focused on a single niche-market insurance product and therefore has, by default, to use the two words which generically describe the product - the very words which have been hit by Google.

In summary, <h> tags are part of the problem for pages that have been affected, but I don't think that removing them is part of the solution.

Best wishes

Sid

Hissingsid




msg:202493
 11:43 am on Nov 25, 2003 (gmt 0)

Marval recently said

Hissingsid - first off, you might have a go at making pages the way Herenvardo recommended - you might have better success in Google.

Hi Marval,

My point is that this goes without saying and is such a given that there is no need to reiterate it here. We are talking about a specific event that has happened: sites which previously appeared at #1 in the SERPs have been dropped.

So you explain to me what has changed to make such a dramatic difference. Unless we try to rule out some of these hypotheses and test others, how can we reach a conclusion? Something has definitely changed. My affected pages have not changed; Google has changed. If it had 100 rules before, it has 101 now, and we need to find out what that one rule is, and soon. This is very serious, and it is a bit unhelpful for people who have not been affected to sit smugly thinking "well, I'm all right - what are you guys going on about?"

It seems that folks who have not been affected just don't understand the nature and importance of the problem. If it were as easy as you say to just re-do our pages and have everything be all right, I would be a very happy man. But alas, I don't think it's that simple.

<h> tags are involved in some way: sites without <h> tags are achieving much higher positions in the SERPs, while some optimized sites with <h> tags are being dropped ***for the two-word search terms that are affected***. The same page has a three-word phrase in <h> tags, with a similar keyword density in the various parts of the HTML as the two-word term, yet if I search for that phrase the page is still #1.

I would be grateful if you and others could address that specific question - or let's all agree that Google is not removing pages with <h> tags, but is removing those whose <h> tags include particular phrases when other factors come into play on the page.

Best wishes

Sid

HarryM




msg:202494
 12:10 pm on Nov 25, 2003 (gmt 0)

I can't imagine why this thread has gone on so long. The question has been answered many times: Google does not remove pages with H1 tags.

Why on earth would Google want to do that? Google needs to know what a page is about (in terms of keywords) just as much as a user does. Using keywords in the TITLE and H1 tags, reinforced by the same keywords appearing in the text, especially in bold, makes this clear to both search engines and users.

If Google is an intelligent organisation - and I believe it is - it will have considered backwards compatibility. The use of the H1 tag is enshrined in the standards and there are a zillion websites out there that use it. Many pages are built exclusively from Hx and P tags - not a SPAN in sight. It would be a disaster for Google if these traditional pages fell out of the SERPs.

Nor is Google likely to be concerned about the use of CSS with Hx tags. It is a perfectly valid use of CSS. Why would Google want to annoy designers who care about the appearance of their sites?

What I do believe is that Google is concerned about over-optimization. I am sure I am not alone in getting annoyed when I search for information on, say, a well-known place name, only to find the information sites buried under a mass of over-optimised sites trying to sell me hotel rooms and cheap flights. I believe Google is trying to restore the balance and, when triggered by certain "commercial" keywords, is doing this by penalizing sites that are over-optimized.

The misuse of H1 tags - hiding them or stuffing them with an unnatural number of keywords - would fall into that category, but not their use per se.
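
To illustrate the distinction, here is a made-up example (the class name and style values are hypothetical): restyling an H1 with CSS is normal design, while hiding it is the kind of misuse that could plausibly be penalized:

<style>
/* legitimate: the heading is restyled to fit the site's design */
h1 { font-size: 1.4em; color: #336699; }

/* misuse: the heading is visible to spiders but hidden from users */
h1.hidden { display: none; }
</style>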

a_chameleon




msg:202495
 12:25 pm on Nov 25, 2003 (gmt 0)

When I see questions coming up in threads like "why is Google botting images?", it's obvious that some people need to do more homework before they start posting conspiracy theories - especially those who have been in this for a short time and are calling themselves SEOs.

I dunno about 'homework'. I SEO a site that has been in the top 10 SERPs for its four most relevant search terms since '98, and it continues to survive update after update.

The very nature of the site requires that its two (two-word) commercial terms occur numerous times on almost all its product pages, as well as in the FAQs and such. I checked this AM: it's at #1 for two of the terms, #2 for the third and #4 for the fourth.

In fact, on most pages both two-word commercial terms occur 15-18 times, from top to bottom.

It doesn't have any H1 tags anywhere, so I can't comment on whether it's affected there, but looking at the keyword occurrence I use - e.g. title attributes on all links, alt tags, on and on - it's literally overwhelmed with keyword occurrences.

As to botting images: a couple of years ago I FTP'd a few trial copies of a new index page into its pics directory by accident. Lo and behold, the log files told me Google had crawled them, and it continued with GET calls for those oddball pages for months. Curious, since they weren't linked from anywhere, I threw in a few more, and there was Google again. Hmmm..

It's also worth noting I've seen Google crawl many other trial pages uploaded to the root directory and left there out of laziness; again, there was no path to these pages whatsoever, so I kept that up as well. I also began putting what I call "spider pages" - pages not linked from anywhere yet keyword-laden, almost ridiculously so - in the various directories, and one of these spider pages is at #1 and another is at #3, crawled and cached last night.

Not really "conspiracy theory" material, but food for thought..

;-)

jady




msg:202496
 12:41 pm on Nov 25, 2003 (gmt 0)

I use H1 tags VERY sparingly and have never over-used them, as that just LOOKS spammy. (If it looks spammy, it probably is!) I think more filters were applied for keyword stuffing and over-use of the H1, so I have returned the text on one of our sites to a "pre-optimized" format, written for readers only, and we will see what happens!

Just Guessing




msg:202497
 11:54 am on Nov 26, 2003 (gmt 0)

It's definitely NOT just keywords in H1 tags.

I have pages with keywords in H1 tags that have been zapped, pages with keywords in H1 tags that have not been zapped, pages without any H1 tags that have been zapped, and pages without any H1 tags that have not been zapped. There is no measurable correlation with H1 tags for my pages.

I also have pages with very low keyword density that have been zapped.

I had a theory that it was too many internal links with keywords in anchor text, but I have just disproved that too.

I have one page that has been zapped for a keyphrase that appears once in the title, once in the H1 tag and twice in 1,200 words of text. The keywords are not used in internal anchor text, but they are in anchor text in links from other sites.

The same page has survived for another two-word keyphrase that is used more heavily in the page: once in the title, once in the H1 tag, once in an H2 tag and three times in the 1,200 words of text. Again, the keywords are not used in internal anchor text, but they are in anchor text in links from other sites.

The page is certainly not over-optimised by anybody's definition, but it has been zapped for the keyphrase it is less well optimised for.

I don't hold much with conspiracy theories, but it certainly looks as though Google is somehow selecting certain keywords for different treatment. In this case, there seems to be only one advertiser bidding the minimum for the keywords where the page has survived, while bidding is over $10 for the top AdWords position for the keywords for which the page has been zapped. That may be a coincidence. It also looks like some broad matching may be coming into play with these particular keywords - I am actually seeing plurals and variations of one of the keywords in bold in the SERPs, but only for some entries and not others.

What I can't figure out is what is different about some of the other pages that have survived - some would not benefit from the broad matching and seem just about as highly optimised as my zapped page (i.e. as many as four mentions of the keywords!).
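
To summarise the counts for the two keyphrases on that one page:

                        Zapped phrase   Surviving phrase
Title                   1               1
H1                      1               1
H2                      0               1
Body (1,200 words)      2               3
Internal anchor text    no              no
Inbound anchor text     yes             yes
Top AdWords bid         over $10        minimum (1 advertiser)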

Namaste




msg:202498
 5:14 pm on Nov 26, 2003 (gmt 0)

No, there is no rule in Google that pages with H1 tags are removed.

But yes, as many have observed, over-SEOed sites are affected... so if you are using H1 tags and have been hit, you should look at other factors.

Herenvardo




msg:202499
 10:02 am on Nov 27, 2003 (gmt 0)

Are you saying that you DO endorse abusing the noframes and alt tags? Can you clarify?

Of course; here is the clarification:
My main page's only content is a frameset tag with its frame tags. It shows a menu at the left and a title across the top that stays visible to the visitor, even while the internal pages load. The problem is that this file contains no links and, since Google does not read frame tags, it cannot spider my site. The solution is easy:
after the frames, a <noframes> element is added in the index.html file; it will not be shown by any browser able to handle frames. Inside this element goes alternative content for users who do not have frames enabled (like Googlebot). Something like this:


<frameset rows="20%,*">
<frame name="title" src="title.html">
<frameset cols="25%,*">
<frame name="menu" src="contents.html">
<frame name="main" src="main.html">
</frameset>
<noframes>
Your browser does not support frames or you have them
disabled. If you wish, you can explore the site without
frames by <a href="contents.html" target="_blank">
opening the menu in a new window</a>.
</noframes>
</frameset>

By doing that, I have an index.html file that was submitted to the ODP and is indexed in Google with only one link, pointing to my menu file. Since the menu has links to all the pages on the site, Google visits the menu and then spiders the whole site with no trouble. I also put a link on each page back to the homepage, with two objectives: PR accumulates in the main file, which serves all the other pages through the frames, and if somebody enters via an internal page, they can go to the home page and have the menu available.
For all these reasons, I often use frames and I always use the noframes tag. A frame page without a noframes tag is a dead end on the web.
Something similar happens with my images. I have, for example, my logo at the top of the page. It is 3D text with shadows, reflections, metallic surfaces, etc., but Google cannot read its text - nor can anybody who has images disabled. So I put alt="MyPageTitle" in the <img> tag and everybody gets to read my title. I do that with all the images I put on my site. As another example, I have a download section where every file has an icon representing the file type. There I use something like:
<img src="zip.gif" alt=".ZIP"> and even Google knows that there are some .zip files on my site! :P

In conclusion: the first HTML versions had only a few tags - <a>, <Hx> and a few more. All my pages are based on them and, wherever another tag is used, alternative content based only on these tags is provided.
Back on topic: I think it is impossible that Google removes a page for having an H1 tag. In fact, I'm sure that Google likes pages that make correct use of H tags (only one H1 per file, no more than one heading per section, etc.).

Greetings,
Herenvardö, the H1 defender ;):P

[edited by: Herenvardo at 10:23 am (utc) on Nov. 27, 2003]

Dave_Hawley




msg:202500
 10:17 am on Nov 27, 2003 (gmt 0)

Google does not read frame tags

Is this still the case? I think Google and 95% of browsers can read frames.

Dave

Herenvardo




msg:202501
 10:29 am on Nov 27, 2003 (gmt 0)

I think Google and 95% of browsers can read frames.

Of course, I hope that Google is able to read frames by now. But I'm not sure it treats them the same way as normal <a> links when passing PageRank. Even so, if 5% of browsers cannot read them, I prefer to make my site navigable for those users too.
Remember that the WWW was created to share information, and even as the technologies improve, I try to share the information my site offers with everybody who wants to read it.
If I can make the site navigable for everybody, why exclude that 5%?

Greetings,
Herenvardö
