Google News Archive Forum

This 76 message thread spans 3 pages; this is page 2.
Does Google Remove Pages with H1 Tags?
bekyed




msg:202425
 6:32 pm on Nov 23, 2003 (gmt 0)

I think the algorithm is excluding pages with heading 1 tags, as none of the sites in my industry's top 10 have them.
What are your views on this? And yes, all our competitors had h1 tags before plummeting.

What do you think?

Bek.

 

Small Website Guy




msg:202455
 4:31 pm on Nov 24, 2003 (gmt 0)

My latest theory is that Google is computing an "optimization total" for each page with respect to the search phrase being used, and if the page's optimization is too high it is excluded from the SERPs.

The H1 tag contributes to the page's "optimization total". So pages without the H1 are less likely to breach the optimization threshold.

Just my stab in the dark.

allanp73




msg:202456
 5:27 pm on Nov 24, 2003 (gmt 0)

Small_Website_Guy,

This is the same theory that I subscribe to. However, the current threshold is really low. Possibly there is a balancing factor like PR or good anchor-text links from external sites.

caveman




msg:202457
 5:48 pm on Nov 24, 2003 (gmt 0)

Hypothesis 1: when a page is spidered, if a two-word phrase occurs in both the title and <h> tags, it is assessed for phrase stuffing, and if above a threshold the page is dropped.

Hypothesis 2: Google is using a lookup dictionary of two-word commercial phrases and is dropping sites over-optimised for those phrases.

I can say with certainty that Hypothesis 1 above is *not* in effect. I know a site that is highly optimized for two pairs of filtered/money keywords on the homepage, uses H1 tags for both, and has both sets of keyword pairs in both the title and the H1 tags. It is still on page one of the SERPs for *both* keyword pairs. It is not obviously spammy, but it does use sneaky tricks that G seems unable to catch.

Hypothesis 2 is, IMHO, very likely. However, since I can find an exception to virtually every one-factor potential penalty, it would appear that a combination of factors is needed to trip the trap.

The thing I'm having trouble with is that I see at least two categories where only one site, or none, remains from the previous top 20, and many clean sites were wiped away. So whatever the penalty factors are, the bar for triggering them is set pretty low, methinks.

Hissingsid




msg:202458
 6:47 pm on Nov 24, 2003 (gmt 0)

Caveman recently said,

I can say with certainty that Hypothesis 1 above is *not* in effect. I know a site that is highly optimized for two pairs of filtered/money keywords on the homepage, uses H1 tags for both, and has both sets of keyword pairs in both the title and the H1 tags. It is still on page one of the SERPs for *both* keyword pairs. It is not obviously spammy, but it does use sneaky tricks that G seems unable to catch.

Hi Caveman,

Just to test this a bit further, to prove the null hypothesis: on the page that you mention, are the keyword pairs definitely "commercial"?

Also, you say "not obviously spammy", but just how loaded is the rest of the page: first words, body text, image alts, URLs, anchor text, etc.?

I know this may be a tad pedantic but, like you, I would like to discount some of these theories so that I can decide what to do next.

Best wishes

Sid

lgn1




msg:202459
 7:17 pm on Nov 24, 2003 (gmt 0)

Well, what may have buried me was that the H1 tag was at the very bottom of the page. We generally put a slogan at the bottom of the page (which happens to include our keywords), and somehow it was marked up as H1 because it matched the colour I was looking for, as my H1 style was defined in the CSS at a reasonable size.

I have since styled the slogan properly and gotten rid of the H1. Being lazy may have cost me a month out of the Google SERPs, because this was not an attempt at SEO.

Anyways I sent a spam report to google with the keyword "floridaquality", so Googleguy will see it.

Maybe google will ease up on its filter, because of it, maybe not.

caveman




msg:202460
 7:20 pm on Nov 24, 2003 (gmt 0)

Both word pairs are definitely being addressed by the filter/algo, whatever it is (using -asdfhkjlkshd -lskshdjd included in the search as the test).

No gross repetition of keywords but high kw density; no hyphenated filenames; logical naming of alt tags and text links, etc.

The spam stuff is sophisticated.

caveman




msg:202461
 7:24 pm on Nov 24, 2003 (gmt 0)

P.S. I make no claims about misuse of H1's; only that the presence of H1's is not enough to trigger any filter, nor should it be...

And building normal use of H1 tags into an algo filter related to penalties would mean penalizing legitimate use of H1's as well, which is why I don't believe it for one second.

allanp73




msg:202462
 8:32 pm on Nov 24, 2003 (gmt 0)

I think I figured out the new rules:
phrase can appear once in title
and no more than two times in the body text.
If the phrase is broken up by other words then it is okay. The phrase is only looked at when it is a major phrase.

kevinpate




msg:202463
 9:27 pm on Nov 24, 2003 (gmt 0)

> phrase can appear once in title
> and no more than two times in the body text.

I don't profess to have a more precise answer, but unless only certain words and phrases are the triggers, the above doesn't seem right to me. I'm aware of numerous unaffected pages that don't comport with the latter part of the above suggestion and each has at least part of the page title in h1 tags.

caveman




msg:202464
 9:41 pm on Nov 24, 2003 (gmt 0)

The site I alluded to in msg #33 above (page one of the new SERPs for two competitive kw pairs) has one keyword pair *twice* in the title.

The same keyword pair also has a kw density of 12%, appearing 5+ times...

Wish it were that easy ;-)
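A quick aside on that 12% figure: "keyword density" was never a Google-defined metric, and people computed it in different ways. Here is a minimal sketch of one common definition (the function name and sample text are illustrative only, not anything from this thread's sites):

```python
import re

def keyword_density(text, phrase):
    # Density = (words in phrase * occurrences of phrase) / total words.
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return (hits * n) / len(words)

body = "blue widgets for sale - cheap blue widgets - buy blue widgets today"
print(round(keyword_density(body, "blue widgets"), 2))  # 0.55, i.e. 55%
```

Different tools counted phrase occurrences against word totals differently, which is one reason density figures quoted in threads like this are hard to compare.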

KevinC




msg:202465
 9:46 pm on Nov 24, 2003 (gmt 0)

I wonder if they are just banning sites that use h1 tags using CSS. I could see that. That is obvious scamming of the system. There is no reason to call a css tag h1 unless you are trying to scam google.

Could be possible. I can see that my external CSS file is indexed by Google. It reads something like this:

body { font-family: arial; } .headerText { font-family: arial; font-size: 1.3em;
font-weight: bold; font-style: normal; color: #336699; } .secondaryHeaderText ...
www.mydomain.com/css-win-netscape.css.html - 2k - Cached - Similar pages

I'm not sure whether this supports the theory or not, but it is definite proof that Google is spidering external CSS files.
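Whether Google actually did anything with the stylesheets it fetched is pure speculation, but the kind of check being imagined here is easy to sketch. Everything below (the function name, the size thresholds) is hypothetical, not a known Google rule:

```python
import re

def h1_looks_disguised(css_text):
    # Hypothetical heuristic: flag a stylesheet whose h1 rule shrinks the
    # heading down to body-text size. Thresholds are made up for the demo.
    m = re.search(r"(?:^|[}\s,])h1\s*\{([^}]*)\}", css_text, re.I)
    if not m:
        return False
    size = re.search(r"font-size\s*:\s*([\d.]+)(px|em|%)", m.group(1), re.I)
    if not size:
        return False
    value, unit = float(size.group(1)), size.group(2).lower()
    return (unit == "px" and value <= 13) or \
           (unit == "em" and value <= 1.0) or \
           (unit == "%" and value <= 100)

print(h1_looks_disguised("h1 { font-size: 12px; font-weight: normal; }"))  # True
print(h1_looks_disguised("h1 { font-size: 1.8em; }"))                      # False
```

Note how crude this is: it would flag perfectly legitimate designs that simply prefer smaller headings, which is exactly the objection raised later in the thread.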

lgn1




msg:202466
 10:05 pm on Nov 24, 2003 (gmt 0)

Interesting about the external style sheet. I have plenty of <Hx> styles defined in my external style sheet, and I have been buried in the results.

However, the top competitor in our category does not use external style sheets but has the following in their code: <h1><font size="2">keyword1 keyword2</font></h1>

KevinC




msg:202467
 10:25 pm on Nov 24, 2003 (gmt 0)

I'm not so sure it's the CSS that is causing the problems, but if Google is going to look at external CSS files then surely they would look at something as simple as a font tag within an H1. Maybe that's a little proof against the CSS theory.

TheDave




msg:202468
 10:32 pm on Nov 24, 2003 (gmt 0)

No single factor, such as the use of H1, has caused everyone's pages to drop, imo. There is no logic to it, just a fine line between natural chaos and unnatural order that you don't want to cross.

willardnesss




msg:202469
 10:46 pm on Nov 24, 2003 (gmt 0)

Hey Caveman...I'm jumping in here a bit late in the discussion (hope I'm not repeating old news), but a few tests/situations seem to point to backlinks being a factor in the penalizing.

Real Example:

Pre-Florida: bluewidgets.com was #1

After Florida: widgetworld.com/bluewidgets.html #1 and bluewidgets.com has vanished!

The interesting thing is these pages both have the EXACT same HTML. The widgetworld.com/bluewidgets.html page is an ad that placed a copy of the index page of bluewidgets.com on their website (the only difference in the HTML is that the links on the copied index page are absolute [bluewidgets.com...], pointing towards bluewidgets.com).

The only real difference I see is that the ad-copy index page had only 2-3 backlinks, and the bluewidgets.com index page had about 100 backlinks (some of which contained exact keywords in the anchor text). It seems backlink anchor text is being considered as a potential penalty.

Both pages also had the same PR.

Hope that made sense.

Roolio




msg:202470
 10:47 pm on Nov 24, 2003 (gmt 0)

It would be a bit strange if Google were to start penalizing/banning websites which use CSS to modify the size of an Hx tag. Why? Because Google is using the same technique. Just look, for example, at the FAQ page of the Google Toolbar subdomain. There you see the H1 header "Google Toolbar FAQ" is a bit too small for a full-size H1 header. And especially check out the google.css that is used with this page.

So will Google ban you for this? I don't think so...

And using H1 tags doesn't seem to be a problem either...

KevinC




msg:202471
 11:28 pm on Nov 24, 2003 (gmt 0)

I agree, but at least for my keywords there does seem to be a lack of H1 tags in the top ranking pages, whereas almost all top-10 pages used H1 tags before.

I wish it were this simple, but it seems to be more of a combination of things. The idea that anchor text could be penalizing a page is tough to swallow; it would be too easy to sink your competitor if all you had to do was point a tonne of links his way.

DerekH




msg:202472
 11:31 pm on Nov 24, 2003 (gmt 0)

bekyed started this thread with
I think the algorithm is excluding pages with heading 1 tags, as none of the sites in my industry's top 10 have them.
What are your views on this? And yes, all our competitors had h1 tags before plummeting.

Every page on three of my sites has an <H1> and each has done better since Florida. And when I originally put the <H1> (and some CSS to temper them) in, they rocketed up the SERPS.

Yes, I know generalisation is no use to man nor beast, but what I bring to this party is the information that my sites suggest that <H1> tags, one per page, are mighty fine. And indeed, the original intention of the headings (H1, H2, H3...) was, of course, to divide one's site content into logical sections, each properly labelled.

And that's not a daft thing to aim for....

DerekH

steveb




msg:202473
 11:34 pm on Nov 24, 2003 (gmt 0)

"Does Google Remove Pages with H1 Tags?"

No.

willardnesss




msg:202475
 12:16 am on Nov 25, 2003 (gmt 0)

I agree that the backlink anchor-text penalty possibility is very scary... but I have seen some evidence that this could be happening.

If Google were attempting to filter out link farms and guestbook/message-board backlink spam... they may very well be looking for fishy backlink anchor text.

If this is the case... I'm toast. I have over 200 extremely relevant backlinks (it is relevant for other sites to link to mine) using keywords as the anchor text pointing to my site... my site is about that keyword, and it is hard to think of any other text to use for a link to my site... my domain name does not make sense as anchor text in my case.

a_chameleon




msg:202476
 4:05 am on Nov 25, 2003 (gmt 0)

It would be a bit strange if Google were to start penalizing/banning websites which use CSS to modify the size of an Hx tag.

If Google's doing this, and I suspect it may be: I used a CSS declaration "modifying" the H1 to fit the text layout of one of my sites, and recently saw the page slip immediately.

The 'obviously spammy' theory I see in this thread makes a lot of sense, as does the post re: Google parsing external CSS files to 'check' if there's something designating the H1 to be something else.
Let's face it: we're discussing the most innovative search engine known, and one that put Inktomi in the parking lot with little bother. Google 'visits' as a browser, not a crawler, and can both afford and does use the best talent around.

Why would it be surprising to anyone if it began "double checking" to be sure that nothing's amiss, erring on the side of its image as "cautious and suspicious" to maintain its integrity?

:-}

a_chameleon




msg:202477
 4:13 am on Nov 25, 2003 (gmt 0)

A quick aside to my post..

Google regularly visits several of my site's "image" directories. Why? What's there to see, if there's no text to be found? And why is Google visiting my root directory, when there's no path sending it there?

Methinks there's some sniffing going on, to wit:
"Let's just check, just in case..."

Make sense?

steveb




msg:202478
 4:16 am on Nov 25, 2003 (gmt 0)

"Make sense?"

No.

plumsauce




msg:202479
 4:29 am on Nov 25, 2003 (gmt 0)



and can both afford and does use the best talent around.

no dispute with the first part,
the jury is out on the second part

Small Website Guy




msg:202480
 4:42 am on Nov 25, 2003 (gmt 0)

The primary reason I use H tags is because they said Google likes them.

I don't see a logical reason why <h1> is given more weight than <p class="whatever">, because the user sees the same thing either way.

Captain




msg:202481
 4:54 am on Nov 25, 2003 (gmt 0)

I don't think Google will ever penalize keywords in anchor text based on the number of inbound links if the links go one way... if the link is reciprocated with keyword anchor text, that is another story.

Reciprocal links don't mean as much anymore.

nippi




msg:202482
 5:17 am on Nov 25, 2003 (gmt 0)

I'm still voting for it all being an over-optimisation filter, not specifically related to H1 or anything else.

I reckon Google has placed a value on keywords used in different ways on the page, and if total usage is deemed too high for those keywords then whammo, penalty.

Like this (points shown are for demo only; I don't have a theory on the values):

10 points for <title>
20 points for <h1>
10 points for <h2>
50 points for multiple <h1> tags
100 points if <h1> and <h2> tags are identical and multiple
5 points for anchor text
5 points for every 5% of keyword density, etc.
5 points for alt text

If you go over 100 points then whammo, a penalty for those keywords on that page.

That would explain why some sites are OK with <h1> and some are not.
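nippi's point scheme can be written out directly. This is purely an illustration of the conjecture above; the values come from the post, not from Google, and all names are made up:

```python
# Point values copied from the post above; the whole scheme is conjecture,
# not a known Google algorithm. `page` describes one keyword phrase's usage.
def optimisation_score(page):
    score = 0
    score += 10 if page.get("in_title") else 0
    score += 20 if page.get("h1_count", 0) >= 1 else 0
    score += 10 if page.get("in_h2") else 0
    score += 50 if page.get("h1_count", 0) > 1 else 0
    score += 100 if page.get("h1_h2_identical_and_multiple") else 0
    score += 5 if page.get("in_anchor_text") else 0
    score += 5 * int(page.get("keyword_density_pct", 0) // 5)
    score += 5 if page.get("in_alt_text") else 0
    return score

page = {"in_title": True, "h1_count": 2, "in_h2": True,
        "keyword_density_pct": 12, "in_alt_text": True}
score = optimisation_score(page)
print(score, "penalised" if score > 100 else "ok")  # 105 penalised
```

A threshold model like this would indeed explain why some H1-using sites survive and others vanish: the H1 contributes points but never trips the filter on its own.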

caveman




msg:202483
 5:44 am on Nov 25, 2003 (gmt 0)

If G ever penalized H1 tags that were used properly, even as part of a multiple factor filter, they would be killing themselves with the webmaster community. No credibility left at all.

No way.

People using multiple H1 tags and stuff like that, that's different.

But sanctioned use of those tags being penalized? That would be a blunder too colossal to contemplate. Use H1 appropriately or not at all; just don't abuse it and you'll be fine on that front.

Now, about those backlinks...

oodlum




msg:202484
 6:59 am on Nov 25, 2003 (gmt 0)

I wonder if they are just banning sites that use h1 tags using CSS. I could see that. That is obvious scamming of the system. There is no reason to call a css tag h1 unless you are trying to scam google.

I appreciate your thinking aloud (it all helps) but can't believe a senior member said that.

Cascading Style Sheets were created to separate form from function.

You use h1 for headings. This is valid W3C markup and is good for accessibility (think of a blind person using a screen reader, which would say aloud "Heading: My Widgets Page"). That is function.

That does not mean you have to accept the ugly huge default text. So, as George Abitbol said, you use css to make headings consistent with the overall style of your site. That is form.

Absolutely legitimate, but obviously open to abuse if a SE gives extra points for words inside the h1 tag.

I don't see a logical reason why <h1> is given more weight than <p class="whatever">, because the user sees the same thing either way.

No - the user sees either a paragraph <p> or a heading <h1, h2...>.

Google initially figured "hey, if it's in the heading, it must be what the page is about". They have probably stopped that now, though.

Sorry for the sermon. You must have triggered one of my filters.

Hissingsid




msg:202485
 9:46 am on Nov 25, 2003 (gmt 0)

I wonder if they are just banning sites that use h1 tags using CSS. I could see that. That is obvious scamming of the system. There is no reason to call a css tag h1 unless you are trying to scam google.

I appreciate your thinking aloud (it all helps) but can't believe a senior member said that.

Cascading Style Sheets were created to separate form from function.

Hi,

I have to say that I agree with both trains of thought here.

1. Everyone knows that Google gave more weight to <h> tags than was probably due, and that this was an easy way to improve your position in the SERPs. It is easy to alter styles so that <h> looks like <p>, and lots of spammy sites probably do this, so if Google wanted to cut this out it would be a soft target as **part** of a filter to cut out spamming sites.

2. Google definitely should not do this, as it goes against everything that we are being encouraged to do in terms of producing well-formed, valid pages which separate design from content.

Having said that, I think that I can rule out <h> tags amended by CSS as a trigger. I have an old index page which has been at #1 for a particular two-word search term for a couple of years, with the odd blip during Google updates. It is definitely filtered because, if I search for [-fufufu search term], it pops back in at #2. It was produced by a visual design package which has an HTML 3.2 plus CSS option that puts any styling within the tag. My <h1> and <h2> tags have no styling applied, and there is no CSS in an external sheet or in the head of the page.

When I look at the source for the top 10 in the SERPs for this two-word search term, I find that 8 have no <h> tags at all. I search for <h and only <html and <head contain this code. The other two both have tags formed like this: <h1 align="center"><font color="#FFFFFF">search term</font></h1>. In fact this form of in-tag amendment of the style of h tags caused the last major spam-filter fiasco with Google about two years ago, which is why I have been so careful to use straightforward <hx> tags.
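The manual survey described above (searching each top-10 page's source for <h) can be automated with a short script. This sketch uses only Python's standard library; the sample markup echoes the competitor snippet quoted in this post, and all names are illustrative:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    # Tallies h1-h6 start tags in a page's source.
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

def heading_counts(html):
    parser = HeadingCounter()
    parser.feed(html)
    return parser.counts

src = ('<html><body><h1 align="center"><font color="#FFFFFF">search term'
       '</font></h1><h2>More</h2></body></html>')
print(heading_counts(src))  # {'h1': 1, 'h2': 1}
```

Using a real parser rather than a raw search for "<h" also avoids false hits on <html>, <head> and <hr>, which is exactly the noise described above.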

I'm becoming increasingly convinced that Google has a "hit list" or "dictionary" of two-word commercial search terms which is used as a filter. If your page features these in <title> and <hx> tags then the page is analysed for search-term stuffing. If this is the case then it makes it very difficult to decide how to tackle the problem.

Assuming that there is a "dictionary" of two-word terms, and you have a site devoted to selling that two-word term, and you are a leader in the market for it, and you have optimized your site for it to try and stop jokers with worthless gateways to other folks' affiliate schemes appearing before you in the SERPs: what can you do next?

The really annoying thing is this: the sites that Google is really trying to target with this filter/algo will, no doubt, be the first to work out how to reverse-engineer their pages so that Google puts them on the first page of the SERPs while legitimate sites flounder.

Good luck everyone.

Sid

Herenvardo




msg:202486
 9:48 am on Nov 25, 2003 (gmt 0)

I have a personal website that I SEO in my free time. By now, I have one inbound link (it's relatively new, and very targeted) and PR2. Searching for a main keyphrase, I rank 10th of about 11,400 results. Above me are some sites with PR 6, 7 and even 8. Below me are many sites with PR 4 and 5, very related to the search terms.
These are (as I guess) the keys to my success:

1. Correct use of Hx tags
I have a very short title tag on each page, like myname - links for the links page. At the beginning of each page, I have an H1 tag with a more descriptive title, like Links to related pages (these examples are not exact; they are translations). After this, there are some H2 tags marking each section of the file and, when needed, subsections are marked with H3 or even H4. This lets both spiders and users get a fast overview of each file before reading it.

2. Correct use of alternative tags
There are many people who believe that frames are the SEO's enemy, but it is not true. Google does not penalize pages for using frames: it is simply unable to find the files that will be shown in any frame. So, using a <noframes> tag and a link to a sitemap or index gets a framed site fully spidered by Google. Together with <noframes>, I use and abuse <noscript> and even alt for images!

3. Think of users, not of money
This might seem the most doubtful point, but it is the one that works best. Google is a service for the web's users, not for business owners. Make a page useful for visitors, thinking of your visitors, and it will improve its rankings.

4. Links and PR
Not much to say about this. Simply remember that it is less important than a good page. Even if you get a #1 ranking, this will only make more people visit the page. If you are selling a product or service, you must convince the visitors in order for them to buy something.
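The descending H1/H2/H3 structure described in point 1 can be checked mechanically. A minimal sketch, assuming reasonably well-formed HTML (class and function names are illustrative):

```python
from html.parser import HTMLParser

class HierarchyChecker(HTMLParser):
    # Flags headings that skip a level downward (h1 -> h3 with no h2).
    def __init__(self):
        super().__init__()
        self.last_level = 0
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if level > self.last_level + 1:
                self.problems.append(
                    f"<{tag}> follows h{self.last_level}: skipped a level")
            self.last_level = level

def check_hierarchy(html):
    checker = HierarchyChecker()
    checker.feed(html)
    return checker.problems

good = "<h1>Links</h1><h2>Section</h2><h3>Sub</h3><h2>Next</h2>"
bad = "<h1>Links</h1><h3>Sub</h3>"
print(check_hierarchy(good))  # []
print(check_hierarchy(bad))   # one problem reported
```

This kind of check benefits readers and screen-reader users regardless of what any search engine does with heading tags.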

With this, I hope to have settled the question of the H1 tags (if somebody was still doubtful) and some other questions on the same topic.
It has been a long post about web design, and I'm not sure if it's completely on-topic, so with no more delay, I give my greetings to all of you.

Regards,
Herenvardö

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved