
Valid Coding/Sound Design

Does Google still reward this stuff?


willybfriendly

6:16 pm on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Like many, I suppose, I have been trying to determine some of what goes into the new algo.

In the past, Google has stated that the keys to good rankings were spider-friendly pages with good, valid code, and lots of content. Of course, we knew that it was also important to have decent PR and relevant anchor text in the backlinks.

What I am seeing when I poke around is a lot of high-ranking pages with exceedingly poor markup. For example, one search term returns nine of the top ten with no <Hx> at all, while the tenth (ranked #7) has four <h1> tags and one <h4>.

The W3C recommendations state: "<h1> is the HTML tag for the first-level heading of a document. The title is generally duplicated in an <h1> element towards the top of the page. Unlike the title, this element can include links, emphasis and other HTML phrase elements. Often, webmasters will use an <h2> element instead, to make the heading smaller. This is incorrect, Cascading Style Sheets should be used to create this effect..."
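To make that concrete (my own illustrative markup, not taken from the W3C page, and the site name is made up), instead of dropping to <h2> just to get a smaller heading, you would keep the <h1> and shrink it with CSS:

    <head>
      <title>Acme Yellow Widgets</title>
      <style type="text/css">
        /* keep the first-level heading; let CSS control its size */
        h1 { font-size: 120%; }
      </style>
    </head>
    <body>
      <h1>Acme Yellow Widgets</h1>
      ...
    </body>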

So, a question to other seekers... are you all seeing similar things with regard to valid and correct markup? How many top-ranking pages on one- and two-word terms have the title duplicated in an <h1>? Or an <Hx> tag at all, for that matter?

And what about content? Three of those top ten results I refer to have zilch, nada, zero meaningful content. Two are splash pages, and one is a "We're out of business" page. These three are there only because of backlinks.

On another, unrelated search, 14 out of the top twenty returns were directories or links pages. Again, no content.

Is this the kind of thing that others are seeing in their niches?

(Mods, this is not meant to start another round of Google bashing. Serious questions meant as part of a serious examination of what now ranks.)

WBF

Nicola

6:38 pm on Nov 25, 2003 (gmt 0)

10+ Year Member



As Google stands now, I would not waste time analysing it.

vbjaeger

6:53 pm on Nov 25, 2003 (gmt 0)

10+ Year Member



Yes. For one of our main keyword phrases, Google is returning the GNU Project as the #1 site. The result has absolutely nothing to do with what is being searched for. Also, in the same search there are only three surviving websites that offer a product related to the search term. None have H1 tags. The rest are directory sites.

caveman

7:44 pm on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I am seeing when I poke around is a lot of high-ranking pages with exceedingly poor markup. For example, one search term returns nine of the top ten with no <Hx> at all, while the tenth (ranked #7) has four <h1> tags and one <h4>.

Nothing new here, really... following W3C has for a long time not been the way to achieve top rankings in competitive SERPs, unfortunately. Finding the line *not to cross* has been the way, at least in the competitive categories.

Looks to me like, with Florida, G knocked out a bunch of 'optimizers', a bunch of 'spammers', and a bunch of true innocents. The hard-core and sophisticated spammers, however, mainly remain, with or without good code. Along with a bunch of very large and/or very conservatively SEO'd sites.

Again, good intent, poor execution, lousy results.

willybfriendly

8:21 pm on Nov 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nothing new here, really... following W3C has for a long time not been the way to achieve top rankings in competitive SERPs, unfortunately. Finding the line *not to cross* has been the way, at least in the competitive categories.

It may be that W3C recommendations have not been the way to the top for competitive terms, but that is different from a penalty for following the recommendations.

In its webmaster guidelines, Google states, "Make a site with a clear hierarchy and text links."

This is consistent with W3C recommendations, but the current SERPs that I am looking at suggest that this basic rule has changed.
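For reference, here is the sort of structure I take "clear hierarchy and text links" to mean (my own sketch, not Google's example):

    <h1>Widgets</h1>
    <h2>Buying Widgets</h2>
    <p>...</p>
    <h2>Widget Care</h2>
    <p>...</p>
    <!-- plain text links for navigation, not image maps or JavaScript menus -->
    <a href="/buying.html">Buying Widgets</a> |
    <a href="/care.html">Widget Care</a>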

Are others seeing the same?

WBF

markis00

3:44 am on Nov 26, 2003 (gmt 0)

10+ Year Member



Yes, Google still rewards this, but it seems they have introduced a new spam filter in Update Florida that is kicking many clean-HTML home pages out of its index.

I am finding that my sub-pages are still appearing for keywords in the SERPs, but my homepage isn't. Also, it seems people with too much optimization (anchor text, an H1 tag with the keyword, a filename using the keyword, high keyword density) are now being penalized by this new spam filter.

I conducted an experiment a few days ago. I cut my keyword density right down (my keyword is now found only once or twice in my content), I changed all my h1 tags to h3 tags, and I also made sure that my filename had the keyword in it but used no subdirectory keyword hierarchy (/keyword/keyword.asp).

It seems people using clean optimization methods have been kicked out of the index. It may seem weird, but de-optimizing is now actually working for some (including me) ;)
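Roughly, the change looked like this (simplified, with a made-up keyword):

    Before:
      /widget/widget.asp
      <h1>Widget</h1>
      ...the word widget repeated in nearly every sentence...

    After:
      /widget.asp
      <h3>Widget</h3>
      ...the word appears only once or twice in the copy...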

sit2510

6:09 am on Nov 26, 2003 (gmt 0)

10+ Year Member



You may have clean, valid coding and sound design, but those are not the most important factors for ranking. I have often seen pages written in FrontPage, with messy HTML code, outrank the neatly coded ones.

aspdesigner

9:08 am on Nov 26, 2003 (gmt 0)

10+ Year Member



It looks like a misguided attempt to target white-hat SEOs that's killing a lot of innocent sites in the process.

Note that I'm not talking about spamming. I'm talking about things like good, relevant content and titles that are "too good" a match for what someone was searching for.

In other words, things you or I would do to make a relevant, quality site.

Unfortunately, this is also killing many non-optimized sites, if they are "too good" a match for the search.

So many of the most relevant sites (optimized or not) are getting dumped in favor of sites that are "not quite" what the user was looking for.

Take a look at your server logs, and try doing some searches. You may find your site is now coming up for searches that you are not a very good match for, but has dropped for the searches that are most relevant to your site!

Dave_Hawley

9:40 am on Nov 26, 2003 (gmt 0)



I don't think Google has ever cared about "valid code". So long as the page displays in a browser, Google is happy and so is the visitor.

As to what Google uses in its algo to rank sites... who knows. Many here will put their guesses forward as fact, but they do not know. The recent shake-up in Google's SERPs kind of proves they were all guessing (wrongly), as most are now nowhere to be found :o(

Dave

Nicola

10:11 am on Nov 26, 2003 (gmt 0)

10+ Year Member



My site is dead in the water.

It's relevant, it's got good unique content, it's been around since 1999, the coding is valid and it's free to surfers.

It contains no tricks, smoke or mirrors, and has been in the SERPs happily for 3+ years. I have one or two banners on it, which pay the bandwidth bill and buy my shopping every week; then someone at Google throws in a filter and BANG.

Goodbye traffic, goodbye income.

This is not fair (I know life is not fair), but someone at Ranch Google should think about the far-reaching consequences of their actions.

Roger_K

10:12 am on Nov 26, 2003 (gmt 0)

10+ Year Member



Is Google going to come back with the old, proper listings for related keyword searches?

kaled

11:21 am on Nov 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hurricane Florida had zero impact on my site. However, there is no doubt that the quality of Google's results has been falling for some time. Many people here seem to believe it is because sites are being penalized for excess optimization. If this is correct, there is basically only one solution: begin a campaign to shift users from Google to another search engine. The clear choice is ATW.

I studied Cybernetics at university. One vital lesson was that non-linear components are bad. If Google are applying non-linear filters, as people here believe, then this marks a monumental step backwards in design. If this is part of the Google philosophy, then results will not get better; they will get worse and worse as they try to fix one botch with another.

GoogleGuy, if you're listening, non-linear filters (if they exist) will be the death of Google.

Kaled.

PS
My specialization at uni was computer simulation - I know what I'm talking about.

Nicola

11:23 am on Nov 26, 2003 (gmt 0)

10+ Year Member



kaled :)

At last someone who knows what they're talking about! :)

Dave_Hawley

12:10 pm on Nov 26, 2003 (gmt 0)



Yet another conspiracy theory! Good grief. Everyone keeps saying the SERPs have got worse. However, I have yet to see one single post (now there's a challenge) that compares post-Florida with pre-Florida results *without* using the poster's own site. All I keep hearing is "my site should be number 1 and it's not anymore".

If anyone can come up with some unbiased before-and-after results, I may just listen.

Dave

kaled

12:41 pm on Nov 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Dave, do you keep snapshots of results that are not relevant to your site? I don't even save snapshots that are relevant to mine!

My site (absolutely squeaky clean) generally does better on other search engines. Most of the difference is due to spam and duplicates. In other words, for people looking for products on my site (software - therefore the products are unique), Google is currently the worst major search engine.

Of course I would like to be #1 for all relevant searches, but I can be objective. On ATW and INK, of the sites above mine, a much higher percentage deserve to be there than on Google right now.

As for the causes - well, I don't have time to study it. However, if, as people here believe, Google are using non-linear filters (i.e. penalties for excess optimisation), then rest assured that results absolutely will not improve. No-one should be in any doubt about that.

I am being absolutely serious when I say webmasters should try to move their users over to another search engine. In the final analysis, that may be the only way to convince Google to move in another direction.

Kaled.

[edited by: kaled at 12:50 pm (utc) on Nov. 26, 2003]

Terrier

12:49 pm on Nov 26, 2003 (gmt 0)

10+ Year Member



Well Dave, I am certainly ready to own up to having pushed the envelope hard.

Whether or not my site should be in the top ten is neither here nor there.

What is certain is that the results are not as good as they were.
In my industry the top site has remained the same, and so it should. But from then on it's, well, poor. However, I have spoken with others this morning and it seems that some of the cream is starting to rise again, so fingers crossed.

But I do have to say I think G has the right idea; they just have not got it right yet.

Most of us created sites that worked with both G and the others. Now G wants something else; well, I am quite sure we will be able to give it to them in time.

lgn1

12:56 pm on Nov 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Dave_Hawley, if you are looking for an unbiased result, here is one.

I never did quit my day job as a systems analyst for a major company - I let others run my own company - so I hear plenty from the other side of the fence (the user community).

I can tell you there is a lot of grumbling about people not being able to find, or taking a lot longer to find, what they are looking for on Google. Once I show them the "keyword keyword -crap" trick to get relevant results, they are happy.

So it is not just us. This is affecting the entire Google community.

Terrier

12:57 pm on Nov 26, 2003 (gmt 0)

10+ Year Member



Oh, and as for coding: from what I see at the moment it does not seem to matter, but it's important that we always try to make the code validate and keep it tidy. I find myself getting a real buzz from seeing smart, well-put-together code.

Just Guessing

1:09 pm on Nov 26, 2003 (gmt 0)

10+ Year Member



It looks as though Broad Matching or Word Stemming has something to do with Florida.

Try the following:

1. Search for widget: same old results

2. Search for widget widget (or any 2 keywords): you will see widgets in bold as well as widget (ignore widget's - I think you always got that). You may also see widgeting in bold. At the moment you only see it in some entries.

3. Search for widget widget -ghghj: back to the same old results with exact matches only.

I think in many cases the results are worse with word stemming.

Google thinks so too: "To provide the most accurate results, Google does not use 'stemming' or support 'wildcard' searches. In other words, Google searches for exactly the words that you enter in the search box. Searching for 'book' or 'book*' will not yield 'books' or 'bookstore'. If in doubt, try both forms: 'airline' and 'airlines,' for instance."

So why are they doing it?

Lots more has changed as well. I too am seeing many more directory pages, domain-for-sale pages with nothing but backlinks, spammy duplicate content on sub-domains, etc. The results for many keyword combinations are fine, but for other keyword combinations they are much worse than before.

caveman

2:14 pm on Nov 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



willybfriendly,

Another post-Florida thread being hijacked...

Ummm, I didn't mean to be flip in my earlier post to your question. I took your opening post to ask: "Doesn't good code matter as far as getting good rankings?" A valid question since Google advocates valid structure/code.

My answer was meant to say, "No, I never believed that good code had much to do with ranking well." Ranking well has to do with having on your page(s) what the SEs happen to be scoring well that month, and it's a moving target, *obviously*.

However, in your follow up post, I took your question to be more: "Are sites getting penalized for using proper code?"

Sites are not getting penalized for using proper code. Sites are getting penalized if they display certain keyword patterns, whether the code is proper or not. In my view, at least, an important distinction.

Whether the keywords on a given site were placed in an effort to advance one's position in the SERPs, or simply because they made sense, G can't know.

The point is: if innocents and/or serious webmasters who are only following accepted standards see their sites obliterated, then clearly G has gone ***way*** too far in their attempt to wipe out spam.

Honest webmasters, and even aggressive webmasters, who simply employ logical structure in their sites, should NOT have to worry that they will be targeted by G.

To put a face on it, it would seem that the following hypothetical site stands the risk of seeing its homepage *wiped out*:

Domain: Yellow-Widgets.com
Title: "Yellow Widgets, by SteveB"
Meta Keywords: "yellow widgets,discount yellow widgets,quality yellow widgets"
H1: "SteveB's Yellow Widget Fest"
KW Density: yellow widgets - 14.5%

That's sort of absurd, IMHO.
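For concreteness, the markup of that homepage might look something like this (a rough sketch; every name and number is from the hypothetical profile above):

    <title>Yellow Widgets, by SteveB</title>
    <meta name="keywords"
          content="yellow widgets,discount yellow widgets,quality yellow widgets">
    <h1>SteveB's Yellow Widget Fest</h1>
    <!-- body copy repeating "yellow widgets" until it is 14.5% of the words -->
    <p>Yellow widgets! Discount yellow widgets! Quality yellow widgets...</p>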

willybfriendly

5:00 pm on Nov 26, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



To put a face on it, it would seem that the following hypothetical site stands the risk of seeing its homepage *wiped out*:

Domain: Yellow-Widgets.com
Title: "Yellow Widgets, by SteveB"
Meta Keywords: "yellow widgets,discount yellow widgets,quality yellow widgets"
H1: "SteveB's Yellow Widget Fest"
KW Density: yellow widgets - 14.5%

That's sort of absurd, IMHO.

Don't forget to add to this a menu that includes links to

How to Use Widgets
The History of Widgets
Satisfied Widget Users
How to Maintain Your Widget
Widgets for Sale
Widget Reviews
Etc.

Obviously, this site doesn't have anything to do with widgets and should be banished from the SERPs :)

More to the point, though: if a person is simply employing "logical structure in their sites," it is absurd that Google would in any way penalize them. Yet that is what seems to be happening, from what I can see.

This smacks of the same type of arrogance that M$oft displays. With 70-90% of the search market, Google's decisions on what to reward or penalize in site design carry a huge amount of weight.

WBF

Nicola

4:36 pm on Nov 27, 2003 (gmt 0)

10+ Year Member



I now think SEO as it was will still be rewarded. Good coding is just the cherry on the cake; I don't think Google looks at much beyond the text.

I am convinced that FILTERS are targeting keywords which have the potential to promote AdWords accounts.

mbauser2

5:20 am on Nov 29, 2003 (gmt 0)

10+ Year Member



In the past, Google has stated that the keys to good rankings were spider-friendly pages with good, valid code, and lots of content. Of course, we knew that it was also important to have decent PR and relevant anchor text in the backlinks.

What I am seeing when I poke around is a lot of high-ranking pages with exceedingly poor markup. For example, one search term returns nine of the top ten with no <Hx> at all, while the tenth (ranked #7) has four <h1> tags and one <h4>.

The HTML DTDs (except ISO HTML) do not set limits on how often an H# element can be used, or what order they need to be used in. That page may very well be DTD-valid. (In fact, if it's an HTML Transitional document, the only required element is TITLE.)

Either way, what we have here is a failure to communicate. See, the truth is, Google never cared whether your code was valid. Trust me, I'm one of the HTML fascists who has been using validated code since 1995. It doesn't make a difference.

Search engines want to see easily parsed code. That's not the same as validated code. Parsable code uses all its end tags. It uses EM and STRONG and DFN. It avoids duct-tape HTML hacks like putting DT inside P to indent paragraphs. It means not using four layers of nested tables to control the exact distance between letters. And other crap like that.
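A made-up before-and-after, just to illustrate the difference:

    Hard to parse:
    <P><DT>Indented with the old DT-inside-P hack
    <TABLE><TR><TD><TABLE><TR><TD>nested tables used purely for spacing

    Easy to parse:
    <p class="indent">Indented via a stylesheet rule.</p>
    <p>This is <strong>important</strong>, this is <em>emphasized</em>,
    and every element gets its end tag.</p>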

Google recommended valid code, because it was easier than saying everything I just said. Simple as that.

Powdork

10:32 am on Nov 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Don't forget to add to this a menu that includes links to
How to Use Widgets
The History of Widgets
Satisfied Widget Users
How to Maintain Your Widget
Widgets for Sale
Widget Reviews
Etc.

Personally, I think imagebot is involved in the conspiracy too. I just removed all my images and replaced them with flower and bunny gifs from Microsoft's digital gallery. I downloaded a randomizer to create my alt text, just to be sure. ;)

Here's what to do: pick a new one-word title (I suggest -asdf). Get rid of all instances of your target keyword phrase on the page. Now buy content related to this week's zeitgeist, put your keywords on your page ONCE, IN AN OUTBOUND LINK, and you'll be riding high.

Just Guessing

11:00 am on Nov 29, 2003 (gmt 0)

10+ Year Member



> I am convinced that FILTERS are targeting keywords which have the potential to promote AdWords accounts.

See the statistical evidence in thread [webmasterworld.com ]

victor

11:57 am on Nov 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Deploying unvalidated code is a business risk.

If there is malformed HTML in the code, you don't know what effect it will have. Some examples:

  • an unclosed quoted string in a tag may cause a spider to discard the rest of the page (Googlebot has been reported as doing this) -- even though browsers may recover and display the page properly; see the sketch after this list
  • for all we know, Googlebot is cherry-picking -- the easy-to-parse pages are handled immediately, while ones with errors are passed to a queue for a slower and more thorough check; that may mean days of delay in being indexed. (I don't know if that is the case, but it's the way I'd design such a system. The whole design is faster and more resilient that way.)
  • badly formed links may mean that spiders do not follow them -- leaving parts of your site untouched.
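To show the kind of thing the first bullet means (a contrived example):

    <a href="products.html>Products</a>
    <p>Because the closing quote is missing, a strict parser may read
    everything from here up to the next quote mark as part of the href
    value - and this text never gets indexed.</p>

One missing character, and the rest of the page is at risk.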

My conclusion is that if I am going to deliberately deploy malformed HTML, it is 100% my responsibility to guarantee to my business partners that the malformed HTML will have no negative effects.

That's harder to do than validation, and validation has other benefits too.

If your web designer is deploying malformed code, you'd be well advised to get a guarantee of its effects from them. Or change designers.