Forum Moderators: open


search engines and broken HTML


Crazy_Fool

9:15 pm on Jun 5, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



anyone know what effect broken HTML will have on search engine rankings? i.e., will misplaced or missing tags cause some parts of the content not to be taken into account? could it cause the search engine to drop a page? are there any other possible problems?

papabaer

12:12 am on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



CF, this is a good question, and one I am sure has many answers. There has been some talk of Google showing preference to clean, validating code. I honestly do not know the answer, though I am very interested in finding out. It may just be that this is yet another reason to seek code validation.

brotherhood of LAN

12:20 am on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



In one instance in a thread last month, a SERP was showing some bad HTML: stuff that had been placed after </html>. The SERP took up half the page.

maybe in some cases really bad HTML can pull through our sheer cunning plans :)

But I would be inclined to agree that valid HTML will be better for robots, since they are effectively stripped-down browsers trying to interpret the page.

pageoneresults

2:50 am on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Based on recent sites I've developed that are now compliant with W3C standards, I'd have to say it has a considerable impact. I've got sites in positions above the authoritative resources, and I know why.

It's not only the clean HTML code; the CSS and absolute positioning are having a dramatic impact on initial results too. Add in all the other areas that are optimized properly and voilà, the winning formula has been achieved, to a much higher degree than before and in less time!

tedster

5:28 am on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Simple errors may have big consequences - especially things like quotes left open in an attribute. I've seen instances where the entire content that followed that open quote was just plain ignored.
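To make that failure mode concrete, here is a toy sketch of a quote-aware tag scanner, the kind of simplification a stripped-down robot might resemble (purely illustrative; no real engine's parser is this crude, and the markup is made up for the example). With an unclosed attribute quote, the scanner keeps hunting for the closing quote and swallows the rest of the document:

```python
def extract_text(html: str) -> str:
    """Toy text extractor. Inside a tag it honours quoted attribute
    values, so an unclosed quote consumes everything up to the next
    matching quote -- or, if there isn't one, the end of the input."""
    out = []
    i, n = 0, len(html)
    while i < n:
        if html[i] == '<':
            # Skip over the tag, respecting quoted attribute values.
            i += 1
            quote = None
            while i < n:
                ch = html[i]
                if quote:
                    if ch == quote:
                        quote = None  # attribute value closed
                elif ch in ('"', "'"):
                    quote = ch        # attribute value opened
                elif ch == '>':
                    break             # end of tag
                i += 1
            i += 1  # step past '>'
        else:
            out.append(html[i])
            i += 1
    return ''.join(out)

broken = '<p class="intro>Hello</p> <p>World</p>'
fixed  = '<p class="intro">Hello</p> <p>World</p>'
print(repr(extract_text(broken)))  # the open quote eats everything: ''
print(repr(extract_text(fixed)))   # 'Hello World'
```

With the quote left open, every bit of text after it is invisible to this scanner, which matches the symptom described above: the entire content following the open quote is just plain ignored.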

pageoneresults

5:35 am on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've always been fond of Brett's Search Engine Spider Simulator. That has to be one of the best tools a designer could have at their disposal. Just enter the URL and let it rip.

If the site validates against W3C standards, you can pretty much be assured that the spider will traverse the content more efficiently than it would if there were errors. Some errors are bypassed because they are so common; others, like the one tedster brings up, cannot be bypassed and lead to severe consequences and possibly months of headaches trying to figure out what the heck is going on.

papabaer

7:05 am on Jun 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



And if all that is not reason to validate your code... then I don't know what is! ;)

Win/Win !!!