Forum Moderators: open
[edited by: encyclo at 7:51 pm (utc) on Jan. 31, 2006]
[edit reason] no example sites please [/edit]
For me, validated code means it *should* work in all browsers, but we know that's a pipe dream. At least it gives you the best chance of that happening. The bottom line is how your page is rendered by the different browsers. If having an error in your code doesn't affect it, then don't lose sleep over it. If the validator spits out errors because your links use & instead of &amp;, then don't worry about it.
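To make the ampersand case concrete, here is a minimal sketch in Python (the URL is made up for illustration): the validator complains because a bare & in an href should be written as the entity &amp;, and browsers decode it right back.

```python
from html import escape, unescape

# A query-string URL as you'd type it in the address bar (hypothetical URL)
raw_url = "search.php?q=widgets&page=2"

# In valid HTML, the & inside an attribute value should be &amp;
escaped = escape(raw_url)
print(escaped)  # search.php?q=widgets&amp;page=2

# Browsers decode the entity, so the link behaves identically
assert unescape(escaped) == raw_url
```

The fix costs four characters per ampersand and silences the most common validator complaint of all.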
I start with a template for each of my web pages, and in the process of entering content and some HTML tags, I occasionally make a mistake. Often it's obvious and I fix it. Often it slips through and goes unnoticed. The page renders correctly in Explorer and Firefox, so I don't think to validate. The original template validates, but I don't run every page generated from it through the validator. I probably should, but I probably should floss every day too.
When a page doesn't render correctly and it's not obvious, I run it through the validator, smack my hand against my forehead when I see the problem, and fix it.
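One cheap way to catch the slips described above, without hand-validating every generated page, is a small script. This is only a toy sketch (a real validator checks far more than tag balance) built on Python's standard html.parser:

```python
from html.parser import HTMLParser

# Tags that legitimately have no closing tag in HTML
VOID_TAGS = {"br", "img", "hr", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    """Toy checker: reports unclosed and unexpected tags.
    Not a substitute for the W3C validator."""
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # problems found so far

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        # Only pop when the end tag matches the most recently opened tag
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append("unexpected </%s>" % tag)

    def report(self):
        # Anything still open at the end was never closed
        return self.errors + ["unclosed <%s>" % t for t in self.stack]

checker = TagBalanceChecker()
checker.feed("<div><p>hello</div>")   # the <p> was never closed
print(checker.report())
```

Run something like this over every generated page as part of the build, and the "slipped through unnoticed" class of mistake mostly goes away.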
is this just a bygone thing?
No, it's a thing that has never been, so far.
It's a credit to the web itself and browser developers that anything at all shows up on screen, given the general level of html on the web.
The idea of valid code has not been a concern of most developers (or even wysiwyg editors) to date. It's actually a concept that is just now on the cutting edge -- and in fact, most corporate developers I talk to have still never heard of the W3C or a DTD.
how important it is to validate your code and the search engines
Are you trying to open a can of worms or what? :)
Right... you don't want to even bother answering a post like this (no offense bwnbwn, just that this question seems to get asked here every month), but... daaaaah, you can't help it! If it converts one more person, it's worth the effort.
To add to the great points made by the previous posters, to me it's just "the right thing to do". Not only do you pollute the web by not designing with standards in mind (meaning that browser and web-based app developers have to clean up after YOU when they're developing software), but you perpetuate a very malicious mindset in designing systems, which is "just throw it together so it works... don't worry about how it's built". A web site is indeed an information system... when someone is incorrectly designing or bloating the code of a site, they run the risk of it coming back to bite them (or likely the next designer) in the arse when new needs arise for the site. It's like anything in life that gets built: if you build it to withstand any conceivable future scenario, it will be successful. If you build it just well enough that it looks like the blueprint... well, you'll just have to hope that no one ever asks too much of the site.
Search for anything and test whether the #1 site's code validates or not. In 98% of cases it will not validate.
search for anything and test whether the #1 site's code validates or not
I just tried it on 10 words/phrases that are commercially important to me.
All the No 1 sites validated.
(They are all my sites.)
I don't have a death wish, so I have no compulsion to deliberately introduce bugs that may cause my site to be skipped or downgraded by important search engine spiders.
It may be that the big sites in other areas have been lucky. Or they may have spent megabucks R&Ding the deliberate bugs that are useful vs the ones that are not.
But all you can really say from your research sample is that certain bugs (the ones in the 98% of pages you have looked at) are benign... You can't safely say that all deliberately inserted bugs are safe.
I bet that validation has NOTHING to do with your ranking, even though being validated is better.
Of course if there are glaring errors in your code, your page is not likely to render correctly in the browser, but there are a lot of ways a page can be "invalid" and spending the hours tracking down each error and fixing them could be better spent promoting your website or building content.
There are LOTS of invalid pages on the web (both from large corporations and smaller independent site owners) that rank just fine and get indexed perfectly.
There are LOTS of invalid pages on the web (both from large corporations and smaller independent site owners) that rank just fine and get indexed perfectly.
And this is why I don't worry about it - as long as my site is indexing properly, and looks good in the top 4 or 5 browsers, I'm not going to worry about it.
It's like asking about whether or not we should design for 1024x768 users or 800x600 users. Okay, it's not really like that, but it opens up just as big of a can of worms :-)
I bet that validation has NOTHING to do with your ranking
Actually, I agree to a certain extent. I don't think validation has any positive effect, but the lack of validation could potentially have a negative effect. You can download three dozen browsers and test your page in all of them, but Googlebot et al are user agents too, and important ones at that, and you can't test your pages with them.
When you have invalid markup, you just don't know how the bot will handle it. Some errors, such as a missing end quote mark, can cause a following phrase or section of content to be skipped or ignored by the bot - which has serious consequences if that content contains your most important keywords.
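To make the missing-quote scenario concrete, here is a toy text extractor in Python. This is not any real bot's parser - just an illustration of one plausible error-handling strategy: a tokenizer that treats a quoted attribute value as running to the next quote mark will swallow everything after the typo.

```python
def extract_text(markup: str) -> str:
    """Toy extractor: returns the visible text of some markup.
    Inside a tag, a double quote opens an attribute value that runs
    to the NEXT double quote - so a missing end quote swallows
    everything that follows it."""
    out = []
    in_tag = False
    in_quote = False
    for c in markup:
        if in_quote:
            if c == '"':
                in_quote = False
        elif in_tag:
            if c == '"':
                in_quote = True
            elif c == '>':
                in_tag = False
        else:
            if c == '<':
                in_tag = True
            else:
                out.append(c)
    return "".join(out)

good = '<a href="page.html">link</a> important keywords'
bad = '<a href="page.html>link</a> important keywords'  # missing end quote

print(repr(extract_text(good)))  # 'link important keywords'
print(repr(extract_text(bad)))   # '' - the keywords vanish entirely
```

One missing character, and "important keywords" never reach the index at all. A real spider may recover more gracefully, or it may not - the point is you cannot know which.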
Validation is not a panacea, but it is a measure of cluefulness. If the markup validates, then if you are ranking badly you can at least discount the possibility that your markup is the cause of the problem.
I bet that validation has NOTHING to do with your ranking, even though being validated is better.
I wouldn't be so sure... I'm pretty convinced that validated code is a significant competitive advantage.
As encyclo says, my competitors may be having parts of their pages missed by the spiders as those parts are mis-read as comments or quoted strings.
Plus, why should I take extra time to insert deliberate bugs into my code when I do not know what effect those bugs will have?
I use tools I trust that generate clean code. Anyone using a sloppy tool that produces buggy code is taking a business risk: what other things is the tool doing wrong?
That risk may be small, and the competitive advantage may be small too. But small percentages add up: validated code, sensible linking structure, spell-checked content, accessibility aids, cross-browser operation, etc. Each alone may be worth only one position in the SERPs, but the whole set makes a difference.
I have been wondering how important it is to validate your code...
The W3C explanation [validator.w3.org] sufficiently answers this question for me, and I look at search engine indexing and placement as a separate issue. As is clearly expressed in this thread, even invalid code can make you a buck.
And yes, my site comes up in search, but that doesn't matter if the site is not being shown properly.
As is clearly expressed in this thread, even invalid code can make you a buck.
I think it is more accurate to say:
So if you intend to deliberately insert bugs into your HTML, ensure they are the bugs that do not matter, not the ones that do.
Part of the problem is that there is no list of which bugs matter and which do not. So anyone inserting a bug is taking a risk with their livelihood.
It's a bit like saying: "some faults in an airplane engine still allow it to fly safely; therefore fixing any problem with an engine is a waste of time."
truezeta's problem is another issue... your site has to be cross-browser compatible. You can do this and still have invalid markup. It also depends on what you mean by EVERYONE... if you are talking blind, partially sighted, or color blind, then you have a major problem if at present it is not even working cross-browser...
check at [webxact.watchfire.com...]
> webxact.
that is plain dumb:
WebXACT is a free online service that lets you test single pages of web content for quality, accessibility, and privacy issues.
WebXACT may not fully support your browser.
At this time WebXACT has been tested with the following browsers:
Internet Explorer (version 5.5 or later)
Netscape (version 6.01 or later)
Mozilla
that does not inspire confidence.
There is little reason not to validate, as I see it.
Another can of worms ;)
In my opinion, if validating was important, then webpage editors (WYSIWYG editors) would make it so you couldn't do anything to make it invalid. Why do HTML editors allow invalid code to be used in the first place?
Firstly, I believe HTML editors and WYSIWYG editors are two different things.
Secondly, WYSIWYG editors are a little like paint-by-numbers kits: it is better, in my opinion, to hand-write the code, or at least check it by hand, if using a WYSIWYG. They are not an excuse for not understanding the basics of HTML, except perhaps in an amateur application.
My two centimos.... :)
Web pages are not brakes on your car or neurosurgery; they are documents. They can't hurt anybody.
If publishing to the Web required a test of validation at the door, there would be 500,000 pages on the web today.
Almost like a wave of evangelists ...
I know it seems important to you, it justifies your existence... let's just say it is more to do with balance... and focus... the focus being on selling/productivity/traffic etc. Code has its part, but I think you will find that more people will go to a site that gives them what they want than to a site that is well oiled but lonely...
It's a balance... it always is: content vs. file size, design vs. technical, information vs. delivery, etc. etc....
Complete validation is not the be all and end all.. IMO
If you generate valid code in the first place, you spend far less time debugging your pages to get them to render similarly from browser to browser.
It's the interoperability! And it only gets more important as different devices start being used to access webpages. I run one low traffic site in a not-at-all techy area and I've started to see handheld useragents in the stats.
This was a complete surprise to me, but because of the way the site is coded (without css, it works pretty well as a text document), it does reasonably well in some of the tiny, underpowered handheld devices--and this is true in spite of the fact that I've never tested for handheld browsers (except to the extent that I've tested it for Opera), and I have no idea about their various error-handling strategies or how, specifically, they cope with malformed code.
-b
As far as rankings - near as I can tell (out of a sample of maybe 20 sites so far) it hasn't done a darn thing with regards to Google, but it has seemed to help in Yahoo and MSN. YMMV.
And as of this morning, one of my own personal sites (which has AdSense) validates perfectly well - although as I mentioned in another post somewhere, I can't get it to display properly in Safari no matter what I do.
Not even the mighty Google is valid HTML or XHTML, and I think Google's doing pretty alright, isn't it?
The mighty Google doesn't care about a lot of things.
The mighty Google has never spent a moment worrying about its SERP ranking in Google or any other search engine.
The mighty Google has never spent a moment trying to get links from other websites... Just the opposite, in fact: it spends all day giving links to other websites.
The mighty Google has never spent a penny on buying AdWords or equivalent programs with other search engines.
The mighty Google is not a good example in this instance, I think.