The real question for someone leaving bugs in their HTML is: do you feel lucky? 99 out of 100 bugs may be benign, but that other one may be the one that causes Google (or others) to ignore all or part of the page.
A research survey a while back showed that 99% of web sites are NOT valid...
I believe that every page should be valid anyway, so that current and future visitors to your site see the same content and design.
As far as affecting the SERPs, IMO valid sites are going to rank better. Sites that aren't valid, especially ones full of dirty Word HTML, throw in unnecessary tags that only make the pages less relevant overall.
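To illustrate what I mean by unnecessary tags (a made-up fragment; the MsoNormal class and mso-* styles are what Word typically exports, not taken from any real page):

  <!-- Word-export bloat: nested spans and proprietary mso-* styles -->
  <p class="MsoNormal" style="mso-margin-top-alt:auto"><span
  style="font-size:10.0pt;font-family:Arial"><b>Our widgets</b></span></p>

  <!-- The same content as clean markup -->
  <p><strong>Our widgets</strong></p>

The crawler has to wade through all that style noise to find the three words that actually matter.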
Sam.
A good example would be the phpBB software. If you want to submit an official add-on or plugin, it must be coded according to their standards. That way, the learning curve is only learning the phpBB standard, and any other phpBB programmer who glances at your code should easily be able to read it.
Validation does the same thing. If you're not using validation as the standard for writing your web pages, then what standard are you using? A validator flags sloppy markup like the fragment below and pushes everyone toward one consistent way of writing it.
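(A made-up example; the file name is hypothetical:)

  <!-- Against an XHTML doctype the validator flags all of this:
       unclosed <p>, unquoted attributes, missing alt text -->
  <p>Welcome to our site
  <img src=logo.gif width=120>

  <!-- Valid equivalent: what the validator steers you toward -->
  <p>Welcome to our site</p>
  <img src="logo.gif" width="120" alt="Company logo" />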
This is just an experiment. On all the pages except the index page, I have removed almost all the keywords and all the alt descriptions, and cut my meta tags down to the bare minimum...
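To give an idea of how bare it is now (details changed and names made up, since it's a client's site):

  <head>
  <title>Example Widgets - Regional Widget Supplier</title>
  <!-- keywords/description meta tags stripped for the test -->
  <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
  <link rel="stylesheet" type="text/css" href="styles.css" />
  </head>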
I have respectable backlinks... none paid, some reciprocal... many more than the sites ahead of me in the SERPs...
Let's see if I move up...
I have a DMOZ regional listing
I have GREAT position in Yahoo and MSN...
I don't rank in the top 100 except for one search term... (a three-word phrase)
I may give it all up for traffic from google...the "Holy Grail" of se listings...
(of course it is a client's site but what the hey...)
I feel that zero-SEO, bumbling web pages that don't validate (and some are template pages) rank better because Google likes that... let's see how we do...
two months...
The only thing I haven't done is remove the doctype... since I use CSS, removing it would send my pages into quirks mode and break them, and I do not want that...
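(If anyone's wondering, the doctype is just the declaration on the very first line of the page. For example, the standard W3C XHTML 1.0 Transitional one looks like this; take it away and browsers drop into quirks mode, where the CSS box model behaves differently:)

  <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">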
One more thing... the SERPs generally show between 1,900,000 and 2,900,000 results for these search phrases, so it might not be proof for everybody...
Wish me luck?...
There is an overall quality aspect:
with validating pages, the author shows his professionalism and that he both knows and cares about his craft. It is simply a sign of quality. Since everybody can look at the HTML source, it should just be clean.
If I were to ask someone to build a complex website for me, I would look at some of their reference pages first.
There are specs. There is best practice.
Would I ever buy a database driven application from someone who can't even get 20 lines of simple html written down without errors?
Probably not.
My own pages validate, and I have not yet seen any negative effects that would lead me to conclude that Google explicitly dislikes that.
Regards,
R.