Forum Moderators: open
wow....the table says it all.
Google better not be determining ranking with code validation [webmasterworld.com] after looking at that ;)
Fair enough SE's should have ethics, but hypocrisy will never be a good line to spin :)
nothing. it's very easy.
but browsers aren't compliant with w3c standards. that means if we want certain features, we have to either ignore w3c standards or write bucketloads of code.
and why are w3c so picky with their standards?
why aren't they consistent in their standards?
although i write non-compliant code, it works and it displays as intended. it's not like it's failing to close tags or closing tags in the wrong order. skipping w3c compliance is not always sloppiness on the part of the programmer; it's sometimes intentional and does no harm at all.
why standards?
have a look at: [webstandards.org...]
standards are important in any field, and the w3c do a great job in maintaining standards for the web.
The problem with WaSP is that they're clueless johnny-come-latelies. Zeldman frequently mentions accessibility as an advantage of xhtml/css, when in actual fact xhtml/css can be less accessible than transitional html. His advocacy of fixed font sizes has imho caused a rash of "we validate" sites whose owners assume they are accessible because they validate.
And as for saying go xhtml -> db: this is just backwards. If you're gonna need a database of content, load the database first, then generate the html/xhtml/whatever. Once you've got your content in a database you can regenerate all the pages in a flash.
Wasp go on to say that not following standards will restrict your audience. This is hokum. And remember this is the organisation that previously suggested locking folks with older browsers out of web sites in an attempt to get ppl to upgrade...
i want the x/html & css i produce to be viewable in all browsers. i want to code to a standard, and have all browsers view the page as it should be viewed.
have you tried to produce a page with no formatting in the actual html or xhtml page, and leaving it all to css? my god, it's so easy to maintain and output from a database. but, it's also nigh on impossible to get a consistent output from all browsers.
why is this? because they don't follow standards.
i don't follow standards when i'm coding client's websites. i know my clients would much prefer their website to be viewable rather than for it to be coded to a standard.
webstandards.org are just trying to make these two aims become one - we code to standards and then our pages will *automatically* be viewable by browsers.
So what has happened?
The internal forums often have suggestions for various software improvements, and higher editors and staff do follow up good suggestions as far as I have seen. As an editor in the 'standards' tree I couldn't sit by and let the current situation continue without dropping a note to someone. I expect someone else is already working on the coding further down the dmoz directory tree, but there are other issues to consider (especially with charset declarations and so on), as the Directory contains many world sites, not just English / Western-world ones.
I joined THIS forum just to post the above reply, and I will be watching it from time to time. However, I edit right at the bottom of the tree in DMOZ so do not have any influence above the voices of the other 11 000 or so editors [No commas for thousands see ISO 31], so don't expect me, or anyone else to work any miracles. I cannot tell you anything discussed in the Internal editor forums either.
Ref: Item #11 - I *DO* look at the HTML coding of the web sites that I review in the Categories that I edit. I also look at the links. If there is a site with problems I do email the owner and suggest fixes and updates *under* *no* *obligation* . Most owners have been glad of the feedback, and just about all have used the majority of the suggestions where possible. There is no *requirement* for ODP editors to look at validation, but my view in editing at ODP is that I can help build a better directory, and help submitters build better web sites. A win-win situation. My advice is FREE.
[edited by: g1smd at 8:42 pm (utc) on July 3, 2002]
Congratulations DMOZ!
Welcome to Webmaster World g1smd!
I've just been reading the requirements (very extensive and comprehensive) set for U.K. government websites; I downloaded the entire set in .pdf format. Validation using the W3C validator is a must! So is CSS! Web Standards and WAI... the U.K. requirements are stringent and admirable. I predict we will see an even greater movement towards adoption of W3C standards, CSS and the WAI in the months to come.
DMOZ picked up the gauntlet... now let's see who's next! Hey, where's Google Guy?
[edited by: papabaer at 9:28 pm (utc) on July 3, 2002]
YUK. The error file is 700 K !!
[edited by: Brett_Tabke at 9:49 am (utc) on July 4, 2002]
[edit reason] shortened long url [/edit]
Someone needs to send them a link to this thread.
P.S. Let me run that page through FrontPage, I'll clean it right up. We'll validate either HTML or XHTML in less than 15 minutes. ;)
I have looked at most of my competitors' sites and they do not validate either.
I agree that standards are important, but I think not many people know about all this. And if there are a few errors here and there, is it really that important if the page displays as it should?
I was reading through the thread again and realized that g1smd made an edit and I found the above to be a very positive and enlightening statement.
We've taken the same approach with reviewing sites for our directory. That was one of the first areas that was discussed for our guidelines...
11. HTML/XHTML Structure
We are reviewing the html/xhtml of all websites. W3C icons or other authoritative industry icons indicating that you've validated your html, xhtml, and/or css are a plus, but not a requirement.
P.S. I think we will end up changing that to being mandatory in the next few months.
Just kidding
I don't think 10 Downing Street have read their own papers...
Shame on them! Especially with such a comprehensive set of guidelines:
[e-envoy.gov.uk...]
Too bad that the site that hosts the guidelines doesn't validate....
Visit Thailand, maybe you have found a way to move a step up on your competitors...
As we noted, the other errors down in the cats look to be difficult encoding/entity type errors. Those can be real bears to fix. I'm up against the charset choice here now myself and not real sure of the direction to go.
Brett, encoding/entity and charset issues do seem to present difficult choices in seeking validation. Sounds like a topic for a new thread... Care to elaborate on the difficulties you are facing?
Hmmm, I didn't realize this was such a big problem. The day mbauser posted a reply about a <noscript> validation error we picked up, where the ampersands were not escaped, I called my programmer and he fixed it in about 30 seconds. I guess it goes much deeper than what we encountered.
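For anyone who hits the same validator error: inside an HTML attribute like href, a raw "&" between query parameters has to be written as &amp;amp;. A minimal sketch of the fix in Python (the URL here is made up for illustration):

```python
from html import escape

# Hypothetical query-string URL as it would appear in a page's markup;
# the raw "&" between parameters is what the W3C validator flags.
raw_url = "http://example.com/search?q=dmoz&page=2"

# escape() turns "&" into "&amp;" (quote=True also escapes quote marks,
# which matters inside href="..." attribute values).
safe_url = escape(raw_url, quote=True)
print(safe_url)  # http://example.com/search?q=dmoz&amp;page=2
```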
Thanks also for discussions I have seen somewhere on this site about converting spaces in web addresses to %20 and in query strings to + . All very useful info.
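The space-conversion rules mentioned above can be demonstrated with Python's standard library: quote() handles path components (%20), while quote_plus() handles query strings (+):

```python
from urllib.parse import quote, quote_plus

# In the path part of a URL, spaces must become %20
# ("/" is left alone by default, since it separates path segments).
print(quote("my folder/my file.html"))  # my%20folder/my%20file.html

# In a query string, "+" is the conventional encoding for a space.
print(quote_plus("red shoes"))          # red+shoes
```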
I must make a comment about ODP editing and contacting submitters. I try to give submitters pointers about their sites. However, I edit in an area where many pages are written by private individuals to publicise information. These are usually not business sites, and make no gain from their web sites. In other areas of ODP, there are submitters who send multiple submissions, junk listings, and multiple URL sites. No one at ODP would ever consider contacting such submitters. I only help people who are 'one-man' band web writers, not corporates who have their own department of people paid to be doing that job. You won't find many people in ODP that communicate with the 'outside world' because there is no requirement for editors to do so. Editors edit site submissions received, following the titling and description guidelines, that is all. No-one has any set workload, and can come and go as they please.
an impressive collection of research work; i'm glad to see that the ODP fares so well. makes me laugh about MSN: with all the stunts they pulled on the new rebuild, it still doesn't render correctly in Opera. not that i care that much; i just check in to make sure the site ranks well, for all them moths that like the bright lights, and then slide back out.
As we noted, the other errors down in the cats look to be difficult encoding/entity type errors. Those can be real bears to fix.
The entity errors are simple to fix ... in theory. The problem is that the URL database contains a large number of incorrect URLs. Either the data must be corrected, or the software generating the HTML pages must sort out the problems.
Note that the number of errors reported for SERP pages may vary depending on the URLs found by the search criteria used.
An issue that's probably more common than the entity error is incorrect "escaping" of the URL characters (most common is the space character that should be escaped as %20 or +). Many browsers also autocorrect URLs with "\" rather than "/" in the path (a curse from Microsoft). As a result, directories may have URLs with syntax errors.
A related problem is that some spiders cannot handle URLs with the errors described above. (Now, that's another research project.)
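A cleanup pass over a URL database along the lines described above might look like this sketch in Python (the function name and rules are my own illustration, not the actual directory software):

```python
from urllib.parse import quote

def normalise_url_path(path: str) -> str:
    """Fix the two common submission errors noted above:
    backslash path separators and unescaped spaces."""
    # Undo the Windows-style "\" separators that many browsers
    # silently autocorrect but strict spiders choke on.
    path = path.replace("\\", "/")
    # Percent-encode unsafe characters; keep "/" and any existing
    # "%XX" escapes intact.
    return quote(path, safe="/%")

print(normalise_url_path("docs\\my page.html"))  # docs/my%20page.html
```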
I did some searches on HotBot this morning, and after every one or two pages of results I would get the error page directing me to check spelling etc., as no results were found. Hit the back button, then Enter again, and get a page of results....
Sorry hotbot, I don't really care where I am in your directory as no one could find anything anyway at that rate.
Ann
Getting Closer.... One error!!
[validator.w3.org...]
No errors.
[validator.w3.org...]
Needs more work.
People are working on this. It's not highest priority at the mo.
New pages are being regenerated. There are far fewer errors throughout the whole [dmoz.org...] site.
I can find a couple of errors higher up in the cat list (closer to root where there are @ cats), but I think that is incredible.
AFAIK, that is the first search engine to be in compliance with HTML standards since Yahoo was in 1996.
Nice Work DMOZ!