
W3 Validation? I don't need no stinking validation!

If the website looks good, should I really care?

         

martinibuster

3:25 pm on May 14, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



We're all supposed to use CSS, because that's the current w3 standard, but the cool stuff blows up in older browsers.
Jakob Nielsen's site is usable (I love ya', man, really), but it looks ugly.
The average surfer (who loves ya', baby?) just wants the info and in general appreciates a cool interface.

MY QUESTION: If the web site works across browsers and platforms, does it REALLY matter if it doesn't validate?

knighty

3:34 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, first I would say: how can you not use CSS?

Secondly, I would say no, it doesn't matter if it doesn't validate, so long as it works, looks good on all browsers, and is optimised for the search engines.

Some people argue that you should do it to "future-proof" your site, but anyone who isn't going to tweak their site for the next ten years or so (that's how long it will take for validation to be important) worries me.

Also, most designers will have a CSS file and a template that controls the look of the entire site, so even if you have to rewrite your code (again, not likely anytime soon) it will not take long at all.
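
For example - a minimal sketch with a made-up file name, not anyone's actual setup - every page links one shared stylesheet, so a single edit restyles the whole site:

  <link rel="stylesheet" type="text/css" href="/css/site.css">

  /* site.css - change it once and every page follows */
  body { font-family: Verdana, Arial, sans-serif; color: #333; }
  a    { color: #036; }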

DaveN

3:42 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Why wouldn't you validate it? If you take the time and effort to get it to work across all browsers (excluding Flash), why not?

DaveN

knighty, we must sort out that beer.

martinibuster

3:42 pm on May 14, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I DO use css, but only to format the lettering and links...

DaveN

3:44 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can also have the cool link to

[validator.w3.org...]

What more can I say? ;)

DaveN

victor

3:47 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If it works, it works.

But it is more likely to work if it validates. And people are unlikely to tell you the site doesn't work for them -- they'll just go elsewhere. So you need to be really sure that it does work. Validation helps a lot here.

Maybe 100% strict validation is overkill, but any validation error is a sign that you may not be fully platform-ready and future-proof. For example:

-- Is your site readily readable by Googlebot and all other spiders? (Spiders are browsers too, and you get lonely if they can't read it.)

-- How about automatic software that converts the site for WAP or iMode or other non-PC/Mac platforms?

-- How well does it work for speech-to-text or other disability-enabling software?

-- How much of a scramble will the rewrite be when IE 7 pops up?

Of course, these things are to some extent site-specific. If you don't need to work in phone browsers, or you have no legal, ethical, or commercial reasons to provide for the disabled, then those issues don't apply to your site.

madcat

4:18 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it does. My two cents...

*Any Internet-enabled device can view your content using CSS and XHTML.

*Any person can view your content; only the design is lost -- but remember, the majority of people can see your design. The percentage who can't is constantly shrinking.

*CSS/XHTML is the quickest distance between two points. View source on a table-based layout, then view source on any CSS layout (see the sketch after this post); tables are just not a smart way to work, really.

Validation means that our information is accessible. It won't take ten years for this technology to become current, either. When you start working with CSS, validating your pages, and realizing the benefits of doing so, you will have a VERY difficult time returning to tables -- this technology is refreshing, IMHO.
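
To make the view-source comparison concrete - a rough sketch, not taken from any actual site - here is the same two-column idea expressed both ways:

  <!-- table-based -->
  <table width="100%" cellpadding="0" cellspacing="0" border="0">
    <tr>
      <td width="180" valign="top">navigation</td>
      <td valign="top">content</td>
    </tr>
  </table>

  <!-- CSS-based -->
  <div id="nav" style="float: left; width: 180px;">navigation</div>
  <div id="content" style="margin-left: 190px;">content</div>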

papabaer

5:34 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Personally, I cannot think of a single reason NOT to seek validation (except laziness...) while I can think of MANY reasons for it.

Madcat nailed the essence.

Why not validate? It makes so much sense!

Axacta

7:12 pm on May 14, 2002 (gmt 0)

10+ Year Member



papabaer,

I want to take this opportunity to thank you for convincing me of the benefits of validation, CSS and XHTML.

After reading many of your previous posts on this subject, I redid my site utilizing CSS (some) and made sure it validated at W3C. I intend to eventually learn XHTML and implement that as well.

Again, thanks.

pcguru333

7:44 pm on May 14, 2002 (gmt 0)

10+ Year Member



Validation is a sign of a good designer/programmer. It will help create a better web. The web has strayed from its origins. Strict XHTML/CSS will bring the web back from browser-specific snafus and still allow for great design (which HTML was not intended for).

Ten years is a century for the Web. After all, in the space of 12 years we have had HTML versions 1-4 and XHTML 1.0, a few versions of CSS, a few versions of JavaScript, Java, ActiveX, Flash, dozens of browsers with multiple versions, etc.

VALIDATION is something that we should have been doing before XHTML. It shows professionalism. All the above-mentioned reasons for validation are also important.

Tell me a good reason for not validating that supersedes all those mentioned that are pro-validation...

Crazy_Fool

8:17 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Validation is a sign of a good designer/programmer.

Do shoppers check if pages validate before making a purchase? No.

Shoppers are more likely to make a purchase from a site that looks damn good than from a site that looks crap. They simply don't care whether your pages validate.

A little bit of CSS can make a huge difference to the overall appearance and maintainability of a site. There is no need to use the really cool CSS stuff to make a site that looks damn good to the shopper.

But anyone wanting to use the really cool stuff should maybe use browser detection or cascading style sheets so that the really cool stuff only displays on the browsers it works on, and not-so-cool stuff displays at other times.

As for me, I don't bother with the really cool CSS, just some basics in table-based sites...

AlbinoRhyno

9:16 pm on May 14, 2002 (gmt 0)

10+ Year Member



does it REALLY matter...?

Thinking like that is what has made the software industry as horrible as it is right now. Thinking like that is what makes programs unnecessarily bloated, unoptimized, and just good enough to make it until the next version.

Why should we as developers make sure our product is as close to perfect as possible? Pride? Giving the best possible product to a consumer? Feeling good about yourself, knowing that you did the best job you could?

Filipe

9:36 pm on May 14, 2002 (gmt 0)

10+ Year Member



To sum up what others are saying, basically,

If you're not part of the solution, you're part of the problem.

It's an entirely viable course of action, and it will work for your purposes - but you'd just be contributing to the problem (and it is a problem).

>> in general appreciates a cool interface

They appreciate a cool interface, but they use a usable interface. "Cool" features have never been as important as usable features.

rcjordan

9:51 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hmmm... looks like the css-artsy types are ganging up on the html heathens.

>does it REALLY matter...?

It depends on what you want. If you're a traffic junkie like me, the answer can easily be "NO!" I haven't bothered on past projects. 'Course, I'm also the guy that tossed meta tags and opted for js site nav over 2 years ago.

fathom

10:33 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> css-artsy types and html heathens <<

So where do SEOs fit in?!

pageoneresults

11:46 pm on May 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As quoted from the first edition of the PositionTech Update...

Every day, people in the support department receive email from subscribers asking how they can improve the ranking of their Web pages. Although there is no simple answer, there seem to be two common themes for Web pages that do not rank well - either the Web pages lack content or there tend to be errors in the HTML syntax. Content is king when it comes to good search engine positioning, but error-free HTML syntax is just as important.

Although we can't help you with your Web page content, PositionTech customers now have the ability to validate the syntax of their subscribed URLs in their account area. To use the syntax checker, log in to your PositionTech account and click on the "URL Listings" link. The syntax checker is highlighted by a yellow "H" to the right of your subscribed URLs.

It's not critical that your Web pages be completely free of syntax errors to rank well in the search engines, though it is important that the tags search engines use to rank your Web page are free of syntax errors. Important error-free tags include: the title, the meta description and keyword tags, opening and closing tags around page content, and tables that contain content.

We of course do not require validation for submission, but we hope this tool will help you get the most out of your Inktomi URL subscriptions.
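
For illustration only - a minimal sketch, with hypothetical titles and keywords - the tags the newsletter singles out might look like this:

  <head>
    <title>Blue Widgets - Acme Widget Co.</title>
    <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
    <meta name="keywords" content="blue widgets, widget shop">
  </head>
  <body>
    <h1>Blue Widgets</h1>
    <p>Page content goes here...</p>
  </body>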

rcjordan

12:12 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>So where do SEOs fit in?!

I think the fundamentals of today's SEO have less to do with the mechanisms involved like html and css than ever before. Domain selection, titles, content, file names, keyword selection, site structure, number of pages, links, and themes carry the lion's share of the ranking criteria. That said, I see css as allowing yet another layer of seo technique to be added to the site. Some of it, like positioning content for the best spidering, should be quite powerful... but then again, so was the table trick.

All in all, I'm drawn to css (and ssi) mostly for future-proofing and easier site maintenance. The fact that it's an seo tool as well is a big bonus.

ergophobe

12:17 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




>> shoppers are more likely to make a purchase from a site that looks damn good

All other things being equal, yes, but all other things are never equal. A site that looks great is often slow, loaded down with idiotic things (like splash pages and Flash intros) that designers think "look damn good". Ultimately, validated code depending on XHTML and CSS will *typically* (not always) be faster and lighter than code achieving a similar effect with nested tables and improperly nested tags and other challenges to the browser.

I think research bears out that sales correspond not with a site "looking damn good" but with it "functioning damn well". In other words:

- light and fast
- clear navigation
- clear pricing (a large number of abandoned sales are because people can't figure out how much they are going to get charged for shipping)
- quick and simple checkout.

These issues have little to do with validation, but I think it would be a stretch to say that validation is an inhibitor. I do not think it's a stretch to say that browser-specific tweaks could be.

Tom

pageoneresults

12:18 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



> That said, I see css as allowing yet another layer of seo technique to be added to the site.

rcjordan, this should be a concern for those of us who have been working with CSS using SEO strategies. We (SEOs) put a little twist on the positioning and put the core content right after the <body> tag. There are those who are strictly designer types, no SEO, who use CSS, but haven't taken the step to position the HTML for indexing purposes and also for degrading gracefully when CSS is turned off.
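
A rough sketch of the kind of source-order twist being described (class names and widths are made up): the core content sits first in the HTML, and the stylesheet moves the navigation into place, so spiders and CSS-off browsers see the content straight after the <body> tag.

  <body>
    <div class="content">Core content comes first in the source...</div>
    <div class="nav">Navigation links come second...</div>
  </body>

  /* stylesheet */
  .content { margin-left: 190px; }
  .nav     { position: absolute; top: 0; left: 0; width: 180px; }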

There is another part of CSS that will probably pop up on the spam radar screen, and that is using negative positioning. I won't get into the technique, but it's an area that could easily be abused. Man, every time something good comes along, there are those who will find ways to exploit it!

brotherhood of LAN

12:27 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>Man, everytime something good comes along, there are those who will find ways to exploit it!

Exactly. CSS should be something we can do innocently without someone peering over our shoulder! On the other hand, the stated benefits of positioning/reducing code are obvious.

>Negative positioning

That's quite an old one though. I read something along those lines at least a year ago: placing gifs off the left-hand side of the page, and naturally, the site in question had something like alt="website design, designs, tutorials".....
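
For the record, the sort of thing being described looks roughly like this - a hypothetical sketch, and exactly the kind of markup that can validate perfectly well while still being treated as spam:

  <div style="position: absolute; left: -2000px;">
    <img src="spacer.gif" alt="website design, designs, tutorials">
  </div>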

rcjordan

12:46 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>put a little twist

And some of us (well, me anyway) are playing with a wild concoction of CSS, SSI, and the ol' table trick. It's cross-browser (N4 = yes), liquid (with minimum widths on resize), and content right smack where the spider has to trip over it. It validates to boot!

>negative positioning

I've seen some so well done that no one is going to pick it up, not even an editor viewing source. BTW, how many of those seemingly innocent blog posts do you think are filled with that little exploit? My bet is plenty! <back on topic> But it validates.

pageoneresults

12:51 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



> That's quite an old one though. I read something along those lines at least a year ago: placing gifs off the left-hand side of the page, and naturally, the site in question had something like alt="website design, designs, tutorials".

That technique is definitely an old one. I'm talking about one where you can position an entire page of content off screen, no gifs, no nothing, just pure content. Maybe you could call it off page cloaking. ;)

brotherhood of LAN

1:01 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



An entire page? Wow, that's pretty psychedelic... no-holds-barred front-edge spamming! :)

OT from that, I tried validating one of my pages. I have 2 banners at the top of my page (you know that hunger site? ...and a rainforest site banner), and naturally, W3C states that they require an alt attribute.

I don't want to 'pollute' my page with alts related to the banners, but I want it to validate. How does everyone in here approach this? I noticed that if you use just one space and keep the mouse fairly still, the hand pointer grows a sleeve, which is fine :) but what is the best way around the validation problem?

tedster

2:06 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> negative positioning

> I've seen some so well done that no one is going to pick it up, not even an editor viewing source.

Hmmm - how about viewing the page with NN4 and JavaScript disabled? All CSS positioning vanishes without JS in NN4. It's the main reason I abandoned that little exploit.

> banners and alt

You gotta go with a minimum alt="" if you want validation. If "purist" validation doesn't matter to you, you can skip the alt attributes. I doubt there's any spider that will trip over a page just because there's an image without an alt.
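
In other words - a minimal sketch with made-up file names - an empty alt keeps the validator happy without adding banner-related text to the page:

  <a href="http://example.com/hunger"><img src="banner1.gif" width="468" height="60" alt=""></a>
  <a href="http://example.com/rainforest"><img src="banner2.gif" width="468" height="60" alt=""></a>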

knighty

8:14 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>
Tell me a good reason for not validating that supercedes all those mentioned that are pro-validating...

1) The Internet isn't going to crash because you missed out an alt attribute.
2) It's more important to achieve a goal first - traffic, sales, etc.
3) If you're a good designer, your code shouldn't use outdated tags and attributes anyway.
4) I have enough trouble making the site cross-browser and cross-platform without the extra hassle of meeting some standards which are not enforced by any browser.
5) Browsers are not going to suddenly penalise pages that don't strictly adhere to standards (just imagine the millions of pages that would be lost).
6) I'm lazy ;)
7) Your end user does not care.

I'm not trying to say you shouldn't validate your pages - I think it's great that you do - but it is not a matter of life or death.

The most important thing to me is to create a site that is fully functional, easy to use, and professional-looking. It would be nice if it validated too, but no one cares either way.

DaveN

8:37 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am working on a client's site at the moment, and he has asked specifically that his new site must be:

A - search engine friendly
B - valid to W3C standards

This is the client's fifth website now, and it's the first time he has spec'd in validation.

DaveN

Eric_Jarvis

10:41 am on May 15, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



there is a single overriding reason to work predominantly to the standards... none of us has a clue what browsers will be normal in 18 months' time... IE replaced Netscape as the number one browser in a matter of months... it is quite possible for such a thing to happen again

I don't intend to have a load of clients yelling at me trying to find out why their sites no longer work when this next happens

creating a site that works right now is not sufficient in most cases

Brett_Tabke

10:12 am on May 16, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



This is mostly a round up from all the recent discussions we've had related to browser support and validation.

How do you determine when to "cut and run" on older browser support?

First, you've got to ask yourself how many users you can afford to lose if you drop support for older browsers.

Each site is going to be different in that regard. The biggest hurdle will be figuring out how many of your site visitors are using browser X. Here is a guide to doing that:

A) Get a quality log program that reads direct, raw HTTP server logs as generated by Apache or another server.

You cannot trust online image counters or JavaScript counters to accurately count your users. Most of the "counters" that I have used have an error factor of 15-20% due to technical problems such as browser caching, proxy caching, filtering software, and unsupported browser functions. Not all users will actually download the counter image or activate the JavaScript, and some will download it on every page view. We've even seen counters that were written to run in only one browser. Those differences will lead to very slanted agent data.

A quality log program will allow you to look at the raw agent strings that are in the server logs. Products such as WebTrends, FastStats, or the free Analog can all read raw logs and generate quality reports.

B) Filtering for the number of visitors using browser X:

It is very important to look at unique visitors (an IP address) to the best of your ability. You shouldn't look at total page views, or the number of page views by any particular browser. You could have something on your site that doesn't look or work right in some browser, and those visitors may have a high abandonment rate. If that is the case, looking at page views will skew the data in favor of one browser or another.

Most log analyzers will have filters that allow you to filter down to just one page view per visitor. You can then look at each of those visitors' browser agent strings.

Something to consider is that many third-party browsers use stock agent strings that identify them as some variation of IE. Often, those browsers will place a "cobranding" name in the agent string.

Which of the following are not Internet Explorer?

Mozilla/4.0 (compatible; MSIE 5.0; Windows 98; DigExt; MSNIA)
Mozilla/4.0 (compatible; MSIE 6.0; Windows 98; Rebrowse)
Mozilla/4.0 (compatible; MSIE 6.0; MSN 2.5; Windows 98; MSOCD)
Mozilla/4.0 (compatible; MSIE 5.0; Windows 2000) Opera 6.01 [en]

The "Rebrowse" one is actually a proxy server, and the last one is actually Opera browser. In the proxy case, we have no idea what browser the user is using.

That is why it is important to look at the raw strings, and not just what the logger tries to spit out for agent numbers. There are hundreds of variations on IE and Mozilla agent strings.

I mention it because it is tempting to just use your log analyzer's default setting to identify the browser and aggregate the data. One of the analyzers mentioned above simply looks for "msie" in the string and counts all of those as Internet Explorer.

When you have a good idea of how many actual visitors are using browser X, then you can decide whether dropping support is worth it.

Just consider that if your site makes $50,000 a year and you lose 10% of your users, that could translate into $5,000 out of pocket. When you put it in those terms, suddenly dropping support doesn't sound so grand a plan.

Honestly, I can't envision too many sites that can't be fully usable in almost all browsers. I can understand some leading-edge stuff having problems, but as far as "information retrieval" from a site goes, you can at least put something on the visitor's screen - even if it isn't your Mona Lisa site, it can still be usable.

The biggest hurdle to overcome in backward compatibility and off-breed compatibility is getting your site close to validating. Simple things like unclosed font tags, out-of-place images, or proprietary tags can play havoc in some browsers. If you can get the site very close to validating, then the rest is up to those browsers - many of which you don't have access to.
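
A made-up example of the kind of small error meant here - the first snippet is what trips up validators and odd browsers, the second is the cleaned-up version:

  <!-- broken: the font tag is never closed and the image has no alt -->
  <font size="2">Welcome to our site <img src="logo.gif">

  <!-- fixed -->
  <font size="2">Welcome to our site</font> <img src="logo.gif" alt="Acme logo">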

Lastly, as many have mentioned, there is the search engine spider aspect. We have no idea just how good or how bad search engines are at parsing HTML. We do know that they base their tools on the same standards documents that the browser manufacturers do. That means if a page validates, there is a near 100% chance of that page not being rejected by a search engine due to HTML problems. We also know that, historically, "creative HTML" with intentional errors is one of SEO history's most long-standing "tricks" for higher rankings. The best way for SEs to thwart that is by using tools that expect close-to-valid HTML.

If you can write a page that a search engine will find "tasty," then there is no reason that NN3, IE3, or even Lynx can't use it too.