Validating code vs. Code that works

That argument again....

joshie76

11:39 am on Dec 13, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In another thread [webmasterworld.com] the old validating code discussion got started (again), which got me thinking :o.

Personally I have never worried too much about writing 100% validating code as long as my code is clean, and works in as many browsers as necessary (for internet work as opposed to web-applications). Things like the leftmargin and marginheight attributes don't validate but they're a neater solution for the majority of browsers than trying to play with CSS etc.

The argument for validating code has always been that SE spiders would rather eat validating code. Whilst this is, at best, speculative, is it not reasonable to consider that, since the SEs are providing a service to searchers, they may have (indeed are likely to have) coded their spiders to gobble code that is commonly used by the majority of sites out there? Things like leftmargin and topmargin. For all we know an empty alt attribute in an image could upset a spider, yet validation says this is the preferred option over having no alt attribute at all.
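For reference, the margin attributes in question and their rough CSS equivalent look something like this (a sketch: leftmargin/topmargin are IE extensions, marginwidth/marginheight are Netscape's, and none of them validate under HTML 4):

```html
<!-- Proprietary, non-validating, but widely understood: -->
<body leftmargin="0" topmargin="0" marginwidth="0" marginheight="0">

<!-- The validating CSS alternative: -->
<style type="text/css">
  body { margin: 0; padding: 0; }
</style>
```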

I accept there is no real way to test this but I did try the next best thing. I had a look at the really popular search trends (google zeitgeist etc) and went searching on some of the big engines. Not one, get this, not one that I tried returned a top result that was a validating page.

Not the most controlled of experiments but an interesting point for discussion. Anyway I'm off to remove my details from my profile in case Xoc tries to hunt me down.

Eric_Jarvis

12:32 pm on Dec 13, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



actually you miss the point of having valid markup

firstly, being able to run the page through a validator will pick up coding typos more thoroughly than the human eye...so I always run templates through a validator

secondly...the reason to code to standards rather than to a browser is that things can change very fast...especially since it is very likely that there will soon be an explosion of use of mobile devices to access the web...I don't design for now...I design for now AND the near future

tedster

12:38 pm on Dec 13, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> ...they may have (indeed are likely to have) coded their spiders to gobble code that is commonly used by the majority of sites out there

I'm certain you're right about this, joshie. And spiders have the potential to be VERY forgiving of bad code -- because they don't need to render the page on a monitor, but only need to strip it down to what the algo needs for crunching purposes.

So elements like leftmargin or proprietary tags most likely don't matter, even though, strictly speaking, they don't validate.

But what about things like nesting block level tags within inline tags? Spaces before a ">"? Unclosed <td> or <h1> tags? My guess is that flaws like these have the potential to LESSEN your rank, because content that you think will be analyzed when the algo chews on your page may not be included in the analysis at all. If the content's function on the page can't be sussed out, the text in that part of the HTML may simply be abandoned by the algo out of "self defense".

knighty

2:23 pm on Dec 13, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So as long as the spiders can read your content and you do not have any errors in your code, it shouldn't make any difference to your rankings.

Or are we saying that validated pages will be ranked higher as some kind of reward?

I would imagine that the vast majority of pages on the web would fail the validation test.

Personally, I only design for web browsers so as long as it looks right and works cross browser I don't care.

I'm not going to pull even more hair out worrying whether or not it passes some obscure set of standards written by a bunch of nerds.

When my pages start failing in IE 16.5 and Opera 18 then I might start worrying. I can't see Browsers suddenly penalising poorly constructed web pages in the immediate future or the WWW will consist of about 5 pages.

tedster

4:33 pm on Dec 13, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> I can't see Browsers suddenly penalising poorly constructed web pages

You're right, of course, it will be a long and gradual journey, not a sudden one. But that journey has begun.

Current generation browsers are already checking the DTD and then rendering the page accordingly. The transitional HTML 4 DTD will be the saving grace for our legacy code for a long time. But at some point, the browser default setting will be for strict HTML.
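For reference, the DTD declarations in question look like this (how each one affects rendering is browser-dependent; current IE and Mozilla builds use the presence of the full DOCTYPE to choose between "quirks" and standards rendering):

```html
<!-- Transitional DTD: tolerates legacy presentational markup -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">

<!-- Strict DTD: no font tags, no body margin attributes; layout goes to CSS -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Strict//EN"
    "http://www.w3.org/TR/html4/strict.dtd">
```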

To get the kind of extended functioning we want from future browsers, the requirement will be stricter code. The trade-off would be bloated, buggy browsers that try to accommodate too much.

Brett_Tabke

5:11 am on Dec 27, 2001 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



The argument for valid code isn't about SE spiders, but about browsers. You can't test your code out on all the available browsers.

The newer browsers all spoof IE agent strings, so you don't have a clue who is using what browser anymore. Even the js agent is being spoofed by many browsers now.

It's one thing to use an extended html element for a specific browser like marginheight, but it's another to count on the same error recovery across browsers and platforms. For example, take the infamous table errors in Netscape 4.7. One misplaced closing tag can render your page blank in NN 4.7 while IE/Opera/etc. users see the page in all its glory.
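To illustrate (a contrived sketch, not any particular real page): one missing closing tag that most browsers silently recover from, but that can blank the rest of the page in Netscape 4.x:

```html
<table>
  <tr>
    <td>Navigation</td>
    <td>Content
  </tr>   <!-- the </td> above is missing: IE and Opera recover,
               Netscape 4.7 may render nothing past this point -->
</table>
```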

Counting on the browsers to error correct the same way for invalid code is dicey.

idiotgirl

6:42 pm on Dec 30, 2001 (gmt 0)

10+ Year Member Top Contributors Of The Month



Valid code has become my Waterloo this last week. Why I woke up one morning and decided I could become the do-all end-all of CSS mastery is about the dumbest idea that's ever possessed me. I have spent three straight days on ONE sitewide style sheet tweaking and cutting and nixing and pasting.

At one time I considered myself quite the little ticket as far as table layouts went. I felt there wasn't anything under the sun I couldn't do with a table.

Then I woke up and figured out that flexible content delivery for different browsers and devices is the only way to go - and, I s'pose, tables have no future in the real world. Or so the experts say... Hence the ongoing CSS fiasco.

I haven't been able to figure out which side of the fence to sit on for weeks about this whole css vs. tables thing and how I'm going to keep all the browsers happy.

I gave up on the results making me happy. My site hasn't been this plain since 1995.

Not to mention that unless you dedicate your life to box hacks, your visions of the tight layouts of your tables-of-the-past are a dreeeeeeam world!

I've been killing font tags, tr's, and td's faster than swatting flies in July - and my basic portal template has gone from what I once considered tight and pretty... to fuzzy slippers and an old bathrobe based on screen percentages and almost-valid CSS.

It may make the validators happy but I can't stand to look at the end result any more.

Oh, and it still doesn't validate. w3c is going to ban me any second, I can just feel it. I'm sure their logfiles say, "Oh, it's her again" next to my IP.

I'm about ready to throw in the towel and go text-only or WAP. ugh.

Hobbyist

2:09 pm on Dec 31, 2001 (gmt 0)

10+ Year Member



idiotgirl

I don't know, I'll stick with tables for layout for now. I hear some browser support for CSS is still a little dicey. Plus I still see some very old browsers being used that definitely have no CSS support. You don't get those problems with tables.

I wouldn't rely on CSS for now, IMHO. It can lead to horrible results, especially if the color fails to take effect.

I always test to see that even if css fails, the text is still readable.

Also I'm specifying 4.0 transitional, so that still allows the use of font tags. Strictly speaking, there's no need for CSS at all.

Still css is pretty cool, if only that it helps reduce the size of your pages a little.

tedster

8:18 pm on Dec 31, 2001 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



CSS reduces more than file size, it also reduces the time a browser takes to render the page. In a simple layout this may not be very obvious. But for a more complex page, using CSS and divs can get content in front of your visitors' eyes several seconds faster.

Still, layout tables can be valid code -- no doubt about that -- and that's the issue we were discussing here. My own choice is a mix. I try to keep tables to a minimum, but sometimes they are still the best answer. However, the more I work with CSS, the more I discover that many of the limitations I thought it had were actually mine. FONT tags? I don't use them any more, period.

I agree with idiotgirl that trying to serve up a universally readable site leads to VERY BORING PAGES. So I stopped worrying about being universally readable.

The improved conversion that comes from a somewhat "jazzy" page is important. Even if the total number of people who can see the page the way I intended goes down by 4%, if the conversion rates double on the remaining 96% - that's an improvement. It's the business purpose that drives me, not a love of pure code.

idiotgirl

8:44 pm on Dec 31, 2001 (gmt 0)

10+ Year Member Top Contributors Of The Month



Worked through the weekend on the godawful mess. I now have a hybridized version that - yes - uses tables to hold the header cell, inner mess, and footer cell, which is configured with CSS across three columns (left menu | center content | right menu).

The CSS is for padding, fancy font stuff, yadda yadda. But it isn't RELIANT on CSS for display. I stuck to plain old divs for formatting and straight <p>, <h3>, etc. for content.

I tried to make it so the content will display whether they're browsing text-only, or without CSS, or in full color with CSS enabled. The plain version's ugly without CSS but MAN did I cut down on my html code bloat.

In addition to CSS I also specified background, text, alink, vlink, etc. in the body tag of the page itself for "just in case". Don't know if that's kosher or considered redundant.
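For what it's worth, that belt-and-braces approach might look like this (the colours are placeholders; the body attributes are deprecated under the Strict DTD but fine under Transitional):

```html
<head>
  <style type="text/css">
    body   { background: #ffffff; color: #000000; }
    a:link { color: #0000cc; }
  </style>
</head>
<!-- Fallback attributes for browsers with no CSS support at all -->
<body bgcolor="#ffffff" text="#000000" link="#0000cc"
      vlink="#551a8b" alink="#ff0000">
```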

Probably won't validate, anyway :(

At this point I just can't go to straight CSS and all the box hacks to get the job done. Too risky and too time consuming. When CSS-only goes bad in a browser, it really goes to h*** in a handbasket.

w3c probably has me on their 10 most wanted list.

Brett_Tabke

6:50 am on Jan 1, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I don't see how you can get rid of tables at this point. There are way too many browsers that don't support enough CSS, or the right CSS, to make it a universal solution.

re: css reduces download and render time.

Stands to reason doesn't it? It's one of the main arguments for css put forth by the css community. I don't think it is entirely true.

After you build up a quality CSS file, it can be very significant in size. CSS files in the 5 to 10k (or greater) range are not uncommon. I think if you go all out CSS, it will take at least 10k for an average site.

What's the most important page view you receive? I think it's the first view a new user ever sees at your site. It must be the fastest. Any download speedup by a CSS file does not occur on that first page view, it occurs on subsequent views.

There is also the fact that the CSS file is another connection a browser has to make in order to download the file in the first place. Network latency and slow-responding servers always make a second download stretch into more time. That's added page load time above and beyond if the code were inline.

Dicey browser caches: there are many ISP software setups out there (cable especially) where caching is a thing of the past. Compare your CSS file pulls - many users will download that CSS file on every page view. No speed gain there.

Those that do cache the file, will often issue last modified checks on every request. That alone will add more time.

On the other side, there are sites that will see measurable gain from external CSS files. Take for example the forum here where we have a very high page-per-user ratio (averages 12-15 with 50-100 a day not uncommon). We could definitely see speed gains from using external CSS.

Net result? I don't think we can say that external CSS files effectively speed up a site in all (most) circumstances.

I've looked at the rendering issue too. Like you Idiotgirl (btw: we aren't buying the nick), I keep trying to eliminate tables. I just am not confident enough to do it in this environment.

Sometimes rendering speed is increased in one browser, and other times it is not. I think it all depends on the browser and system combo. The difference is negligible on my system.

tedster

8:33 am on Jan 1, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You've pretty much summed up the CSS issue, Brett. I just want to add that a 10k css file (text) usually gets some heavy duty modem compression, so it will beat a 10k image file dl by a good bit. Nevertheless, you've made some very valid points about time taken for that critical first page.

In terms of writing valid code (tables are still valid, after all) I keep a local validator on my machine and I use it right along as I write my HTML -- often before I view any changed code in the browser and always before an FTP. I do this mostly to catch code typos: unclosed <td> tags and things like that. I'm a copy and paste maniac with code snippets and I often get sloppy.

I don't keep the validator set for letter-perfect W3C code, but it does catch most major no-no's and it keeps me from chasing my own tail as I work on a site.

knighty

8:56 am on Jan 2, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hmmm

I always thought the main reason for using External CSS was to make your site a heck of a lot easier to change/modify as everything is controlled by ONE file.
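That benefit comes from every page pointing at the same file (the filename here is just an example):

```html
<!-- In the <head> of every page on the site: -->
<link rel="stylesheet" type="text/css" href="/styles/site.css">
```

Edit one rule in that file and every page picks up the change on its next load.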

tedster

2:47 pm on Jan 2, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's certainly one advantage for the author.

My understanding is that the MAIN purpose is to separate rendering instructions from the content and its logical structure. In this way, very complex visual and/or aural rendering becomes possible. Control of on-screen appearance becomes more fine-tuned, almost to the level we've reached in print materials. Printer instructions, aural browsers, hand-held devices all get a major boost.

As Brett points out, if you build up a very complex CSS it may mean MORE code, at least for the first download. But it will make building the code for various browser devices a more logical affair.

For instance, how often have you seen (or written) this kind of non-valid code:

<font size=-1>
<p>Lorem ipsum dolor sit amet.
<p>Consectetuer adipiscing eli.
<p>Sed diam nonummy nibh euismod tincidunt.
</font>

Instead of the very tedious, but proper:

<p><font size=-1>Lorem ipsum dolor sit amet.</font>
<p><font size=-1>Consectetuer adipiscing eli.</font>
<p><font size=-1>Sed diam nonummy nibh euismod tincidunt.</font>

The major visual browsers will render both examples just fine. But being able to handle several block level elements (p) wrapped inside one inline element (font) has meant complex browser "spaghetti code" which contributes to browser bugs. The non-valid HTML can also confound non-visual browsers.

Moving the rendering instructions for <p> into a style sheet cleans up the content and enables different kinds of rendering devices. If you create a very consistent "look" from page to page, this might mean simpler maintenance. That's your choice as an author. But if you really delve into fine-tuned CSS instructions, you may find that all the new possibilities make things even more complex.
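Concretely, the stylesheet version of the example above might look like this (a sketch; font-size: smaller only approximates size=-1):

```html
<style type="text/css">
  /* replaces the <font size=-1> wrappers on every paragraph */
  p { font-size: smaller; }
</style>

<p>Lorem ipsum dolor sit amet.</p>
<p>Consectetuer adipiscing eli.</p>
<p>Sed diam nonummy nibh euismod tincidunt.</p>
```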

There's quite a crazy genie in the bottle we call stylesheets.

papabaer

9:51 pm on Jan 3, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hello Everyone! I do a lot of CSS experimentation, I do have some global CSS files, many shared, and quite a few page unique files. Again, I am trying many different things.

Most of my CSS is embedded in the <head> section of my pages, again, primarily for me to tweak... Of the files I have uploaded to my styles folder, the largest is just 4kb and that has everything in it but the kitchen sink! Most are 1kb.

The page-weight reduction when switching to CSS and eliminating all the old clutter is remarkable... You should also look at it from one more perspective, the code vs. content ratio as applied to SEO, again, a pretty dramatic variance.

I have not used <FONT> tags for over a year. I also validate new pages using XHTML 1.0 Transitional. The code is CLEAN and very easy to go back to if I want to update.

I think one of the main problems in adopting CSS and validation is a matter of perspective. It is like the old joke that ends with, "you can't get there from here!"

Change the mindset first. CSS gives a designer tremendous freedom, and that, unfortunately, is not something we have been used to.

The harder I try, the luckier I get....

Brett_Tabke

2:39 pm on Jan 12, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Code vs on-the-page content is very appealing. The problem I run into is there are so many exceptions to the 'template' approach. If you have a varied site at all, it is going to take quite a few specific classes.

The biggest problem I still have with CSS is I find most sites done in CSS don't look as good as if they had been done in stock html.

Xoc

4:26 am on Jan 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use CSS almost exclusively for fonts and colors. So my pages look just about exactly as good as pure HTML. I don't rely on any of the margin stuff, positioning stuff, or any of the other stuff that CSS advertises, but doesn't work. If you stick to fonts and colors, it works great, and cross-browser. For the older browsers (and the IE in my new iPaq), they gracefully degrade by ignoring the CSS, so the text comes out black in a standard size.
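A sketch of that fonts-and-colours-only approach (the font faces and colours are illustrative): a browser that ignores the style block simply falls back to default black text in a standard size, which is exactly the graceful degradation described.

```html
<style type="text/css">
  body { font-family: Verdana, Arial, sans-serif; color: #000000; }
  h1   { color: #336699; }
  a    { color: #0000cc; }
</style>
```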