Just wondering if everyone can clarify something for me. We all know that web pages should be written so that they load in a browser as quickly as possible, but I've been reading conflicting advice on how to do this. On the one hand, some people say we should remove all embedded JavaScript and CSS, put it into separate files, and link to those instead; because this reduces the raw size of the HTML file, the page will then load quicker. On the other hand, other people say that each time the browser has to request an external file, it slows down the loading process. Is there a definitive answer? Thanks for your time.
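To make sure we're all talking about the same thing, here's a rough sketch of the two approaches (style.css and main.js are just made-up names):

<!-- Approach 1: everything embedded in the page itself -->
<head>
<style type="text/css">
p { margin: 0; }
</style>
<script type="text/javascript">
function greet() { alert('hello'); }
</script>
</head>

<!-- Approach 2: the same code moved out into external files -->
<head>
<link rel="stylesheet" type="text/css" href="style.css">
<script type="text/javascript" src="main.js"></script>
</head>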
each time the browser has to request an external file, it slows down the loading process.
I've heard this argument before and it is valid: a new TCP connection takes time to establish. However, assuming the external file sits on the same host as the page, the DNS resolution is already done, so I seriously doubt we are talking about a significant amount of time, and I doubt you'll see any measurable difference.
And on some connections it will actually be faster, because the HTML and the external file can be downloaded in parallel.
However, the most important reasons are those suggested by AmericanBulldog: SEO and caching. Once the browser has the external file in its cache, repeat page views only have to download the HTML itself.
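If you want browsers to actually hang on to those external files, you can tell them how long to cache them. A minimal sketch, assuming an Apache server with mod_expires enabled (the one-month lifetime is just an example):

# .htaccess -- let browsers cache CSS and JS for a month,
# so repeat visitors only re-download the HTML
ExpiresActive On
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"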
Look at it this way: if your scripts and CSS are so gosh-awful large that you are worrying about them affecting load times... well, then you have some major "de-bulking" to do.
By all means, keep as much code separated from content as you are able. Keep your text relevant, and stay focused.
Code clean, write well, design with intent...
Best of luck!
Just to add to what has already been said: separating out the CSS, etc. means that those on special browsers don't need to download it, e.g. PDAs or browsers for the blind. From a purist perspective, it's just extending the notion of separating presentation from content.
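You can take that a step further with the media attribute on the link tag, so each class of browser only fetches the stylesheet meant for it (the file names are made up):

<link rel="stylesheet" type="text/css" href="screen.css" media="screen">
<link rel="stylesheet" type="text/css" href="handheld.css" media="handheld">
<link rel="stylesheet" type="text/css" href="print.css" media="print">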
It also forces you to think with a view to re-use (defining classes rather than styling things in-line), and that makes for smaller code, as the example below shows.
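For instance, instead of repeating the same in-line style on every element, you define it once in the stylesheet (.bodytext is just an example class name):

<!-- in-line: the same style repeated on every paragraph -->
<p style="font-size: 0.9em; color: #333;">First paragraph</p>
<p style="font-size: 0.9em; color: #333;">Second paragraph</p>

/* re-use: defined once in the external stylesheet */
.bodytext { font-size: 0.9em; color: #333; }

<!-- and referenced by class in the page -->
<p class="bodytext">First paragraph</p>
<p class="bodytext">Second paragraph</p>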
I don't think it will have an effect on the amount that a search engine needs to download, though. The search-bot still needs to check that the site is not trying to play silly tricks, like making a whole bunch of content for the benefit of the search-bot only, hidden from real visitors. (At least, I hope that most search-bots do check, but I am not an expert...)
I agree that the overhead of an extra file is not significant, but I don't think that is really the issue anyway. We should think more in terms of bandwidth usage, which drives speed, rather than raw speed. If raw speed were that much of an issue, there would be far fewer sites driven by PHP, CGI, etc., and a move towards templated static sites.
Also try doing a 'site search' using search terms such as 'reduce size'. Here are some other threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
Shawn