
Website Technology Issues Forum

Same directory=Increased Speed
What do you think of this? And why?

 7:03 pm on Mar 30, 2010 (gmt 0)

I'm (normally) a daily reader of the Daily Sucker. Vincent's been off ill, but on the 25th he released this little tidbit [webpagesthatsuck.com]:

if you throw all your HTML, Javascript, CSS and graphic files in the same directory ... your pages will load 20-30% faster.

Huh? Why would it? Is this one of those old well known facts (yes, the sky IS blue) I've long forgotten, an accidental discovery, or a case of specific conditions and "your results may vary?"



 7:38 pm on Mar 30, 2010 (gmt 0)

Navigating to a subdirectory likely takes some time. 20-30% though? That's quite a difference.


 8:42 pm on Mar 30, 2010 (gmt 0)

Yeah, but it's not navigation. It's like the difference between these two requests, say:

GET /style.css HTTP/1.1
GET /assets/css/style.css HTTP/1.1

Unless there's something in the path that makes extra demands on the server, I completely fail to see why these would differ.

If there is any truth to this, this changes a lot of things for a lot of sites, I'd think.


 9:03 pm on Mar 30, 2010 (gmt 0)

There is one reason I can think of why uploading everything to one directory might give a (small) speed difference. If the files are at the root level, Apache parses only one .htaccess file. If a file is deep in a directory structure, Apache checks for and parses an .htaccess file in every directory along the path. But 20-30% seems like a lot to me. There are too many other variables that have to be taken into account to reach such a definite conclusion, IMO.
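If the per-directory .htaccess walk were the culprit, the usual mitigation is to disable overrides so Apache never looks for those files at all. A sketch of the relevant Apache directive (the document-root path is an assumption for illustration):

```apache
# With AllowOverride None, Apache skips the .htaccess lookup in every
# directory along the request path, so file depth costs no extra config
# parsing per request. (The path below is hypothetical.)
<Directory "/var/www/html">
    AllowOverride None
</Directory>
```

With overrides enabled, a request for /a/b/c/style.css makes Apache check for an .htaccess file in the document root, /a, /a/b, and /a/b/c on every single request.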


 4:45 pm on Mar 31, 2010 (gmt 0)

The .htaccess is a good point. But once the server's disk cache has been primed, and assuming the server has enough memory, I can't believe there's any difference on the server side. Especially in Linux, which is particularly good at quick filesystem access and parsing text files.

I wonder if there's a difference on the client side? Does it handle absolute and relative links differently? Do some links take longer to process in the browser cache?
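On the client-side question, one thing that is easy to check is that relative and absolute links collapse to the same absolute URL before the browser requests or caches anything, so the cache key shouldn't differ. A small Python sketch (the host name is hypothetical) using the same resolution rules browsers follow:

```python
# Browsers resolve every relative reference against the page URL before
# fetching or caching it; urllib.parse.urljoin mirrors that resolution.
from urllib.parse import urljoin

base = "http://example.com/articles/page.html"
print(urljoin(base, "style.css"))         # -> http://example.com/articles/style.css
print(urljoin(base, "/css/style.css"))    # -> http://example.com/css/style.css
print(urljoin(base, "../css/style.css"))  # -> http://example.com/css/style.css
```

Since the resolved URL is identical either way, there is no obvious reason the browser cache would treat a relative link differently from an absolute one.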

Regardless, as long as your site feels fast, I would definitely go for easier organization and maintenance rather than raw speed.


 7:49 am on Apr 3, 2010 (gmt 0)

From the OP:
if you throw all your HTML, Javascript, CSS and graphic files in the same directory (and modify your files to account for this), your pages will load 20-30% faster.

You missed one pertinent bit:
Of course, I wasn’t a math major.

If those figures [webpagesthatsuck.com] were accompanied by anything even remotely resembling stats (e.g. an average based on 100 tests), then I'd be tempted to wonder how and why there might be such a discrepancy.

In the meantime, I'll stick with the 80/20 approach to site maintenance, concentrating my efforts on content and presentation.


 6:08 pm on Apr 3, 2010 (gmt 0)

<snicker> Me neither, but

2517ms (empty cache, subdirectories) vs. 1864ms (empty cache, single directory)

1864/2517 = 74.06% (give or take), so the faster version takes about 74% of the time, which is right around a 26% improvement (-ish.)
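For what it's worth, the arithmetic checks out. A quick sketch, using only the two timings quoted above:

```python
# Redoing the arithmetic from the two quoted timings.
ratio = 1864 / 2517           # single-directory time relative to subdirectory time
speedup = 1 - ratio           # implied improvement
print(round(ratio * 100, 2))  # -> 74.06 (percent of the original time)
print(round(speedup * 100))   # -> 26 (percent faster)
```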

Agreed, maintenance and logic dictate that directory organization takes precedence, and we're talking about milliseconds of difference; you can gain more by stripping out white space and optimizing whatever images you have. :-) Then there's the issue of huge sites, and the problems with thousands of files in root.

I'm just curious as to the why of it, and wonder if anyone has ever seen evidence of this and knows why it would be.

brotherhood of LAN

 6:30 pm on Apr 3, 2010 (gmt 0)

Doesn't look like the times take into account the DNS lookup?


 11:26 am on Apr 11, 2010 (gmt 0)

I'm guessing, but I think it's more likely to do with browser connections, and it could be totally dependent on the specific browser (or software) used to do the testing, rather than the site actually loading faster one way or the other for most people.

I'm not sure what he used to generate the download timings shown in the linked JPGs, but I would guess it has to do with connection persistence more than anything else. A quality browser should persist the connections and load either version at the same speed, unless the use of full URLs with a different directory (rather than relative URLs) throws some of them for a loop, or something goofy along those lines. IMO, if that were the case, it would be something the browser manufacturers would fix in a hurry, because they're all trying to be the fastest. So I think my official vote is: the software he used for timing did not persist the original connections when there was a directory change, so loading files from different directories took more time, because whatever he used created a new connection when the directory changed rather than reusing the original(s) and making another request.
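That theory is at least arithmetically plausible. A toy model with assumed numbers (the RTT and file count below are made up for illustration, not taken from the post):

```python
# Toy model: if the timing tool opened a fresh TCP connection whenever the
# request path changed directory, each such file would pay at least one
# extra connection-setup round trip. (Both numbers are assumptions.)
RTT_MS = 50   # assumed round-trip time to the server, in milliseconds
FILES = 20    # assumed number of assets living in subdirectories
extra_ms = RTT_MS * FILES
print(extra_ms)  # -> 1000 extra milliseconds spent on redundant handshakes
```

An extra second of handshake overhead on a ~2.5 s page load would be right in the 20-30% range being claimed, without the server or filesystem being any slower at all.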

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved