Forum Moderators: open
Over the last few months, I've redesigned my entire site using XHTML and CSS, and while I still have a lot to learn and a lot to improve, I'm pretty happy with my progress.
The way my site pages are currently laid out is like this:
BBBBBBBBBBBBBBBBBBBBBBBBBBB
BBBBBBBBBBBBBBBBBBBBBBBBBBB
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL CCCCCCCCCCCC TTTTTTT
LLLLLL CCCCCCCCCCCC TTTTTTT
LLLLLL CCCCCCCCCCCC TTTTTTT
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL CCCCCCCCCCCC RRRRRRR
LLLLLL FFFFFFFFFFFF RRRRRRR
LLLLLL FFFFFFFFFFFF RRRRRRR
B = title banner (changes by page)
L = lefthand menu (static)
C = page content (changes, obviously)
R = righthand options (static)
T = targeted content (changes)
F = footer (static)
Right now, I'm using Dreamweaver, and using templates and library files to put the fixed elements on each page.
However, I'm contemplating using SSI includes instead, for the following reasons:
1) Faster updates... no need to FTP the entire site when I change the footer.
2) Less clutter for Google and other search engines (they see the content, not the menus).
3) Faster page loading for visitors.
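For reference, the sort of directive I have in mind (file and path names are just examples; the server would need to parse the page, e.g. via a .shtml extension) looks like:

```html
<!-- SSI include: the server replaces this comment with the contents
     of the named file before the page is sent to the visitor -->
<!--#include virtual="/includes/menu.html" -->
```

So each page would carry one line per shared element (menu, footer, etc.) instead of the full markup.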
#1 is indisputable.
How about #2 and #3, though?
In particular, I'm very torn about the third issue. On one hand, I'd think that pages would load faster for folks overall, since once they loaded the first page (and its includes), all subsequent page views would pull the included files from cache. Is this assumption correct? And would there be any significant downside to loading 3-4 files for every page viewed?
Thanks very much in advance for your advice!
EDITED TO ADD:
Hmm... I think I've partly answered my own question.
An FAQ page on Apache notes that, under normal circumstances, parsed files are not cached:
[httpd.apache.org...]
The page goes on to mention that using an xbithack trick, it MAY be possible to cache parsed files, but this part isn't clear to me :(
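From what I can gather (I haven't tested this; directive and permission bits are as I read them in the Apache docs), the xbithack trick would look something like:

```apache
# httpd.conf or .htaccess, with mod_include enabled.
# "full" tells Apache to send a Last-Modified header for parsed files
# whose group-execute bit is set, which is what lets browsers/proxies
# cache them.
XBitHack full
```

Then per file:

```apache
chmod u+x page.html   # user-execute bit: parse this file for SSI
chmod g+x page.html   # group-execute bit + "full": send Last-Modified
```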
Oh, and one other thing: What about php? I have access to this on the server that hosts my site. Might this be a better alternative for including files than SSI?
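For comparison, the PHP equivalent would be something like this (filenames are just examples, and the page would need to be served as .php):

```php
<?php
// PHP pulls the file in server-side, same end result as SSI:
// the visitor receives one merged HTML page.
include $_SERVER['DOCUMENT_ROOT'] . '/includes/menu.html';
?>
```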
[edited by: ThatAdamGuy at 6:32 am (utc) on Jan. 7, 2003]
Not sure about your point #2. Google doesn't index SSI files but it crawls happily around my site. I have a lot of text based links though - one day I must drop a link in the SSI that appears nowhere else to see if Google picks it up (too scared not to have content text links for most links :)).
Your point #3 - faster than what? Previously your visitors' browsers would have cached the menus locally anyway so you probably won't notice much of a difference in this respect (I think?).
I've read there is a slight negative aspect of SSI in that the server has to parse the pages for includes each time, which is a very slight resource/time imposition (only milliseconds, from memory). Overall though, they are like black magic when updating a site ;).
[edited by: aus_dave at 6:31 am (utc) on Jan. 7, 2003]
But the server will still output the included code into the page it serves for all visitors, human, Googlebot or any other spider.
Download time is roughly the same, give or take a little extra server response time. Include files are not downloaded separately or cached separately - they are written into the code that gets sent by your server.