| 5:44 pm on Jun 25, 2002 (gmt 0)|
No problem with the number of includes. Many sites have dozens of includes for a single page before the page even starts rendering HTML.
If you want to know, you can combine your includes into fewer files and try it that way. Run Apache Bench on both versions and see what the difference is. I suspect it will be just a couple of microseconds. Perhaps if you were Google, this would matter, but at 12,000 page-views per month, I suspect that getting rid of comments and whitespace in the final HTML will save you a lot more time than getting rid of three includes.
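A hypothetical sketch of that comparison with Apache Bench. The URLs and output file names are placeholders; point them at wherever the two versions of your page actually live.

```shell
# Hit each version 1000 times, 10 concurrent requests at a time.
# page-many-includes.php / page-few-includes.php are invented names.
ab -n 1000 -c 10 http://localhost/page-many-includes.php > many.txt
ab -n 1000 -c 10 http://localhost/page-few-includes.php  > few.txt

# Compare the mean time per request reported by each run.
grep "Time per request" many.txt few.txt
```

If the two means differ by less than the run-to-run noise (repeat each run a few times to see that noise), the includes are not your bottleneck.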
| 5:50 pm on Jun 25, 2002 (gmt 0)|
I agree with ergophobe's assessment.
I have a few sites that use at least five includes and have no problems. If you start nesting includes within included files you can run into trouble, but I wouldn't worry about the setup you have.
I think I have one site that uses 14 for one section. That might be a little excessive, but I still have no problems. They are plain-vanilla pages, so it isn't a big deal.
| 5:54 pm on Jun 25, 2002 (gmt 0)|
That's what I expected, but I just needed to hear it from somebody who knows what's what before re-doing all these pages...
| 5:56 pm on Jun 25, 2002 (gmt 0)|
I use several layers of includes, easily a dozen for each page I render. Since they make coding and content updates much easier, I won't use fewer. Perhaps it is a little slower, but a good webserver makes up for it.
Do what makes the most sense. If you make database calls, they will be what slows the page down the most, and they will basically dwarf the overhead of includes.
| 5:59 pm on Jun 25, 2002 (gmt 0)|
nesting includes within included files you can run into trouble
Once again, WMW is so cool - a "simple newbie question" always seems to bring up some interesting issue!
Anyway, Jatar, what sort of problems have you run into? I do this all the time. The only problem I have run into is when there is a logic error in the script and a function declaration gets included twice (which causes PHP to choke with a fatal "cannot redeclare" error). However, you can avoid this by
1. writing decent code... but that's so much effort ;)
2. using include_once
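A minimal sketch of the second option. The file name `helpers.php` and the function `greet()` are invented for illustration; the point is that the second `include_once` is a no-op, so the function is never redeclared.

```php
<?php
// Create a small library file on the fly so the example is self-contained.
file_put_contents('helpers.php', '<?php function greet() { return "hello"; }');

include_once 'helpers.php';
include_once 'helpers.php';  // second call is skipped: no fatal redeclare error

echo greet();  // prints "hello"

unlink('helpers.php');  // clean up the temporary file
```

With plain `include` the second line would trigger the "Cannot redeclare greet()" fatal error described above.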
I have not benchmarked nested versus non-nested includes. I would think that it's the total number of disk accesses that matters, though, not where the include is called from.
| 6:24 pm on Jun 25, 2002 (gmt 0)|
I have worked on a few large Perl- and PHP-driven sites where the original designers felt free to deeply nest all of their library includes, and it slowed their page loading to a crawl.
imho, every level deeper you nest increases the processing time more than doubling the size of the first included file would.
| 7:06 pm on Jun 25, 2002 (gmt 0)|
I think what you saw was the result of including huge libraries where 90% of the library wasn't used on a given page. Just my guess, but I don't know whether nesting really does slow things down. I have this project in my head to do some PHP benchmarking and I'll add that to my list of things to test.
I should note that my initial tests suggest that, unless you're on a heavily loaded server, the main bottleneck is typically the user's phone line (until broadband becomes more general), which means it's usually better to focus on trimming extra bytes from the final page than on keeping the script light.
Not advocating bloat, just a prioritized list.
| 7:19 pm on Jun 25, 2002 (gmt 0)|
personal experience, no exact times
A website using includes 3 levels deep. I was on a T1 and the pages were too slow; if you notice a problem on a T1, there is a problem. The included files were only HTML with a few ifs, just reusable menus and the like. I took the includes from three levels deep to one, and the pages were lightning fast even on 56K. No code changed, the file sizes didn't change, all the code was necessary, and all the if statements were still processed.
My understanding is that when the server hits an include, it stops parsing the current page, looks for the included file, parses it, and then returns to finish parsing the original file.
If you have multi-level nested includes, the server is continually stopping, looking for a file, and returning, only to go off and look for yet another file. The extra time isn't eaten up in actual parsing; in my example it was spent running around looking for files. Just a rudimentary understanding, but imho that is what happens.
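The lookup cost described above can be sketched as follows. This is a hypothetical micro-benchmark, not a real-world measurement: the file names (`a.php`, `b.php`, `c.php`, `flat.php`) are invented, and the absolute timings will vary from run to run, but it shows that a three-level chain means three separate file lookups where a flattened file needs one.

```php
<?php
// Nested chain: the page includes a.php, which includes b.php,
// which includes c.php. Each include is a separate file the engine
// must locate, open, and parse before resuming.
file_put_contents('c.php', '<?php $depth = 3;');
file_put_contents('b.php', '<?php include "c.php";');
file_put_contents('a.php', '<?php include "b.php";');

$start = microtime(true);
include 'a.php';                    // three file lookups, one per level
$nested = microtime(true) - $start;

// Flattened version: the same payload in a single file, one lookup.
file_put_contents('flat.php', '<?php $depth = 3;');
$start = microtime(true);
include 'flat.php';
$flat = microtime(true) - $start;

printf("nested: %.6fs, flat: %.6fs\n", $nested, $flat);

foreach (['a.php', 'b.php', 'c.php', 'flat.php'] as $f) {
    unlink($f);                     // clean up the temporary files
}
```

On a single request the difference is tiny, but it is paid on every page view, which is consistent with the experience described above.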