Forum Moderators: open
First of all, thanks a million to everybody working on this portal. This place is amazing. I'm so happy I found it.
I had posted the same CSS question in like 4 other forums, and I received my first answer HERE, even though I posted the other questions in the other forums like *hours* before posting here.
Second, another thing that's making me fall in love with WebmasterWorld is the speed. Gosh, these forums are super fast. I frequent many other forums in other places, but I've never found something faster than this. So may I know the secret, please? I know it's not like you can teach me to develop the same thing by replying to this post, but I mean what technologies exactly are used to produce such results? :)
Thank you, everyone.
Thank you, freeflight.
Could you also tell me about this "compressed pages" thing? Is it only for high-end websites with heavy traffic, or do normal websites use it sometimes too? How expensive or technically hard is it to administer for a forum-oriented website without a big budget?
To paraphrase the great American talk show host who had a long career in the middle of massive competition: "Do one thing and do it better than anyone else. Don't worry about what the other guys are doing - take care of what you do best and everything else will take care of itself." - Johnny Carson 1969.
FlatFiles. They still smoke the doors off every db system

Berkeley DB is fast, yes... but nowadays, with modern dual CPUs, the bottleneck is usually disk I/O. At 500 SQL queries/sec, the main DB server here with 6 SCSI disks is 60% idle while the disks are at 85% utilization. Most (if not all) high-traffic sites run into exactly the same issue (and after that, the internal network is next).
tell me about this "compressed pages" thing?

http://sourceforge.net/projects/mod-gzip/
you can also do it 'by hand' - in a mod_perl environment it would look like this (when the client sends 'Accept-Encoding: gzip'):
use Compress::Zlib ();  # provides memGzip

# client advertised gzip support, so compress the body before sending
$html = Compress::Zlib::memGzip($html);
$r->header_out('Content-Encoding', 'gzip');
$r->header_out('Content-Length', length $html);
$r->send_http_header();
$r->print($html);
Easy!
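For readers not on mod_perl, the same idea translates directly to other stacks. Here's a rough sketch in Python - the function and variable names are mine for illustration, not from any particular framework:

```python
import gzip

def maybe_gzip(body: bytes, accept_encoding: str):
    """Compress a response body when the client advertises gzip support.

    Returns (body, extra_headers) so the caller can send correct headers.
    """
    if "gzip" in accept_encoding.lower():
        body = gzip.compress(body)
        return body, [("Content-Encoding", "gzip"),
                      ("Content-Length", str(len(body)))]
    return body, [("Content-Length", str(len(body)))]

html = b"<html><body>hello forum</body></html>" * 100
compressed, headers = maybe_gzip(html, "gzip, deflate")
print(len(html), "->", len(compressed))  # repetitive HTML compresses very well
```

The key point is the negotiation: you only send gzipped bytes when `Accept-Encoding` says the client can handle them, and you set `Content-Encoding` so it knows to decompress.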
The bottleneck on a good db has little to do with disk i/o speed and everything to do with the efficiency of the lookup algo.
I stick to flat files because file systems are the oldest and most efficient database system available - we let it do the work.
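As a rough illustration of the "let the filesystem do the work" idea - a minimal Python sketch where each thread is one flat file and the thread ID is the filename. The layout and names here are my assumptions for the example, not WebmasterWorld's actual scheme:

```python
import os
import tempfile

# One directory per forum, one flat file per thread: the filesystem's own
# directory lookup stands in for a database index.
root = tempfile.mkdtemp()

def save_thread(forum: str, thread_id: int, text: str) -> None:
    os.makedirs(os.path.join(root, forum), exist_ok=True)
    with open(os.path.join(root, forum, f"{thread_id}.txt"), "w") as f:
        f.write(text)

def load_thread(forum: str, thread_id: int) -> str:
    # A single open() by path - no query parsing, no lookup code of our own.
    with open(os.path.join(root, forum, f"{thread_id}.txt")) as f:
        return f.read()

save_thread("css", 42, "First post!")
print(load_thread("css", 42))  # → First post!
```

Fetching a thread becomes one path lookup plus one sequential read, which is exactly the access pattern filesystems have been optimized for over decades.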
The slowest part of the system here is the actual perl that runs the show. C or ideally ML is the only way to go faster.
perl: precompiled perl (e.g. mod_perl) / precompiled php is amazingly fast... I don't think C would be that much faster (unless you don't need perl's great features such as hashes/regexes etc.). Per dedicated perl/php dual Xeon 2.8 box I am able to serve 65-70 fully dynamic + gzipped SQL page requests per sec here (both mod_perl / php, with much more 'junk' (userpics, friends etc.) than WebmasterWorld has). High-traffic forums make about the same; wikipedia a little bit less, but overall even more thanks to intelligent squid/proxy caching.
(*) actually that might have been the issue since you have to sort through many old threads on the index pages