Performance hit for Redirect 301?
All I've heard is theory -- who's got numbers?
So conventional wisdom has it that the more Redirect 301s you have in your .htaccess file, the slower your server will be at delivering pages, because it has to check all of those directives first. But how much slower? I looked all over the SERPs and WebmasterWorld but I couldn't find anything useful. I'm particularly interested in a comparison, such as the speed to deliver pages with 0, 50, 100, and 500 redirects respectively. Anyone know where to find information like that?
How about on your server? That's the best place to test for impact, since all servers and sites will be different. Cut and paste 500 copies of a redirect that refers to a non-existent file, so it won't interfere with any real files on your server, and then measure the impact that checking those 500 redirects has on serving your non-redirected pages.
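One way to set that up without pasting by hand is to generate the dummy redirects with a short shell loop; the filenames and domain here are placeholders for the test:

```shell
# Generate 500 Redirect 301 lines for files that don't exist, so they
# never match a real request; append the output to .htaccess for the test.
for i in $(seq 1 500); do
  echo "Redirect 301 /no-such-file-$i.html http://example.com/gone-$i.html"
done > redirects-test.txt
wc -l < redirects-test.txt   # expect 500
```

You can then time an ordinary, non-redirected page before and after appending those lines, e.g. with `curl -s -o /dev/null -w '%{time_total}\n'`.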
Also, if there is any pattern to your redirects, you can often handle more than one file with a single redirect.
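For instance, if your old URLs share a structure, mod_alias's RedirectMatch lets one line cover a whole family of them. The paths and domain here are made up for illustration:

```apache
# Send everything under /old-articles/ to the same name under /articles/:
RedirectMatch 301 ^/old-articles/(.*)$ http://example.com/articles/$1
```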
I added about 20 redirects, and at least 300 IP address and user-agent checks, to my .htaccess over time, and couldn't see any difference in performance -- server execution time is still completely swamped by internet transmission time. Many sites are completely script-based anyway, so a few more lines of server-native directives in .htaccess won't make much difference.
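For reference, the kind of IP and user-agent checks mentioned above might look like this in Apache 2.2-style syntax (the bot name and address range are invented):

```apache
# Block one made-up user agent and one made-up address range:
SetEnvIfNoCase User-Agent "EvilScraper" block_me
Order Allow,Deny
Allow from all
Deny from env=block_me
Deny from 192.0.2.0/24
```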
I would be interested in seeing your results, though.
Servers are not my strong suit -- I don't know how to run such a test myself.
Uh, I guess I could have a Perl script get the time, call a redirected page, and then have the landing page get the time after it loads so I can compute elapsed time?
That's one of the reasons it's better to put your redirects and other .htaccess directives directly into your httpd.conf file, if you have access to it. Anything in that file is loaded into the server's memory when Apache is restarted, so the server knows what needs to be redirected and doesn't have to read your .htaccess file every time someone requests a page.
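If you do get httpd.conf access, you can also switch off the per-request .htaccess lookup entirely, which is where most of the overhead lives. A sketch, with a made-up document root and URLs:

```apache
# In httpd.conf; loaded once when Apache (re)starts:
<Directory "/var/www/html">
    AllowOverride None   # stop checking for .htaccess on every request
</Directory>
Redirect 301 /old-page.html http://example.com/new-page.html
```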
Although really, any server should be able to handle even a fairly large .htaccess file without any noticeable overhead. If your .htaccess file is resulting in a noticeable performance hit, you probably need to upgrade your server.
I'm afraid I don't have access to the httpd.conf file, and I'm not experiencing a "noticeable" performance hit, but then it's kind of hard to notice these things. Anyway, I cracked open my copy of the Perl Cookbook and ran some simple benchmarks.
Bear in mind that my server is part of a shared hosting plan with who knows how many other websites running on it, so my tests weren't exactly a controlled environment. Still, the conclusion seems clear: no noticeable performance hit from using 100 redirects vs. 0, or from using AddHandler server-parsed .html for SSI vs. not having that line in the file.
use strict;
use Time::HiRes qw(gettimeofday);

print "Content-type: text/html\n\n";
my $t0 = gettimeofday;
# (the request being timed goes between the two timestamps --
#  e.g. fetch the test page here with LWP)
my $t1 = gettimeofday;
my $elapsed = $t1 - $t0;
print "<P>$elapsed";
Times are in seconds. I ran each test six times.
Add: A redirect for the file I'm processing
0.247492074966431 << Is it only slow the first time because it gets cached?

Add: 99 redirects before the one that will actually be called

.htaccess contains only: AddHandler server-parsed .html
Call a 24k file with an SSI directive in it
0.1545729637146 << Is it only slow the first time because it gets cached?

Empty .htaccess file
Call the same 24k file
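For anyone who wants to repeat runs like these without writing Perl, curl can report per-request timing on its own; the URL below is a placeholder (it defaults to a local file just so the loop runs anywhere):

```shell
# Time six requests in a row, as in the tests above.
# Point URL at the page you actually want to measure.
URL="${URL:-file:///etc/hostname}"
for i in $(seq 1 6); do
  curl -s -o /dev/null -w '%{time_total}\n' "$URL"
done
```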