Sgt_Kickaxe - 2:40 am on Dec 13, 2012 (gmt 0)
My *short* list has 42 entries right now, but I don't know about ALL the incoming links to pages that are 7 years old, so there might be a few more. Logs helped me find the ones that bring traffic, and I'm using GWT and various SEO services to try to find the *must keep* backlinks, which is not an easy task.
The important pages are not going to change; they haven't changed in years. So if the number stays under 50, can I do what Lucy suggested and then also create a static copy that bypasses the database entirely to pick up performance? Performance is my concern: I haven't pushed an .htaccess file hard enough to know where the limits start to get crossed. Would 50 of the following be better served by the php methods you described, or will htaccess handle it... e.g.
# 301-redirect the old index.php/ URLs to their clean equivalents
RewriteRule ^index\.php/(goodname1|goodname2|goodname3|goodname4)$ http://www.example.com/$1 [R=301,L]
RewriteRule ^index\.php/(goodname5|goodname6|goodname7|goodname8)$ http://www.example.com/$1 [R=301,L]
# Serve the clean URLs internally from pre-built static copies
RewriteRule ^goodname1$ /cache/goodname1.html [L]
RewriteRule ^goodname2$ /cache/goodname2.html [L]
RewriteRule ^goodname3$ /cache/goodname3.html [L]
etc. Any loss from the extra .htaccess processing should be offset by much faster load times... or is this already pushing it?
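Incidentally, here's a minimal sketch of how both halves might collapse to two rules instead of 50-odd, assuming each clean name is a single lowercase path segment (the [a-z0-9-]+ class is my assumption) and a page qualifies exactly when its static copy exists in /cache/:

# If a static copy exists for the requested name, 301 the old index.php/ form to the clean URL
# (assumption: clean names are single lowercase segments and /cache/name.html marks a "good" page)
RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
RewriteRule ^index\.php/([a-z0-9-]+)$ http://www.example.com/$1 [R=301,L]

# ...and serve the clean URL internally from that static copy
RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
RewriteRule ^([a-z0-9-]+)$ /cache/$1.html [L]

mod_rewrite evaluates a rule's RewriteCond lines after the rule's pattern matches, so $1 is available inside the -f test. That way the 50 names never have to be listed in .htaccess at all: whichever pages have a file in /cache/ get the redirect and the static copy, and everything else falls through to the CMS as before. One filesystem stat per request is cheap next to a database hit.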