The issue I'm grappling with is that the most efficient method would be to write one template file and parse the various XML files as needed. The problem is that the SEs will only see the template file.
So what I'm looking for is a way to show the search engines links that look like somefile.xml or somefile.html and, when followed, run the template script and deliver the correct HTML.
I've heard this could be done via .htaccess and/or mod_rewrite. Before I go try to figure out exactly what I need to do, I'd like to know if:
1. what I want to do makes sense
2. may get me into trouble with the SEs
3. could be done differently/more efficiently
As always, your insight would be most appreciated!
PHP reads the URL, extracts $cat = cat1 and $sub = sub1, and then goes and finds all the info about them. I do this for menus and content a lot. The script figures out where it is in the directory structure and loads all pertinent info based on the URL.
Similar to mod_rewrite and similar to GET strings, but all done with pure PHP and MySQL.
I don't know, maybe mod_rewrite is easier depending on the number of pages/links you need to have.
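A minimal sketch of that URL-parsing approach, assuming a path like /cat1/sub1/page.html; the function name parse_path() and the variable names are illustrative, not from any particular setup:

```php
<?php
// Hypothetical sketch: pull $cat and $sub out of the request path
// without mod_rewrite (e.g. from $_SERVER['REQUEST_URI']).
function parse_path(string $uri): array {
    $path  = parse_url($uri, PHP_URL_PATH);          // strip any query string
    $parts = array_values(array_filter(explode('/', $path)));
    return [$parts[0] ?? null, $parts[1] ?? null];   // [$cat, $sub]
}

[$cat, $sub] = parse_path('/cat1/sub1/page.html');
// $cat is 'cat1', $sub is 'sub1'; look up menu/content for them here
```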
Just the way I have done things a lot of times. It seems to make perfect sense when it's in my head, but I don't seem capable of explaining it today. Think I'll have to get more coffee.
I guess my answers to your questions were something like this:
3. differently - yes
more efficiently - don't know, probably equal
BTW, if you want to use mod_php then the MIME type would need to be application/x-httpd-php, not application/x-httpd-cgi.
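Under Apache with mod_php, that could be done with a directive along these lines (a sketch only; the exact directive and placement vary with the Apache and PHP versions, and newer installs typically use AddHandler or SetHandler instead):

```apache
# Hand .xml URLs to mod_php instead of serving them as plain XML
# (assumes mod_php is loaded; check your install's preferred handler directive)
AddType application/x-httpd-php .xml
```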
If all your pages are produced by the same script, then a rewrite rule like this will be more generic:
RewriteRule ^(.*)\.html$ foo.php?f=$1.xml
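Put together in an .htaccess file, that might look like the following sketch; it assumes mod_rewrite is enabled and that foo.php and the .xml files sit in the same directory:

```apache
RewriteEngine On
# Map /anything.html onto the template script, handing it the
# matching XML file; [L] stops further rules from being applied.
RewriteRule ^(.*)\.html$ foo.php?f=$1.xml [L]
```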