Forum Moderators: phranque


XBitHack vs. server parsing or changing file extensions

Will this create problems with the search engines?


AndyA

6:28 pm on Mar 21, 2008 (gmt 0)

10+ Year Member



I have about 100 pages that I need to do an include on, as they will need to be changed regularly going forward. They are currently using .html extensions, and are well ranked in all the search engines.

I don't want to change the extensions to .shtml to do SSIs, nor do I want to load down the server by having it parse all .html files, and I've read that XBitHack allows the server to parse only files with 744 permissions.

I've read that doing this has caused problems with some pages losing their ranking in the search engines. So, instead of using "XBitHack on" in .htaccess (which apparently prevents Last-Modified headers), you should use "XBitHack full" to allow Last-Modified on pages that have the group execute bit set. Is this a security risk?

This is new territory for me, so I want to make sure this is correct. Any input will be appreciated.

jdMorgan

4:43 am on Mar 22, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've got several top-ranked pages using XBitHack full, so no worries there, as long as you have the X bit set for both "owner" and "group" (754).
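To make that concrete, here's a minimal sketch of the setup (this assumes a Unix host whose Apache has mod_include enabled and allows XBitHack in .htaccess; "page.html" is just a stand-in filename):

```shell
# Create a sample page so chmod has something to act on
touch page.html

# 754 = rwxr-xr--: the owner execute bit tells Apache to parse the
# file for SSI directives; the group execute bit (with XBitHack full)
# lets Apache keep sending a Last-Modified header
chmod 754 page.html

# The matching directive for .htaccess
echo 'XBitHack full' > .htaccess
```
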

On some hosting setups, these bits get reset or modified whenever a new version is uploaded, and it becomes burdensome to remember to check or re-set them every time you upload. On such hosting accounts, I suggest renaming the files to .shtml, and using mod_rewrite to internally rewrite requests for page.html to page.shtml. This can be done one page at a time, or in a grouped rule, or completely automatically. For example, in .htaccess:


# Internally rewrite .html URL to .shtml file if the .shtml file exists
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/$1.shtml -f
RewriteRule ^(.+)\.html$ /$1.shtml [L]

Since the SSI processing depends on the filename and not on the URL, these methods work well.

If you have only a few pages you need to do this for, use a "manual" approach; the "automatic" code above is expensive in terms of CPU cycles, because it will call the filesystem to do an extra "file exists" check for every .html URL requested from your server.
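A "manual" rule set might look like this (page1/page2 are hypothetical filenames -- substitute your own pages):

```apache
# Rewrite only the specific pages that use SSI; no file-exists
# check is needed because the targets are named explicitly
RewriteEngine On
RewriteRule ^page1\.html$ /page1.shtml [L]
RewriteRule ^page2\.html$ /page2.shtml [L]
```
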

Jim

AndyA

4:13 pm on Mar 28, 2008 (gmt 0)

10+ Year Member



Thanks for the information, Jim.

I'm going to try whatever is easiest on the server, because I don't want to slow it down at all. So I'll try XBitHack first and just change the permissions on the pages that need to be server-parsed.

Hopefully I can set up my FTP program to remember the permissions on the files that have been changed, so when I upload them it will be automatic.

I appreciate the assistance!

[edited by: AndyA at 4:15 pm (utc) on Mar. 28, 2008]