Forum Moderators: phranque
RewriteRule ^(.*)/(.*)/guestbook/(.*)-(.*).htm$ guestbook/posts.php?group=$1&location=$2&hotelname=$3&offset=$4 [nc]
RewriteRule ^(.*)/(.*)/guestbook2/(.*)-(.*).htm$ guestbook/posts2.php?group=$1&location=$2&hotelname=$3&offset=$4 [nc]
These are working perfectly, along with my robots.txt, to 'hide' the original PHP files...
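For reference, the robots.txt mentioned here presumably blocks the real script paths directly. A minimal sketch of what that could look like (the exact paths are an assumption, since the actual file isn't shown in the thread):

```
User-agent: *
Disallow: /guestbook/posts.php
Disallow: /guestbook/posts2.php
```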
However, I would like a 301 [R=301,L] redirect from the /guestbook2/ folder to the /guestbook/ folder, in order to send all spiders to the main one (guestbook). I want guestbook2 to be invisible. Is this rule correct, or do I need something else?
RewriteRule ^(.*)/(.*)/guestbook2/(.*)-(.*).htm$ /$1/$2/guestbook/$3-$4.htm [R=301,L]
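As written, that rule won't work: the substitution side of a RewriteRule cannot contain pattern groups like (.*); it must use the backreferences $1 through $4 captured by the pattern. The external redirect also needs to sit before the internal rewrites so it fires first. A sketch of how the combined .htaccess could look, assuming it lives in the document root (note that a plain rule like this redirects users as well as spiders):

```apache
RewriteEngine On

# External 301: send any /group/location/guestbook2/ URL to its
# /group/location/guestbook/ twin (applies to users and spiders alike)
RewriteRule ^(.*)/(.*)/guestbook2/(.*)-(.*)\.htm$ /$1/$2/guestbook/$3-$4.htm [NC,R=301,L]

# Internal rewrite: map the friendly guestbook URLs onto the real script
RewriteRule ^(.*)/(.*)/guestbook/(.*)-(.*)\.htm$ guestbook/posts.php?group=$1&location=$2&hotelname=$3&offset=$4 [NC,L]
```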
Thanks in advance; any ideas will be appreciated.
What about users? If you are redirecting only spiders, then this is cloaking, and that brings in many more complications than you may be prepared to deal with. You will need to maintain an up-to-the-minute list of spider IP addresses (through your own daily research or by paying for it) in order to avoid being detected and possibly banned. Is that what you want to do?
Jim
My main target is to avoid spiders reaching the second guestbook folder, since both guestbooks have the same content and I might get penalized (duplicate content). So I see two ways: redirect all guestbook2 URLs to guestbook (for spiders only, not users) using [R=301,L], OR use a robots.txt rule to 'hide' the guestbook2 folder. What do you think?
Thanks in advance again.
Otherwise, you will need to implement --and maintain daily-- a complete list of all search engine robots' IP addresses and user-agent names. If you miss one and it spiders your duplicate content, that's going to be a problem.
Jim
One last question, please...
RewriteRule ^(.*)/(.*)/guestbook2/(.*)-(.*).htm$ guestbook/posts2.php?group=$1&location=$2&hotelname=$3&offset=$4 [nc]
As you can see, the guestbook2 folder is not an existing folder on the root; it is rather a virtual folder created by .htaccess, and it is preceded by two other dynamic folders (^(.*)/(.*)/guestbook2/...). Is there any way I can use a wildcard so that my robots.txt rule applies to all of them?
something like:
Disallow: */*/guestbook2/
Thanks again very much!
[google.com...]
[help.yahoo.com...]
[search.msn.com...]
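The pages linked above document each engine's wildcard support. Since the two leading folders are dynamic, a wildcard rule along these lines could work for the major spiders; keep in mind that wildcards are an extension supported by Google, Yahoo, and MSN, not part of the original robots.txt standard, so smaller bots may ignore it (the exact pattern here is a sketch, not tested against your URLs):

```
User-agent: *
Disallow: /*/guestbook2/
```

In Google's implementation, * matches any sequence of characters including slashes, so /*/guestbook2/ would match /group/location/guestbook2/ paths regardless of the folder names in front.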
Jim