sinyala1 - 11:55 am on May 6, 2002 (gmt 0)
A cloaking script works by checking each request against a database of known spiders: if the visitor is found in the database, it serves the cloaked page; if not, it redirects to the normal site. My site is in PHP, so all the documents are PHP. If you wrote a PHP file that checks whether the visitor is a spider or a normal web surfer, redirects surfers to the normal website, and put it before the HTML tags on all your cloaked pages, a normal web surfer would never see your cloaked site. Cached or not, he'd be redirected before he could try to see it, because the check is server-side. Then all you'd have to do is include that PHP file in every cloaked page (see the sketch below).

But that Pragma header would still be the best way to stop search engines from caching your pages. Does not letting SEs cache your site hurt your rankings?
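To make that concrete, here's a minimal sketch of what that include file could look like. The file name, the spider list, and the redirect URL are all made-up placeholders, not a real spider database; a serious cloaking setup would match against a maintained list of spider IPs as well, since user-agent strings are easy to spoof. The Pragma/no-cache headers at the end just follow the suggestion above.

<?php
// spider_check.php -- include this at the very top of every cloaked
// page, before any HTML output, so header() still works.

// Hypothetical list of user-agent substrings for known spiders.
$spiders = array('googlebot', 'slurp', 'scooter', 'lycos');

$agent = strtolower(isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '');

$is_spider = false;
foreach ($spiders as $spider) {
    if (strpos($agent, $spider) !== false) {
        $is_spider = true;
        break;
    }
}

if (!$is_spider) {
    // Normal surfer: redirect to the regular site before any cloaked
    // HTML goes out. Because this runs server-side, even someone
    // clicking a cached SERP link gets bounced before seeing the page.
    header('Location: http://www.example.com/index.php');
    exit;
}

// Spider: ask it not to cache the page, per the Pragma idea above.
header('Pragma: no-cache');
header('Cache-Control: no-cache');
?>

Then every cloaked page just starts with include('spider_check.php'); before any HTML. As for the caching question, note that the usual way to keep a page out of an engine's cache is the robots "noarchive" meta tag; whether an HTTP Pragma header alone does it is another matter.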