Forum Moderators: Robert Charlton & goodroi


robots.txt as Sandbox Guardian

control release of a large page group with robots.txt


wattsnew

2:09 pm on May 13, 2005 (gmt 0)

10+ Year Member



My site will grow by about 60% next week with a new product presentation (e.g. from 100 pages to 160). It could be sandboxed if all the new pages are indexed at once, but releasing them over days or weeks would not make sense to visitors.

Suppose the new pages are individually disallowed in robots.txt at first, and the disallow lines are then removed a few pages at a time.
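For illustration, a staged robots.txt might look like the following (the /new/ directory and file names are hypothetical); at each stage a few Disallow lines are deleted:

# Launch day: every new page blocked individually
User-agent: *
Disallow: /new/page-01.html
Disallow: /new/page-02.html
Disallow: /new/page-03.html

# A week later: the first two lines have been deleted, so those
# pages may be crawled; the rest stay blocked
User-agent: *
Disallow: /new/page-03.html

Removing Disallow lines, rather than adding Allow rules, keeps the file readable by crawlers that don't support the non-standard Allow directive.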

Anyone tried this as a sandbox-saving manoeuvre?

Thanks!

wattsnew

2:39 pm on May 13, 2005 (gmt 0)

10+ Year Member



Two additional points:

I don't need traffic from the new pages for a while, so non-indexing is not a problem.

But I do need the full nav bar on the pages for visitors, and it will link from the "indexed" (allowed) pages to the "non-indexed" (disallowed) new ones. Does that mean Googlebot will show all of those URLs without snippets?
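To verify which of those linked URLs are blocked at any given stage, here is a minimal sketch using Python's standard urllib.robotparser module (the example.com host and /new/ paths are hypothetical, carried over from the robots.txt sketch above):

import urllib.robotparser

# Hypothetical site; substitute your own robots.txt URL
ROBOTS_URL = "https://www.example.com/robots.txt"

rp = urllib.robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse the live robots.txt

# Hypothetical new pages that the nav bar links to
new_pages = ["/new/page-01.html", "/new/page-02.html", "/new/page-03.html"]

for path in new_pages:
    url = "https://www.example.com" + path
    # can_fetch() reports whether the given user agent may crawl the URL
    status = "crawlable" if rp.can_fetch("Googlebot", url) else "blocked"
    print(path, status)

URLs that stay blocked but are linked from crawlable pages can still turn up in Google as bare URL-only listings, with no title or snippet.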