Forum Moderators: Robert Charlton & goodroi
Suppose the new pages are individually disallowed in robots.txt at launch, and then the disallow rules are removed a few pages at a time.
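For illustration, a robots.txt along these lines is what I have in mind (the paths are hypothetical placeholders, not my actual URLs):

```
User-agent: *
# New pages blocked individually at launch
Disallow: /new-section/page-a.html
Disallow: /new-section/page-b.html
Disallow: /new-section/page-c.html
# ...then delete one Disallow line every week or two
# until the whole section is crawlable
```

The idea is that deleting one `Disallow:` line at a time drip-feeds the new pages to the crawler instead of exposing them all at once.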
Has anyone tried this as a sandbox-saving manoeuvre?
Thanks!
I don't need traffic from the new pages for a while, so non-indexing is not a problem.
But I do need the full nav bar on the pages for visitors - which means links will lead from "indexed" (allowed) pages to "non-indexed" (disallowed) new pages. Does that mean Googlebot will list all of those URLs without snippets?