| 8:13 am on Feb 26, 2007 (gmt 0)|
Can anyone please help me?
| 8:38 am on Feb 26, 2007 (gmt 0)|
Block googlebot from the old domain with robots.txt to avoid duplicate problems.
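For example, to block Googlebot from every page on the old domain, the robots.txt file at the domain root would contain just:

    User-agent: Googlebot
    Disallow: /

(Use "User-agent: *" instead if you want to keep all crawlers out, not just Google's.)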
| 9:58 am on Feb 26, 2007 (gmt 0)|
|I have some content on one of my sites, x, which is cached but currently not generating Google traffic. The content on the site is entirely original. Now my client wants to move it to a new site altogether, by removing the old cached pages from Google and moving the same content to another site, as he feels this content is more relevant on the other site, y, than on x. |
|Block googlebot from the old domain with robots.txt to avoid duplicate problems. |
The indexed content pages will still be in Google's cache, so I doubt that blocking the old domain's pages in robots.txt will resolve the duplicate content issue. Will it?
| 9:07 pm on Feb 26, 2007 (gmt 0)|
I've done it, it worked, and it's also supposed to work. Yes, Google keeps a copy of a previously cached document "somewhere" or other, but the robots.txt-blocked domain will be removed from active scoring.
When in doubt, turn to the source:
| 9:11 pm on Feb 26, 2007 (gmt 0)|
I can confirm it works as well. First, Google 404's the blocked URL and sends it to the supplemental results. I believe the old page is removed from scoring very quickly after that.
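One way to watch this happen: run a site: query on the old domain every few days and see whether the blocked pages drop out of the regular listings (the domain below is just a placeholder):

    site:old-domain.example

Once the pages stop showing in the main results, the old copies should no longer be competing with the new site.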
| 12:59 pm on Feb 27, 2007 (gmt 0)|
Thanks Tedster and CainIV for the replies; I'm now much clearer on the issue.
| 11:41 am on Mar 1, 2007 (gmt 0)|
Thanks a ton to everyone for contributing.