
Google SEO News and Discussion Forum

    
Decaching old content and moving it to a new site
How will Google treat this?
seoram

5+ Year Member



 
Msg#: 3258714 posted 11:32 am on Feb 21, 2007 (gmt 0)

Hi,

I have some content on one of my sites (site X) that is cached in Google but currently not generating Google traffic. The content is entirely original. My client now wants to move it to a new site altogether: remove the old cached pages from Google's index and republish the same content on another site (site Y), since he feels the content is more relevant there than on X.
He also does not want to shut down the old site or set up a temporary redirect to the new site, not even a redirect for the affected folder.
I would like to know whether Google will read this as duplicate content.
Thanks in advance for sharing your ideas.

 

seoram

5+ Year Member



 
Msg#: 3258714 posted 8:13 am on Feb 26, 2007 (gmt 0)

Can anyone please help me?

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3258714 posted 8:38 am on Feb 26, 2007 (gmt 0)

Block googlebot from the old domain with robots.txt to avoid duplicate problems.
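
For anyone finding this thread later, a minimal robots.txt along the lines tedster describes might look like the following. This is only a sketch; the comment and the scope of the block are assumptions, and the file must live at the root of the old domain:

```
# robots.txt on the OLD domain (site X)
# Blocks Googlebot from crawling the entire site
User-agent: Googlebot
Disallow: /
```

Note that this record only addresses Googlebot; other engines' crawlers would need their own User-agent records, and a `User-agent: *` record would block all compliant crawlers.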

ramachandra

5+ Year Member



 
Msg#: 3258714 posted 9:58 am on Feb 26, 2007 (gmt 0)

seoram wrote:
"I have some content on one of my sites (site X) that is cached in Google but currently not generating Google traffic. The content is entirely original. My client now wants to move it to a new site altogether, removing the old cached pages from Google's index and republishing the same content on another site (site Y), as he feels the content is more relevant there than on X."

tedster wrote:
"Block googlebot from the old domain with robots.txt to avoid duplicate problems."

The indexed pages will still be in Google's cache. I doubt that blocking the old domain's pages via robots.txt will resolve the duplicate content issue. Will it?

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3258714 posted 9:07 pm on Feb 26, 2007 (gmt 0)

I've done it - it worked, and it's also supposed to work. Yes, Google keeps a copy of a previously cached document "somewhere" or other, but the robots.txt blocked domain will get removed from active scoring.
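
As a quick sanity check before relying on the block, the robots.txt rules can be tested locally with Python's standard-library robots.txt parser. This is just an illustrative sketch; the domain and page path below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the old domain (site X), blocking Googlebot entirely
robots_txt = [
    "User-agent: Googlebot",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(robots_txt)

# Googlebot is disallowed from every page on the old domain
print(parser.can_fetch("Googlebot", "https://old-site.example/some-page.html"))

# A bot with no matching User-agent record is still allowed by default
print(parser.can_fetch("SomeOtherBot", "https://old-site.example/some-page.html"))
```

Running this prints `False` for Googlebot and `True` for the unmatched bot, which mirrors what the live file would tell Google's crawler.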

When in doubt, turn to the source:
[google.com...]

CainIV

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3258714 posted 9:11 pm on Feb 26, 2007 (gmt 0)

I can also confirm it works. First, Google 404's the blocked URL and sends it to the supplemental results; I believe the old page is removed from scoring very quickly after that.

ramachandra

5+ Year Member



 
Msg#: 3258714 posted 12:59 pm on Feb 27, 2007 (gmt 0)

Thanks, tedster and CainIV, for the replies. Now I am much clearer about the issue.

seoram

5+ Year Member



 
Msg#: 3258714 posted 11:41 am on Mar 1, 2007 (gmt 0)

Thanks a ton to everyone for contributing.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
Home ¦ Free Tools ¦ Terms of Service ¦ Privacy Policy ¦ Report Problem ¦ About ¦ Library ¦ Newsletter
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved