Forum Moderators: Robert Charlton & goodroi


Google Removal Tool to resolve supplemental/duplicate troubles

Looking for experienced advice...

         

phish

12:21 am on Dec 29, 2006 (gmt 0)

10+ Year Member



Hi,
I'm currently trying to solve a supplemental/duplicate content issue here with both a forum and a shopping cart.

First, the shopping cart. It's an old cart which has since been replaced with another cart. We left the existing cart in place simply because we had used it for about 3 years and knew people had it bookmarked. It fell into supplemental hell, about 300+ pages, so we removed every instance of the cart (links) from the site and blocked G via robots.txt. Now, 1 year later, they removed the pages and all seemed fine, but no... somehow they dug up about a half dozen of the pages again (again supplemental).

Second, we started a forum just to keep customers up to speed on the happenings of the site. That went badly wrong and was almost immediately supplemental because of dup content which the software itself creates.

We can live just fine without either of these. My question is: since both are being blocked via robots.txt, can I use the removal tool, and will these be gone permanently? Or will G pick them back up somewhere, somehow, in the future? Should I just dump all of them (404) and then use the removal tool to speed up the process? Any help is greatly appreciated.
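For reference, blocking both sections the way you describe would look something like this in robots.txt (the directory names here are invented placeholders; substitute your actual cart and forum paths):

```
User-agent: *
Disallow: /oldcart/
Disallow: /forum/
```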

shogun_ro

12:36 am on Dec 29, 2006 (gmt 0)

10+ Year Member



At this address [services.google.com:8882...]
you can remove outdated URLs if they no longer exist and return a 404 header.
Within 5 days all the links will disappear from G's index.
You'll get an email from G informing you about this.

phish

12:46 am on Dec 29, 2006 (gmt 0)

10+ Year Member



"if they no longer exist"

I guess my question was: can the two co-exist? If I leave the pages up, block them via robots.txt, then remove them with the URL console, will they be gone from the index? Theoretically it sounds right to me, unless of course G doesn't follow robots directives.
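One thing worth doing before filing removal requests is sanity-checking that your robots.txt rules actually match the URLs you think they do. Python's standard library can do this (the rules and URLs below are made-up examples, not your actual paths):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules blocking an old cart and a forum
rules = [
    "User-agent: *",
    "Disallow: /oldcart/",
    "Disallow: /forum/",
]

rp = RobotFileParser()
rp.parse(rules)

# Blocked paths should come back False for Googlebot
print(rp.can_fetch("Googlebot", "http://example.com/oldcart/item.cgi?id=5"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/forum/viewtopic.php?t=1"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/products.html"))  # True
```

If a supposedly blocked URL comes back True here, the removal tool isn't going to treat it as blocked either.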

zegiv

12:54 am on Dec 29, 2006 (gmt 0)

10+ Year Member



You have to put a noindex meta tag in the <head> section of all the old pages, and then use the URL console to remove them from Google's index.

With this, you can have both your old pages and your new pages on your website, and Google will only index the new ones.
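The tag being described goes in the <head> of each old page; something along these lines:

```html
<head>
  <!-- asks search engines not to keep this page in their index -->
  <meta name="robots" content="noindex">
</head>
```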

shogun_ro

1:06 am on Dec 29, 2006 (gmt 0)

10+ Year Member



Exactly!
Here are the details:
[google.com...]
Happy New Year! :)

crobb305

1:36 am on Dec 29, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would be very careful about using the removal tool. You may inadvertently remove your intended pages/homepage. Furthermore, some speculate that the removal tool only removes pages from public view. I have no evidence of this, however.

cabier

3:59 am on Dec 29, 2006 (gmt 0)

10+ Year Member



I was using the same database for my 2 websites; only the template was different. Luckily, I haven't had any duplicate content penalty. But last week I redirected my second website to another website with a 301 redirect. I assume I will no longer face a duplicate content penalty. So I think you could use a 301 redirect for this problem too.
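If you go the 301 route and the site runs on Apache, the redirect can be as simple as this in .htaccess (the domain is a placeholder for your surviving site):

```
# Permanently redirect everything on this host to the other site
Redirect 301 / http://www.example.com/
```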

zegiv

9:07 am on Dec 29, 2006 (gmt 0)

10+ Year Member




I would be very careful about using the removal tool. You may inadvertently remove your intended pages/homepage.

You can't delete a page unless it returns 404 or has the noindex meta tag. That condition has to be true when you submit the request, and it must still be true 5 days later when Google deletes the page from its index.

I've deleted 12,000 pages over two months. If you try to delete a valid page (a 200 without noindex), you'll get an error and Google will do nothing more.

Remember that Google doesn't require any verification of ownership for your site, so if this were possible, anybody could delete any pages from Google's index.
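That rule of thumb can be sketched as a simple pre-flight check. This is only an illustration of the conditions described above, not how the tool itself works, and the function name and its signature are invented:

```python
def removal_eligible(status_code: int, html: str = "") -> bool:
    """Rough pre-flight check mirroring the removal tool's conditions:
    the page must return 404 (or 410), or carry a noindex robots meta tag."""
    if status_code in (404, 410):
        return True
    # crude check for <meta name="robots" content="...noindex...">
    lowered = html.lower()
    return '<meta name="robots"' in lowered and "noindex" in lowered

# A live 200 page without noindex is rejected, as described above
print(removal_eligible(200, "<html><head><title>Cart</title></head></html>"))  # False
print(removal_eligible(404))  # True
print(removal_eligible(200, '<meta name="robots" content="noindex,follow">'))  # True
```

Running something like this against each URL before submitting would catch the error case (a valid 200 page) before Google does.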