Forum Moderators: Robert Charlton & goodroi
I updated my robots.txt to get rid of some dead links. I'm as paranoid as the next person, so I tested the robots.txt with Google's analysis tool before submitting it to the URL Removal tool.
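For reference, a robots.txt of the kind described above would just disallow the dead paths (these paths are made up for illustration; the actual file isn't shown in the thread):

```
User-agent: *
Disallow: /old-page.html
Disallow: /dead-section/
```

The URL Removal tool reads the Disallow lines and removes those URLs from the index.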
According to the Google URL Removal tool, I requested the removal of my entire site. Which I didn't. Then exactly one minute and one second later the log shows the robots.txt file that I did submit. For some reason it lists the contents of the file twice.
I tried to reproduce this with the same robots.txt and a couple of throwaway sites. Both times, Google behaved as I had originally expected it to. The robots.txt was listed once when submitted, and the root domain was not listed for removal. In fact, when I deliberately tried to remove the root domain as a dead URL, Google correctly indicated that the page was still there.
I have read about others using this tool without problems, but from now on I'll be using a 410 in .htaccess and a little more patience.
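For anyone curious, the 410 approach mentioned above can be done with Apache's mod_alias: the Redirect directive accepts the `gone` keyword, which returns a 410 Gone status. A minimal .htaccess sketch (paths are placeholders, not from the thread):

```
# Return "410 Gone" for removed pages (requires mod_alias)
Redirect gone /old-page.html
Redirect gone /dead-section/
```

Unlike a 404, a 410 tells Googlebot the removal is permanent, so it tends to drop the URL faster without needing the removal tool at all.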
As there is nothing I can do for six months, I guess this is the perfect time to finally change my domain name.
So, Google has to check the site from time to time to see that the page is still gone at that URL, or whether some new content has appeared there.
If it worked any other way, then once you let a page go 404 and it was deindexed, then you would never be able to put a page online at that URL ever again.
Thanks to tartle for sharing his experience with the URL Removal tool.
I have used the Google removal tool before, and within 3-4 days Google had removed the pages listed in the robots.txt file.
But at the moment I am more skeptical about the URL removal tool, as Google itself has major issues and bugs with its latest update.
What I feel like doing now, for a week, is:
- Stop the link campaign
- Stop using the URL Removal Tool
- Deactivate the Sitemaps tool - if anyone is using it, it's better to stop for a while.
- Stop all major activities for Google and on Google.
Maybe it's better to take off to some cold place for a vacation, since Matt is also on vacation just when webmasters need him - he's off enjoying himself.
What do others have to say about it?
KaMran
If it worked any other way, then once you let a page go 404 and it was deindexed, then you would never be able to put a page online at that URL ever again.
Yes, but I don't think it is reasonable to crawl de-indexed pages that don't get any new links (especially when even the old links no longer exist).
It would be the same as crawling any random fictional page on a domain.
I've had a site gone for over 6 months and the entire web site is still in the index.
If it is just returned to the index after that, then it is in compliance with what Google claims:
Pages removed using our automatic URL removal system are excluded from our index for at least 6 months regardless of whether they become available to our crawler during that time.
[google.com...]
It could be related to new and existing links, although I am inclined to think that toolbar users' bookmarks could be doing some of the messing here.