| 9:40 pm on Dec 1, 2005 (gmt 0)|
| 9:55 pm on Dec 1, 2005 (gmt 0)|
As anyone curious would, on reading your post I went to Google News to see if I could discern what it might be ..
and straight away got sidetracked by the following article .. which Google have filed as their first subject under the heading of
Reformed gang leader awaits death
BBC News - 3 hours ago
By Alistair Leithead. On Tuesday, 13 December the co-founder of one of the world's biggest gangs, a man convicted of murdering four people, will be led into a small room in the depths of San Quentin prison in San Francisco Bay.
they really have lost the plot at the plex if that is "entertainment" ..
| 10:43 pm on Dec 1, 2005 (gmt 0)|
>>>I already put the usual noindex, nofollow and noarchive on the page...
This only stops bots from updating the page, but it will stay in the cache virtually forever.
>>>and used the Google Url remove tool, but it is still there on the Google News cache.
The remove tool does not work unless you put a robots.txt file with a Disallow command in the root directory of your server. You can try this, since it will only block Googlebot temporarily while the page is being removed; then take the rule back out to let the bots in again. Just be careful that you only disallow the exact page you need removed, or every page on your site will be removed from Google for 180 days.
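For what it's worth, a sketch of what that temporary robots.txt might look like (the /news/old-article.html path is just a placeholder for whatever page you actually need removed):

```
User-agent: Googlebot
Disallow: /news/old-article.html
```

Note that a bare `Disallow: /` would match every URL on the site - which is exactly the 180-day disaster warned about above - so double-check the path before you upload it.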
Also, don't let your competitors know that you are doing this, since they can request a complete removal while the robots.txt file is in place (this may be what happened to WebmasterWorld after Brett made his “war on bots” public).
Finally, put up some page that you can allow indexed, because the cache may return in 180 days if you don’t give the bots something else to read in the interim. And, this is the fastest way to get things cleaned up in all of the other SEs that may have already picked up that article...
Best of luck on that one - and please feel free to sticky me if you need more advice...
| 10:50 pm on Dec 1, 2005 (gmt 0)|
Don't know how to get it removed - but you should change the landing article to something which denies everything!
| 2:01 pm on Dec 2, 2005 (gmt 0)|
I would suggest using a 301 redirect to some other page (maybe the index page) so that anyone landing on the article can't actually read it. Google News does not provide a user-readable cache of news pages.
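Assuming an Apache server with mod_alias available, that redirect is a one-liner in .htaccess (the paths and domain here are placeholders, not your actual URLs):

```apache
# Send anyone requesting the old article to the index page.
# 301 tells crawlers the move is permanent.
Redirect 301 /news/old-article.html http://www.example.com/
```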
| 4:47 pm on Dec 2, 2005 (gmt 0)|
I agree. Why not do a redirect or change the page? That would be best. Hardly anyone out there cares about cached pages except maybe us SEO nuts in here.
| 5:13 pm on Dec 2, 2005 (gmt 0)|
You don't need to use robots.txt for G's url removal tool [calendar.google.com] to work. Using meta robots is fine and, to me at least, is the safest way to remove one or a handful of pages.
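For the record, the meta robots tags in question go in the head of the page you want removed - something like:

```html
<head>
  <!-- tell crawlers not to index this page, not to follow its links,
       and not to keep a cached copy -->
  <meta name="robots" content="noindex,nofollow,noarchive">
</head>
```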
| 10:41 pm on Dec 5, 2005 (gmt 0)|
Change the content of the URL, because I know G likes refreshing existing content more than it likes adding or removing pages.
| 6:00 pm on Dec 6, 2005 (gmt 0)|
In fact, some pages never seem to go away. I know I am still trying.
| 6:12 pm on Dec 6, 2005 (gmt 0)|
I wouldn't do a redirect. If it is cached, doing a redirect would cause people (granted, only nuts like us, as mentioned above) to look for the cached page. Just change the content.
| 10:52 pm on Dec 6, 2005 (gmt 0)|
no no no - change the content of the page if speed is an issue... trust me, it's quicker!
| 1:02 am on Dec 7, 2005 (gmt 0)|
|Finally, put up some page that you can allow indexed, because the cache may return in 180 days if you don’t give the bots something else to read in the interim. And, this is the fastest way to get things cleaned up in all of the other SEs that may have already picked up that article... |
I've seen a couple of mentions about this. How does that work exactly to get them removed?
I've got a large number of pages I removed with their removal tool about 5 1/2 months ago after rearranging my site. So I'm expecting them to reappear soon. I have a 410 (Gone) response in place and was expecting that would sort them out when they reappear. However, I never see this method recommended as a solution to this problem.
Could someone comment on whether the 410 response will permanently remove these pages, and if not, why not? And then explain how putting up a replacement page which can be indexed solves the problem. What to do after they've been reindexed, for example? Is a "Hello World" page all that is required for the indexable page, or what?
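For anyone wondering how to serve that 410 in the first place: on Apache it can be done in .htaccess with mod_alias (the path here is just a placeholder for one of the removed pages):

```apache
# Return "410 Gone" for the removed page instead of a plain 404,
# signalling to crawlers that it is gone for good.
Redirect gone /old-section/removed-page.html
```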