Forum Moderators: Robert Charlton & goodroi
I removed my entire site from Google's index 97 days ago, for a couple of reasons:
1 - Some of the URLs Google had discovered on my site were generated by my site's users.
2 - The number of these URLs was very large (more than 280,000), and I could not delete them with the single-URL removal tool in Google Webmaster Tools.
Now I have changed my robots.txt so that all of the undesired URLs are blocked, and I want to see my other pages in the SERPs (Search Engine Results Pages) again. I also added sitemaps to Google Webmaster Tools, and Google has now started to crawl my site. 97 days have passed, but I still see the Re-include button, and 'Site Removal' is shown as 'Removed'.
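For reference, the blocking group in my robots.txt looks roughly like this (the path here is only illustrative, not my real one):

User-agent: *
Disallow: /user-generated-path/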
I know that after 90 days Google automatically re-includes removed URLs, but that is not what I see in the SERPs when I search Google for 'site:www.mydomain.com'.
I am not sure whether, if I click the 'Re-include' button, all my URLs will come back (desired and undesired alike). Can you help me, please? What can I do to get my URLs back into the index?
Thank you very much
I want to keep the URLs that are blocked in robots.txt working on my site without any change; I just do not want Google to index them. Right now they are deleted, but my other pages are deleted too, including my homepage, and when I search Google for 'site:www.mydomain.com' there are no results. 97 days have passed and my URLs are not back. Must I wait longer, or can I click the 'Re-include' button?
Deleting all content to get rid of just some content was pretty draconian :-)
I re-included my entire site, and now all of my pages are back in Google's SERPs, but more than 5,600 unwanted pages are back in the index too. All of these pages have URLs like this:
http://www.example.com/mypage.asp?para=(some string)
After my site came back, I have been trying to remove the 5,600 pages one by one. The unwanted pages have been blocked in robots.txt for 115 days, but they were re-included anyway.
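To make sure the rule itself really blocks those URLs, I checked it locally with a small script (a minimal sketch using Python's standard robotparser; note that it only understands plain prefix rules, not Google's extended * wildcards):

# Minimal local check that a Disallow rule blocks the parameterized URLs.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /mypage.asp?para=
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The parameterized page should be blocked; an ordinary page should not.
print(rp.can_fetch("Googlebot", "http://www.example.com/mypage.asp?para=xyz"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/default.asp"))          # True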
And now my questions:
1 - Is there another way to remove pages whose URLs follow the same pattern?
2 - Why, after I added 900 URLs to Google's removal tool, does it not accept any more? An error appears like this:
*There was an error processing this request. Please try again*
Thank you
[edited by: tedster at 4:19 pm (utc) on April 4, 2009]
[edit reason] switch to example.com [/edit]
User-agent: *
Disallow: /mypage.asp?para
If the above is already in your robots.txt, then don't hurry: Google quite often keeps showing old URLs and reporting them as errors, and these may diminish (or disappear completely) after some time, even months.
This is exactly my case: ***If the above is already in your robots.txt, then don't hurry: Google quite often keeps showing old URLs and reporting them as errors, and these may diminish (or disappear completely) after some time, even months.***
Google says that if you block URLs with robots.txt, the pages will not be crawled, but the URLs can still be shown in the SERPs, without anchor text or a cache link, for months. If instead you allow the URLs in robots.txt and add a noindex meta tag, they are removed permanently after the first crawl. So now I want to delete my entire site again, allow the URLs in robots.txt, and add a noindex tag to those pages. After 11-15 days I will re-include the entire site; I hope that during this time all of the pages will have been crawled at least once and deleted forever. After that, my bad pages will have been deleted permanently and my other pages will reappear in the SERPs.
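The tag I would put in the <head> of each unwanted page is the standard robots meta tag:

<meta name="robots" content="noindex">

Note that this only works while the URLs are *not* blocked by robots.txt, because Googlebot has to fetch a page before it can see the tag.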
Is this method correct?