Forum Moderators: open
When logging in at [services.google.com...]
all the URLs I previously removed (404 or disallowed by robots.txt) now show the status "request denied".
And when I do a search, these pages (or PPT files) are appearing on Google again :(
I also tried to remove a few additional URLs using the meta tag method, but all of them were rejected with the message "We could not detect any meta tags". Is this because I used a googlebot meta tag instead of a standard robots meta tag? Or maybe because the URLs I submitted are dynamic pages?
I would make sure that you've done everything that is stated on that page for removal. If that doesn't work, I would try emailing them, or signing up for another account might help. Have you ever had a URL removed from your account?
Yes, I'm sure that I have done everything right for removal.
Except in the case of pages disallowed with a googlebot meta tag, but it would be strange if Google didn't recognize their own meta tag...
> have you ever had a url removed from your account?
Yes, they were previously removed, and I had received approval confirmations. But now they are reappearing in the index...
From Google's Remove Page:
If you do not have access to the root level of your server, you may place a robots.txt file at the same level as the files you want to remove. Doing this and submitting via the automatic URL removal system will cause a temporary, 90 day removal of your site from the Google index. (Keeping the robots.txt file at the same level would require you to return to the URL removal system every 90 days to reissue the removal.)
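To illustrate the quoted instructions, here is a rough sketch (the `/forum/` path is just a hypothetical example, not from Google's page): the robots.txt file goes at the same level as the files you want removed, rather than at the server root.

```
# Hypothetical layout: http://www.example.com/forum/robots.txt
# placed at the same level as the files to remove (not the root).
# Submitted via the URL removal console, this triggers a temporary
# 90-day removal, so it must be re-submitted every 90 days.
User-agent: Googlebot
Disallow: /
```

With root access, the normal approach is a single robots.txt at the root listing the paths to disallow, which avoids the 90-day re-submission cycle.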
- the pages I previously removed via the URL console:
All these pages are 404 or disallowed with robots.txt (I have root access).
The removal succeeded at first, but after about a week they were back in the index...
- the pages I'm trying to remove now:
This time, I used a meta tag: <meta name="googlebot" content="noindex">
Why a meta tag and not adding these pages to my robots.txt?
Because the URLs look like this: -http://www.monsite.com/forum/viewtopic.php?TopicID=262
So I can't exclude these pages via the robots.txt method.
I can't submit these pages for removal via the URL console; they are automatically rejected. Do you think this is because they are dynamic pages, or because I used a tag excluding only Google instead of the normal robots meta tag?
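For comparison, here are the two tag variants side by side. The googlebot tag is the one described above; the robots tag is the standard form the removal tool's checker may be expecting (this is a guess based on the "We could not detect any meta tags" error, not something Google confirms):

```html
<!-- Google-specific tag (the one used above): -->
<meta name="googlebot" content="noindex">

<!-- Standard tag, applying to all crawlers: -->
<meta name="robots" content="noindex">
```

It might be worth temporarily adding the standard robots tag to one of the rejected pages and re-submitting it, to rule out the tag name as the cause before blaming the dynamic URLs.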