I twice spent 20 minutes of my time on e-mail communication with the company, who pretty much told me to go play because everything is automated; both replies seemed like they were automated as well.
What have I tried to do to be “friends” with them? Everything except bad webmaster behavior.
So if you care to comment on how to remove the contents of my website from Google Index ASAP, please let me know.
1. Ban Google from crawling any part of your site. You can easily do this by creating/editing the robots.txt file located in your site's public root directory: www.yoursite.com/robots.txt
In that robots.txt file, enter the following lines:
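(The lines themselves appear to have been lost from this post; given the advice below to ban only googlebot, the standard robots.txt syntax for that would be:)

User-agent: Googlebot
Disallow: /

The empty-path form "Disallow:" would allow everything, so the "/" is what actually blocks the whole site.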
2. You will need to surf to G$'s remove URL:
Be careful to ignore G$'s suggestion there to make your robots.txt file DISALLOW all user-agents. In their typically-evil fashion, G$ just wants to make sure that if you are banning them, you quit the other SEs too. Since you are not trying to ban any other SE, you do not need to do that. Just use the above DISALLOW code to ban only googlebot. If G$ were truly a "do no evil" company, they would give instructions to ban only them, but what can you expect from such a never-ending evil company, anyway?
Anyway, that page at G$ will then have a box asking if it is URGENT to remove your site. If you are trying to remove it, then of course it is. Don't let G$'s scare-tactic there about URGENCY deter you. Simply click that link to:
G$'s automatic URL removal system
If you do not have a log in account at G$, you will first need to create one, wait for G$ to email you with access, and then you will be ready to enter your url-removals as instructed from there.
They will then remove it usually within the next 24 hours. You may follow the status of that removal-request at any later time, by re-logging in to G$ through those same URLs.
The real reason we want to remove our site from their index is this part of their replies to our request to lift any restrictions on having our site displayed in the search results.
First reply from them
The best way to ensure that your site will return for your preferred keywords is to include them on your pages. Our crawler analyzes the content of webpages in our index to determine the search queries for which they're most relevant. If you create an information-rich site that clearly and accurately describes your topic, it's likely that your site will return for your desired keywords.
Second reply from them
Thank you for your reply. Please note that our search results change regularly as we update our index. Normal changes you observe may include, but are not limited to, addition of new sites, changes in the ranking of existing sites, sites falling out of the index or getting dropped for particular keywords, and fluctuation between old and new webpage content.
We realize these changes can be confusing. However, these processes are completely automated and not indicative of wrong-doing or penalization of individual sites. We currently include over eight billion pages in our index, and it is certainly our intent to represent the content of the internet fairly and accurately.
We are not an affiliate store; every product page has a unique description and is unique by design. Yes, we have between 500 and 600 products to offer; the only sites with more are wholesalers, and that makes us a perfect candidate for AdWords. If you search for the phrase - standard widgets - we are not in the first 1000 results (not that you could see beyond that), but if you search for "standard widgets shopping", our FAQ page comes up second and third at the same time.
Sorry it all sounds like "Got Dropped by Google" stuff, but it just makes me wonder why I went to school at all....
I use a custom 404 page so that if I delete something or some other file goes missing, customers get a handy menu they can use to click back into my site instead of a scary server error message.
When you use the URL removal tool, it goes out and looks to see if the page is still there - it couldn't care less about the content, just whether the page exists. When the custom 404 page comes up, it thinks it found valid content and refuses to delete the page (probably to protect you from sphincters trying to delete your site for you).
The only way I could get it to remove my pages from the index was to disable my custom 404 page for a couple days, put in the delete request and wait till the pages no longer showed up in the SERPS. I even sent an E-mail to Google's technical support telling them it wouldn't let me delete the page and I needed help. The friendly customer service person argued with me that the pages were still there, even though he was looking at a page that had WHOOPS on it in huge letters and said the requested page was no longer present.
So if you have a custom 404 file, beware when trying to delete pages.
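The problem described above comes down to HTTP status codes: many custom 404 setups return the friendly page with status 200, so a crawler sees a live page instead of a missing one. A custom 404 page is only safe if the server still sends the 404 status with it. A minimal sketch in Python (the handler, page body, and helper names are hypothetical, just to illustrate the principle):

```python
# Sketch: serve a friendly "page not found" body WITH a real 404 status,
# so human visitors get a menu while crawlers see the page as gone.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

FRIENDLY_404 = (b"<html><body><h1>WHOOPS</h1>"
                b"<p>That page is gone - try our menu.</p></body></html>")

class Friendly404Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # In this demo every path is "missing"; a real site would check
        # disk or a database first and only fall through to this branch.
        self.send_response(404)  # the crucial part: a real 404 status line
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(FRIENDLY_404)))
        self.end_headers()
        self.wfile.write(FRIENDLY_404)  # friendly body for human visitors

    def log_message(self, *args):  # silence per-request logging
        pass

def check_status(host, port, path):
    """Return the HTTP status code a crawler would see for `path`."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("GET", path)
    status = conn.getresponse().status
    conn.close()
    return status

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), Friendly404Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    print(check_status("127.0.0.1", port, "/deleted-page.html"))  # prints 404
    server.shutdown()
```

If a check like this prints 200 for a missing page, the removal tool will see "valid content" exactly as described above, and your delete request will be refused.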
Instead of disabling the custom 404 page, you can add this tag to its <head>:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
Then submit your pages for removal and it all should work.
That way, you do not have to be without your special 404 error page.
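For instance, the custom 404 page might look something like this (a hypothetical sketch; the point is that the META tag goes inside the head):

<html>
<head>
<title>Whoops - page not found</title>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
</head>
<body>
... your handy menu here ...
</body>
</html>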