
Removing Pages From Google Using Webmaster Tools

   
12:01 pm on Apr 18, 2007 (gmt 0)

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Vanessa Fox has announced an additional way to remove content from Google's index.

As always, the answer begins: it depends on the type of content that you want to remove. Our webmaster help center provides detailed information about each situation. Once we recrawl that page, we'll remove the content from our index automatically. But if you'd like to expedite the removal rather than wait for the next crawl, the way to do that has just gotten easier.

For sites that you've verified ownership for in your webmaster tools account, you'll now see a new option under the Diagnostic tab called URL Removals. To get started, simply click the URL Removals link, then New Removal Request. Choose the option that matches the type of removal you'd like.

[googlewebmastercentral.blogspot.com...]

10:39 am on Apr 19, 2007 (gmt 0)

10+ Year Member



Has anyone tried this out yet?

I have a site that I have no idea how it got indexed (it's just an IP address, no domain name!) and I want it out of Google. I put up a robots.txt three weeks ago banning access to the whole site, and it is very slowly disappearing. I figured this tool would be perfect for quickly removing the rest of the site, so I put in the request - and it was denied. How useful!
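For reference, the whole-site block described above would look something like this in robots.txt (a generic sketch; the poster's actual file isn't quoted):

    User-agent: *
    Disallow: /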

10:52 am on Apr 19, 2007 (gmt 0)

10+ Year Member



I just did for my main site.

I recently got hacked by some "jerk" who put some dodgy HTML pages in one of my folders; they redirected to absolute filth, and he built links to these pages. I really don't know how he got into my server.

I blocked them from Google with robots.txt, but that is taking a while, so I've now submitted a removal request for one of the pages to see what happens. It's currently pending.

Will let you know what gives when it is "complete".

[edited by: Pico_Train at 10:53 am (utc) on April 19, 2007]

12:28 pm on Apr 19, 2007 (gmt 0)

5+ Year Member



I have just used the tool to remove some duplicate content (60,000 pages) that my SEO company added to our site. The pages were auto-generated from searches made within the site, i.e. someone searched for blue widgets and a script created a page dedicated to blue widgets. This caused me no end of duplicate content problems and led to a 90% drop in my Google traffic.
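Assuming those auto-generated pages all live under one directory or script (the path below is hypothetical; the post doesn't say how the URLs are structured), a robots.txt exclusion covering the whole batch might look like:

    User-agent: *
    Disallow: /search/

Googlebot also honours wildcard patterns (e.g. Disallow: /*?q=) if the generated URLs only differ by query string.
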
6:08 pm on Apr 19, 2007 (gmt 0)

10+ Year Member



almost 10 hours later and still pending, hmmm, fast food counter this is not!

[edited by: Pico_Train at 6:09 pm (utc) on April 19, 2007]

1:15 am on Apr 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My first request was completed within 24 hours. Next two are still pending after 36 hours.

Just added another batch today.

3:13 am on Apr 20, 2007 (gmt 0)

10+ Year Member



This is still very buggy. I entered a few directories that are 404s and they were all denied.
8:17 am on Apr 20, 2007 (gmt 0)

10+ Year Member



Yeah, definitely a bit buggy.

I submitted a second site. All I did was put up a robots.txt excluding the whole site and then put in the request. I checked my logs carefully: there was an automated request for the robots.txt and that was it. 12 hours later the request was dealt with.

Went back to my first site, made sure my robots.txt was correctly in place, submitted the request again, and it has been denied once again.

8:25 am on Apr 20, 2007 (gmt 0)

10+ Year Member



like a charm, gone within 24 hours
10:24 am on Apr 20, 2007 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Are they gone just from the visible SERPs, and are they gone for just 6 months OR are they properly thrown away and marked "if they never exist again, never add them back"?

Time will tell.

10:34 am on Apr 20, 2007 (gmt 0)

10+ Year Member



indeed
3:07 pm on Apr 20, 2007 (gmt 0)

10+ Year Member



Google says the pages are removed for 6 months; after that, if they still return a 404, they don't show in the results. But Google doesn't say whether or not they are actually thrown away.
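If you want to be sure a removed URL is returning a genuine 404 (rather than a "soft 404" error page served with a 200 status), a quick header check along these lines does the job (Python sketch; host and path are placeholders):

    import httplib  # Python 2 standard library

    conn = httplib.HTTPConnection('www.example.com')
    conn.request('HEAD', '/removed-page.html')
    response = conn.getresponse()
    print response.status, response.reason  # expect 404 Not Found (or 410 Gone)
    conn.close()
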
6:01 pm on Apr 24, 2007 (gmt 0)

5+ Year Member



I've submitted a removal for 2 directories and 4 individual pages. The directories were denied and the pages are pending. It will be useful... once it works =/
2:00 pm on Apr 25, 2007 (gmt 0)

5+ Year Member



Well, I finally got Google to remove my two directories with the URL removal tool. It seems that even though the instructions say you need either a 404 status code, a robots.txt exclusion, or a meta noindex tag, in practice you need at least the robots.txt exclusion for the content.

When I first tried to have my directories removed I only had a 404 status code. It wasn't until I also added the robots.txt exclusion that the URL removal tool accepted the directories.

Of course my individual pages are still pending even with a 404 and robots.txt...
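For anyone following along, the combination described above looks roughly like this (the directory names are made up for illustration):

    # robots.txt
    User-agent: *
    Disallow: /old-directory-one/
    Disallow: /old-directory-two/

And for an individual page that still serves content, the meta tag option mentioned in the instructions is:

    <meta name="robots" content="noindex">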

8:51 pm on Apr 25, 2007 (gmt 0)

5+ Year Member



I have the dreaded problem of duplicate content being indexed because of my https server. Now I have seen this new tool and added a new sitemap for the https site. Then I went into the URL removal tool and get to this point:

New Removal Request
Remove your entire site

Submitting this request will remove this entire site including all subdirectories from Google search results. Learn more
Are you sure you want to remove this entire site from Google search results: https://www.mysite.com/?

[] Yes, I want to remove this entire site.

Now obviously I am very scared that if I "remove this entire site", both the http and https versions of my pages will be removed, because both are on the same domain. Does anyone have experience with this particular issue?

8:56 pm on Apr 25, 2007 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Definitely do NOT use the removal tool to deal with an https issue. It will remove your entire site for 6 months, no matter which protocol is involved.

See this thread for more information on effective ways to approach the protocol problem:
[webmasterworld.com...]
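For context, one approach often used at the time for the http/https duplicate problem (the linked thread may cover others) is to serve a separate, fully restrictive robots.txt only to https requests, for example with Apache mod_rewrite (file names are illustrative):

    # .htaccess - hand https requests a different robots file
    RewriteEngine On
    RewriteCond %{HTTPS} on
    RewriteRule ^robots\.txt$ robots_https.txt [L]

    # robots_https.txt
    User-agent: *
    Disallow: /

That blocks crawling of the https copies without touching the http version of the site.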

9:10 pm on Apr 25, 2007 (gmt 0)

5+ Year Member



OK Thanks tedster for the quick reply.
10:31 pm on Apr 27, 2007 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Good call Tedster.

There have been multiple warnings that Google just looks at the domain that was requested and not at the protocol in the URL (http vs. https).

They all get deleted together.

 
