

Why is googlebot asking for a directory that doesn't exist anymore?

     
7:00 am on Oct 3, 2006 (gmt 0)

10+ Year Member



Hi,

Why does googlebot keep spidering a directory that no longer exists?

We used to have a links directory, but we deleted it about seven months ago. Googlebot is still spidering that directory even now.

Does this behavior have any side effects? Is this the reason our pages keep dropping in the SERPs for many keywords? Should I block googlebot from spidering that non-existent directory in robots.txt?

12:34 pm on Oct 3, 2006 (gmt 0)

10+ Year Member



Does anybody have any ideas about this?
12:40 pm on Oct 3, 2006 (gmt 0)

10+ Year Member



That usually happens to me when I have left a link on some page buried somewhere.
1:38 pm on Oct 3, 2006 (gmt 0)

10+ Year Member



I used to have a links directory, and I deleted it over seven months ago, but the robots still want to "eat" that directory, so I get a lot of 404 errors and I don't know what to do now. I also don't know whether it will affect the site's ranking.
2:17 pm on Oct 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Assuming dramstore is right (which I would assume!), you'd want to start by looking for pages that still have links to that vanished directory.

Then you'd want to remove any links you found.
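
If it helps, one quick way to hunt for those leftover links is to crawl your own pages and flag anything that still points at the old path. A rough sketch in Python, assuming the old directory was /links/ and www.example.com is your site (both are just placeholders):

    import re
    import urllib.request
    from urllib.parse import urljoin, urlparse
    from collections import deque

    START = "http://www.example.com/"   # placeholder: your own homepage
    OLD_DIR = "/links/"                 # placeholder: the deleted directory

    seen, queue = {START}, deque([START])
    while queue:
        url = queue.popleft()
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue
        for href in re.findall(r'href=["\'](.*?)["\']', html, re.I):
            target = urljoin(url, href)
            if urlparse(target).path.startswith(OLD_DIR):
                print(url, "still links to", target)
            # only follow links that stay on your own site
            if urlparse(target).netloc == urlparse(START).netloc and target not in seen:
                seen.add(target)
                queue.append(target)

That only catches links on your own pages, of course; links from other sites pointing at the old directory would need a backlink check instead.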

2:52 pm on Oct 3, 2006 (gmt 0)

5+ Year Member



I have the same problem.

There are hundreds of pages listed in Google that haven't existed for at least a month. They are not listed in my sitemap.txt, and I am 99.999% certain that there are no links to them from any other pages on my site.

I am trying to recover my hits, which have recently plummeted to almost zero after about 10 good years.

11:48 pm on Oct 3, 2006 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Google will spider every URL that they have ever seen to see if the status of that URL ever changes again.

This is not a problem.

A problem occurs if they start showing and ranking that page in the normal index again.

They do often show 404 pages in the Supplemental index for a year after the content is removed.

Continue to serve a 404 for that URL.

11:58 pm on Oct 3, 2006 (gmt 0)

10+ Year Member



The problem is that the robot requests the removed directory and its pages far too often, sometimes several hundred times per day. It looks like they want those pages first, before spidering the other pages.
12:02 am on Oct 4, 2006 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Make sure that they really do get served a 404 status code for each one.
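
If you want to double-check, you can request one of those dead URLs yourself and look at the status code that actually comes back. A minimal sketch in Python (the URL is just a placeholder):

    import urllib.request
    import urllib.error

    # placeholder: one of the URLs googlebot keeps asking for
    url = "http://www.example.com/links/oldpage.html"

    try:
        resp = urllib.request.urlopen(url)
        # a 200 here means the server is NOT returning 404 for the dead URL
        print(url, "returned", resp.getcode())
    except urllib.error.HTTPError as e:
        # a clean 404 (not a 200 "not found" page) is what you want to see
        print(url, "returned", e.code)
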
12:04 am on Oct 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could always add the non-existent directory to your robots.txt and block it from being scanned.
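
For what it's worth, that would just be a Disallow rule for the old path. A small sketch, assuming the deleted directory was /links/, that also sanity-checks the rule with the robots.txt parser in Python's standard library:

    # the robots.txt on your server would need something like:
    #
    #   User-agent: *
    #   Disallow: /links/
    #
    # and you can test the rule locally before uploading it:
    from urllib.robotparser import RobotFileParser

    rules = RobotFileParser()
    rules.parse([
        "User-agent: *",
        "Disallow: /links/",   # assumption: /links/ is the deleted directory
    ])

    print(rules.can_fetch("Googlebot", "http://www.example.com/links/page1.html"))  # False
    print(rules.can_fetch("Googlebot", "http://www.example.com/index.html"))        # True
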
12:18 am on Oct 4, 2006 (gmt 0)

10+ Year Member



Yes, they get a 404 every time for the deleted directory and files.

Now I have three ways to deal with the issue:

1) Find the links to the deleted directory and files and ask the webmasters to remove those links. This takes a lot of time.

2) Block the deleted directory and files in robots.txt. But does that have any side effects? The robots may wonder why they aren't allowed to spider those pages while links to that directory still exist.

3) Do nothing and let the robots keep getting the 404s. But does that have any side effects? Will it affect the whole site's ranking?

Please advise which way is best.

12:37 am on Oct 4, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Just let all requests for URLs in that directory be 404 and leave it alone. No side effects that I can think of, just some lines in your server error log.

But here's another thought: if there are a lot of links to pages in that directory, then maybe you want to place some different content at those URLs instead of just letting them all be 404. Last year I took over a site that (years before) had removed a heavily backlinked URL. As soon as I noticed this, I put new content at that URL and it showed up at #9 for a one-word keyword search after just a couple of days.
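
If you go that route, your access log can tell you which of those old URLs still get requested most often, which is usually a decent proxy for which ones still have links pointing at them. A rough sketch, assuming an Apache-style combined log format and /links/ as the old directory (both are assumptions):

    from collections import Counter

    OLD_DIR = "/links/"      # assumption: the deleted directory
    LOGFILE = "access.log"   # assumption: Apache-style combined log format

    hits = Counter()
    with open(LOGFILE) as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 3:
                continue
            request = parts[1].split()   # e.g. ['GET', '/links/page1.html', 'HTTP/1.1']
            status = parts[2].split()
            if len(request) > 1 and status and request[1].startswith(OLD_DIR) and status[0] == "404":
                hits[request[1]] += 1

    # the most-requested dead URLs are the best candidates for new content
    for path, count in hits.most_common(20):
        print(count, path)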

2:24 am on Oct 4, 2006 (gmt 0)



It could be that someone still links to that page.
 
