Why does Googlebot keep spidering a directory that no longer exists?
We used to have a links directory, which we deleted about seven months ago, but Googlebot is still spidering that directory to this day.
Does this behavior have any side effects? Could it be the reason our pages keep dropping in the SERPs for many keywords? Should I block Googlebot from spidering that nonexistent directory in robots.txt?
There are hundreds of pages listed in google that haven't existed for at least a month. They are not listed in my sitemap.txt and I am 99.999% certain that there are no links to them from any other pages on my site.
I am trying to recover my hits, which have recently plummeted to almost zero after about 10 good years.
This is not a problem.
A problem occurs if they start showing and ranking that page in the normal index again.
They do often show 404 pages in the Supplemental index for a year after the content is removed.
Continue to serve a 404 for that URL.
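One thing worth verifying is that the removed URLs really answer with a 404 status code, not a "soft 404" (an error page served with status 200, which Google treats as a live page). A minimal sketch of that check, using a throwaway local server as a stand-in for your site (the `/links/old-page.html` path is just a placeholder; against production you would request your real hostname instead):

```python
# Sketch: confirm a removed URL returns a real HTTP 404 status, not a
# 200 "soft 404". A tiny local server stands in for the real site here.
import http.server
import threading
import urllib.request
import urllib.error

class RemovedDirHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Everything under the deleted directory gets a genuine 404 status.
        self.send_error(404, "Not Found")

    def log_message(self, *args):
        pass  # keep request logging quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RemovedDirHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/links/old-page.html")
    status = 200  # no exception means the "deleted" page still serves 200
except urllib.error.HTTPError as e:
    status = e.code  # 404 here means crawlers see the page as gone

server.shutdown()
```

If `status` comes back 200 for a page you deleted, your server is masking the error and Google will keep the URL indexed.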
Now I see three ways to deal with this:
1) Find the links pointing to the deleted directory and files and ask those webmasters to remove them, but this would take a lot of time.
2) Block the deleted directory and files in robots.txt. But are there any side effects? The robots may wonder why they aren't allowed to spider those pages while links to the directory still exist.
3) Do nothing and let the robots get 404s. But does that have any side effects? Will it affect the whole site's ranking?
Please kindly advise which way is best.
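For reference, option 2 would only take a couple of lines in robots.txt. This is just a sketch, assuming the deleted directory lived at `/links/` (a placeholder; substitute the real path):

```
User-agent: Googlebot
Disallow: /links/
```

Note that a Disallow rule stops crawling, not indexing: URLs that are blocked but still linked from elsewhere can remain in the index as URL-only entries.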
But here's another thought: if there are a lot of links to pages in that directory, then maybe you want to place some different content at those URLs instead of just letting them all be 404. Last year I took over a site that (years before) had removed a heavily backlinked URL. As soon as I noticed this, I put new content at that URL and it showed up at #9 for a one-word keyword search after just a couple of days.
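A related option, if the replacement content lives at a new address rather than the old one, is a permanent redirect so the old backlinks pass through. A hypothetical Apache sketch (assuming mod_alias is enabled; both paths are placeholders):

```
# .htaccess sketch: permanently redirect a heavily backlinked old URL
# to its replacement page so inbound links are not wasted on a 404.
Redirect 301 /links/popular-page.html /new-section/popular-page.html
```

A 301 tells crawlers the move is permanent, whereas reusing the exact old URL, as described above, needs no redirect at all.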