Forum Moderators: open
If you ever mistype a URL in a link, Google will index that typo but will never be able to follow it. A Google search for site:.pdf turns up thousands of links that Google has seen but could not then access (most of them have a trailing / after the filename that should not be there). It is also almost impossible to work out which site such a typo is actually on, so that the owner can be notified and the link corrected. That is why you should run a link checker over your whole site every few weeks, or after any major navigation or content update.
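A rough Python sketch of the kind of check a link checker performs — here just scanning one page's HTML for the trailing-slash typo described above. The function names and the list of file extensions are my own choices, not from any particular tool:

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_suspect_links(html, extensions=(".pdf", ".html", ".htm")):
    """Return hrefs that have a stray trailing slash after a file extension."""
    parser = LinkExtractor()
    parser.feed(html)
    return [
        link
        for link in parser.links
        if link.endswith("/")
        and any(link.rstrip("/").lower().endswith(ext) for ext in extensions)
    ]


page = '<a href="/docs/report.pdf/">report</a> <a href="/docs/ok.pdf">ok</a>'
print(find_suspect_links(page))  # ['/docs/report.pdf/']
```

A real link checker would also fetch each URL and report the response codes; this only shows the pattern-matching step.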
Also, one of the major backbone providers had some serious problems with their network late at night about a week ago; it is possible that this caused some problems for Googlebot.
Should the home page be named index.html? I've never heard of this being a problem.
--
I've had the same no title and no description thing for a very long time. It has nothing to do with Google not visiting those pages yet. All of my pages were indexed before this problem occurred.
I honestly think that it is a Google problem. Google sent me an email saying it is because the pages are only partially indexed.
What is the use of partially indexed pages? None, other than saving bandwidth and being able to claim an exaggerated number of indexed pages.
I never click on pages with no description or title. They are essentially useless pages. Nobody wants to see a link in their search results; they want to see a Title and Description. That's why I don't buy the partial indexing excuse that Google gave me.
Thus, I think these are the most likely possibilities:
1. Google is broken
2. Google is trying to win the largest-index race by providing useless results from partial indexing
3. Google has some really dumb geniuses working for them who think a link without a Title and Description is useful
2004-06-06 03:36:38 64.68.82.184 W3SVC98951 /robots.txt - 404 4184 0 Googlebot/2.1+(+http://www.googlebot.com/bot.html)
2004-06-06 03:36:39 64.68.82.184 W3SVC98951 /index.html - 200 19262 485 Googlebot/2.1+(+http://www.googlebot.com/bot.html)
I have uploaded a blank robots.txt file.
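For what it's worth, a blank robots.txt and this minimal explicit version behave the same way — both allow all crawlers to fetch everything, and either one stops the 404 that shows up in the log line above:

```
User-agent: *
Disallow:
```

The 404 itself is harmless (Googlebot treats a missing robots.txt as "crawl everything"), but serving the file keeps the error out of your logs.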
This really is strange - most unlike Google!
Kiwi seems to be saying that from looking at their logs, Googlebot has not been around as much as they would normally expect.
If memory serves me correctly, the Internet backbone problem happened a couple of days after Googlebot last showed up in their logs. I'm wondering if Googlebot tried to spider their site during this outage and wasn't able to reach it.
I would also keep a close eye on the hosting and make sure it is staying up.
In any case, the fact that Googlebot seems to be AWOL in their server logs at the same time that their listing details disappeared would seem to confirm the theory that Googlebot was unable to successfully access their web page.
I would just wait for it to come by again, and your listing should be back up soon thereafter.
Has anyone had experiences of this where their site has come back to normal and how long did it take?
Yes, I experienced this recently. There was a programming error on my site that was only triggered when a non-Windows client, such as Googlebot, visited the site. Google showed the error code it got for a week or two, then my site vanished from the SERPs. I fixed the error, and it took about two weeks to get back in.
Check your page with something like this site's HTTP viewer [seotoolkit.co.uk]; it shows you, more or less, what Googlebot sees when it visits your site.
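If you want to reproduce that check yourself, here is a rough Python sketch. The user-agent string is copied from the log lines quoted earlier in the thread; the function name and everything else is my own naming:

```python
import urllib.request

# User-agent string as it appears in the Googlebot log entries above;
# substitute the current string if Google has changed it.
GOOGLEBOT_UA = "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"


def googlebot_request(url):
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})


# Usage (requires network): fetch your own page as Googlebot would, and
# compare the status code and body size with what a normal browser gets.
# with urllib.request.urlopen(googlebot_request("http://example.com/")) as resp:
#     print(resp.status, len(resp.read()))
```

Note this only changes the user-agent header; it won't catch problems that depend on Googlebot's IP range or crawl timing.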
Or your site could have been down, as others suggested, which I assume would have the same result, minus the error codes.