Forum Moderators: Robert Charlton & goodroi


Big Daddy exonerated for my dropped URLs -- maybe

Looks like googlebot "misread" my robots.txt on April 11

         

montefin

11:17 pm on Apr 30, 2006 (gmt 0)

10+ Year Member



Since April 11, I've been puzzling over the disappearance of one of my highest trafficked - and highest earning - URLs from Google SERPs.

Okay, Google referrals have been down generally of late for me, and I've been accepting it as Big Daddy flux. But, why did that particular URL vanish completely from the SERPs?

Today, I was familiarizing myself with Google's Sitemaps diagnostic tools in preparation for submitting a sitemap, when I clicked on a link for "URLs restricted by robots.txt".

B-I-N-G-O -- up popped my missing URL and 2 others.

Below that link was a link for "robots.txt analysis", so I clicked that, too.

It brought up their cached copy of my robots.txt:

User-agent: *
Disallow:

I entered the suspect URLs in the space provided and clicked the "Check" button.

They all showed "Allowed".

So I checked the Robot META tags on those files:

<META NAME="robots" CONTENT="index,follow">

That looked okay.

Has anyone else experienced anything like that? And if so, what steps did you take to correct the situation?

And, of course, does anyone see anything in my robots.txt or Robot META tag that may have caused it?
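For what it's worth, you can sanity-check a robots.txt locally without waiting on Google's tool. A minimal sketch using Python's standard-library parser, with the exact two-line robots.txt from above (example.com is just a placeholder domain):

```python
# Sketch: verify locally what a robots.txt actually permits.
# The robots.txt body mirrors the one cached by Google in this thread;
# "example.com" and the sample paths are placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow value blocks nothing, so every URL is allowed.
for url in ["http://example.com/", "http://example.com/some/page.html"]:
    verdict = "Allowed" if parser.can_fetch("Googlebot", url) else "Disallowed"
    print(url, "->", verdict)
```

If this says "Allowed" for the URLs that Google's tool flagged as restricted, the problem is on Google's side (stale cache or a reporting bug), not in the file.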

tedster

11:24 pm on Apr 30, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



B-I-N-G-O -- up popped my missing URL and 2 others.

OK, so it's not Big Daddy per se, it's Sitemaps?

User-agent: *
Disallow:

This sounds like a very strange Sitemaps error. Clearly that robots.txt allows everything to be spidered -- unless perhaps there is some dynamic creation of robots.txt going on and you are not seeing what googlebot sees.
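One quick way to test that dynamic-serving theory is to request robots.txt twice, once identifying as a browser and once as googlebot, and diff the two bodies. A rough sketch (example.com is a placeholder; a server that cloaks by IP rather than User-Agent would still fool this check):

```python
# Sketch: build robots.txt requests that identify as different agents,
# so you can compare what the server returns to each.
# "example.com" is a placeholder; point this at your own site.
import urllib.request

BROWSER_UA = "Mozilla/5.0"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def robots_request(url: str, user_agent: str) -> urllib.request.Request:
    """Return a Request for the given URL carrying the given User-Agent."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

url = "http://example.com/robots.txt"
as_browser = robots_request(url, BROWSER_UA)
as_googlebot = robots_request(url, GOOGLEBOT_UA)

# urllib stores header keys capitalized, hence "User-agent" here.
print(as_googlebot.get_header("User-agent"))

# To run the actual comparison, fetch both and diff the bodies:
#   body = urllib.request.urlopen(as_googlebot).read().decode()
# If the two bodies differ, the server is serving googlebot a
# different robots.txt than it serves everyone else.
```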

montefin

11:46 pm on Apr 30, 2006 (gmt 0)

10+ Year Member



tedster wrote:

"OK, so it's not Big Daddy per se, it's Sitemaps?"

Not so sure of that. I have not submitted a sitemap yet. It could be something else entirely.

I'm just grateful that the sitemap diagnostic tool solved my puzzle. But now, how do I correct the problem?

And, my robots.txt file is a plain old text file.

montefin

2:05 am on May 1, 2006 (gmt 0)

10+ Year Member



Just found this on Google's sitemap blog:

"Updated robots.txt status
4/26/2006 03:37:00 PM

Posted by Vanessa Fox, Google Engineering

Thanks to our users for alerting us to an issue with incorrectly reporting that sites and Sitemaps were being blocked by robots.txt files. We have resolved this issue. If you were unable to add a site or Sitemap because of this issue, you should now be able to add them.

If Sitemaps was reporting that your home page was blocked by robots.txt, you should soon see an updated status. Thanks for your patience as we refresh the display of this data.

If we were incorrectly reporting that your robots.txt file was blocking your Sitemap, you will see an updated status the next time we download your Sitemap.

Thanks as always for your valuable feedback."

texasville

6:52 pm on May 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This thread prompted me to go ahead and try the G Sitemaps route, and I found no errors. But G is still not indexing all my pages -- only about half. I guess we'll see if uploading them their own little custom sitemap will improve things, but I doubt it. I already have two sitemaps in different configurations on my site.