Forum Moderators: Robert Charlton & goodroi


Google not indexing pages even though Sitemap is working

         

sparklingway

3:05 am on May 3, 2009 (gmt 0)

10+ Year Member



The sitemap is working fine and has submitted 25 URLs to Google. Some of them, of course, refer to categories, but there are 8 posts as well. The last post was a week ago, and Google is not showing them even with the site: operator.

How can this happen? If the sitemap has submitted the pages to Google, how can they not appear in the results? The sitemap submitted 25 URLs, but only 8 pages are listed, and the number is not increasing.

Should I contact Google support?

tedster

3:30 am on May 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What exactly is happening in your case may have several possible causes, but one may be that Google does not add every page it spiders to the index. This is true for the smallest site through the largest, even those with a PR9 or PR10 home page.

Are you getting feedback about your sitemap URLs from Webmaster Tools? And if so, does that report agree with the Googlebot spidering entries you see in your server logs?

sparklingway

11:59 am on May 4, 2009 (gmt 0)

10+ Year Member



Googlebot is not seeing any posts

tedster

5:39 pm on May 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How many backlinks point to your site - to any URL at all? If that number is low, it can restrict how much of a site Google actually indexes.

Googlebot is not seeing any posts

Do you mean there is not even a request from Googlebot in your server logs? Or that the response your server gives is something other than "200 OK"?
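To illustrate the second half of that question: you can fetch a URL yourself and look at the status code the server returns, which is roughly what Googlebot sees. A minimal Python sketch (the local test server here is only a stand-in for illustration; in practice you would point `url` at one of your own post URLs):

```python
import http.server
import threading
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Stand-in for a real site: always answer 200 OK
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html>ok</html>")

    def log_message(self, *args):
        # Silence per-request console logging
        pass

# Bind to an ephemeral port on localhost and serve in the background
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Replace this with one of your own post URLs when diagnosing a real site
url = f"http://127.0.0.1:{server.server_address[1]}/post-1"
with urllib.request.urlopen(url) as resp:
    print(resp.status)  # anything other than 200 here would explain missing pages

server.shutdown()
```

If a page that should be indexed returns a 404, 500, or a redirect instead of 200, that is the first thing to fix.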

sparklingway

7:20 am on May 5, 2009 (gmt 0)

10+ Year Member



"The data is not available". This is what the Googlebot is seeing

The pbolem is, from my webmaster page:

"Sitemap stats
Total URLs: 15
Indexed URLs: 4"

new posts aren't being indexed even though they are being submitted

Google Snalytics seems to have stopped working as well. The plugin is working. The status is fine from the Analytics dashboard. Yet, time appears to have stopped on April 27, 2009. No updates from Analytics from that day.

No URLs have been indexed after that day as well. What could possibly have happened?t

AnkitMaheshwari

7:30 am on May 5, 2009 (gmt 0)

10+ Year Member



Check your robots.txt file.

sparklingway

7:40 am on May 5, 2009 (gmt 0)

10+ Year Member



Robots.txt

User-agent: *
Disallow:

[edited by: tedster at 8:04 am (utc) on May 5, 2009]
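For reference, a quick way to confirm that rules like the ones above don't block crawlers is Python's standard-library `urllib.robotparser` (a sketch; the rules are parsed from a local string here rather than fetched from a live site):

```python
import urllib.robotparser

# The same rules posted above: an empty Disallow line blocks nothing
rules = """\
User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# True: Googlebot is allowed to fetch any URL under these rules
print(rp.can_fetch("Googlebot", "http://example.com/any-post"))
```

So this robots.txt is not the culprit; it allows every user agent to fetch everything.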

chakde

9:37 am on May 5, 2009 (gmt 0)

10+ Year Member



Hi,

I am new, and this might be a stupid question.

Can someone explain how we can check whether Googlebot is visiting the server?

"does that report agree with the googlebot spidering entries you see in your server logs?"

Thanks in advance

tedster

10:07 am on May 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You either need to access your raw server logs and run your own analysis, or use a log analysis tool that shows you Googlebot's visits (and Slurp, msnbot, askbot, etc.). If server logging is set up properly, you can see all the requests and how your server responded. This can be essential for debugging some problems.
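A small sketch of that "run your own analysis" route, assuming the common Apache "combined" log format (the sample lines below are made up for illustration): pull out the requests whose User-Agent mentions Googlebot and show the path and status code.

```python
import re

# Made-up sample in Apache combined log format; in practice,
# read these lines from your real access log file instead.
sample_log = """\
66.249.66.1 - - [05/May/2009:07:20:01 +0000] "GET /post-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
192.0.2.10 - - [05/May/2009:07:21:13 +0000] "GET /post-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
66.249.66.1 - - [05/May/2009:07:25:42 +0000] "GET /post-2 HTTP/1.1" 404 320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Capture the request path, the status code, and the trailing User-Agent field
pattern = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3}).*"([^"]*)"$')

hits = []
for line in sample_log.splitlines():
    m = pattern.search(line)
    if m and "Googlebot" in m.group(3):
        hits.append((m.group(1), m.group(2)))
        print(m.group(1), m.group(2))
```

Run against a real log, this shows at a glance which URLs Googlebot actually requested and whether it got a 200, a 404, or something else for each.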

rustybrick

10:55 am on May 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How new is your site?

sparklingway

11:19 am on May 5, 2009 (gmt 0)

10+ Year Member



It's around three weeks old now.
This has never happened with any of my other sites.

tangor

12:12 pm on May 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Three weeks might not be enough time. It sounds like a blog/forum/CMS site (posts). Is there a possible problem with duplicate content? That's another reason you need to examine your server logs.