The site, BTW, is a small static site of 300+ pages.
I'm not sure how one could prove that a sitemap made crawling better or worse; I figure it can only make it better. In fact, generating a sitemap and watching the stats Google gives you about search keyword hits and misses and about 404s is, on its own, valuable information that makes a sitemap worth building.
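For a small static site, generating the sitemap can be a few lines of script. Here's a minimal sketch, assuming pages are .html files under a local document root that map one-to-one to URLs; the paths, the domain, and the protocol namespace are all assumptions to adapt to your own setup:

```python
# Minimal sitemap generator for a small static site.
# Assumptions (hypothetical): pages are .html files under SITE_ROOT
# and map one-to-one to URLs under BASE_URL.
import os
from datetime import datetime, timezone

SITE_ROOT = "/var/www/example"       # hypothetical document root
BASE_URL = "http://www.example.com"  # hypothetical site URL

def build_sitemap(root, base_url):
    entries = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".html"):
                continue
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root).replace(os.sep, "/")
            # Use the file's modification time as <lastmod>.
            lastmod = datetime.fromtimestamp(
                os.path.getmtime(path), tz=timezone.utc
            ).strftime("%Y-%m-%d")
            entries.append(
                "  <url>\n"
                f"    <loc>{base_url}/{rel}</loc>\n"
                f"    <lastmod>{lastmod}</lastmod>\n"
                "  </url>"
            )
    # Namespace below is the standard sitemap protocol one; check which
    # version Google's docs specify for your submission.
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

if __name__ == "__main__":
    with open(os.path.join(SITE_ROOT, "sitemap.xml"), "w") as f:
        f.write(build_sitemap(SITE_ROOT, BASE_URL))
```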
I launched a mini-site (10 pages) on Dec 14th. It was crawled on Dec 24th and Jan 1st. It is now about 20 pages.
Googlebot visits twice daily, at precisely 1:02am and 1:03pm, but it only looks at the map file during those visits. It visits other pages on a random basis too.
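For anyone who wants to verify this kind of pattern on their own site, the fetch times come straight out of the access log. A quick sketch, assuming an Apache-style combined log; the log path and regex are assumptions to adjust for your server:

```python
# Pull Googlebot's fetch times for sitemap.xml out of an Apache-style
# combined access log, to see how often the map file is read.
import re

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical log location
LINE_RE = re.compile(r'^\S+ \S+ \S+ \[([^\]]+)\] "GET (\S+)')

with open(LOG_PATH) as log:
    for line in log:
        # Crude filter: Googlebot identifies itself in the user-agent.
        if "Googlebot" not in line:
            continue
        m = LINE_RE.match(line)
        if m and m.group(2).endswith("sitemap.xml"):
            print(m.group(1))  # timestamp of each sitemap fetch
```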
I applied Google Sitemaps to an older site because Google search refused to find 16 pages that I added. The sitemap addition didn't help.
It crossed my mind that Google just wants to know who owns which sites, because the tools in the Sitemaps interface didn't help me much after I went to the trouble of setting it all up.
So, yes, you are left wondering: is there something in it for me? My gut feeling is yes, but it is difficult for me to quantify at this time.
[google.com...]
You must initially add your Sitemap to Google Sitemaps using your Google Account. When your Sitemap changes, you can resubmit it to Google to let us know. You can resubmit your Sitemap in one of two ways:
# sign into Google Sitemaps with your Google Account and from the Sitemaps tab, select the checkbox for the Sitemap and click Resubmit Selected.
# send Google an HTTP request
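The HTTP-request option is just a GET to Google's ping URL with your sitemap's location URL-encoded into the query string. A minimal sketch, assuming the ping endpoint form from the Google Sitemaps docs (verify against the current documentation) and a hypothetical sitemap URL:

```python
# Resubmit a sitemap by sending Google an HTTP GET "ping".
# SITEMAP_URL is hypothetical; the endpoint follows the form given in
# the Google Sitemaps docs, so check the current docs before relying on it.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical
PING_URL = (
    "http://www.google.com/webmasters/sitemaps/ping?sitemap="
    + urllib.parse.quote(SITEMAP_URL, safe="")
)

def ping_google():
    with urllib.request.urlopen(PING_URL) as resp:
        # A 200 response means the ping was received, not that the
        # sitemap has been fetched or accepted.
        print(resp.status, resp.reason)

if __name__ == "__main__":
    ping_google()
```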
It seems silly to do this considering Google keeps reading the sitemap.xml file constantly, but I'll have to start being more thorough with resubmitting.
I'm having the same problem with new pages taking forever to be indexed, and it could be that simply not resubmitting is the problem.
Next question: Is there an excessive resubmission penalty?
I'd rather just "ping" daily whether the sitemap.xml file has changed or not.
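A daily unconditional ping is easy to wire up as a scheduled job (cron or similar). Here's a sketch that can do either, ping every day regardless or only when sitemap.xml has actually changed; all paths and URLs are hypothetical:

```python
# Daily job (e.g. run from cron): ping Google about sitemap.xml either
# unconditionally, or only when the file changed since the last ping.
# All paths and URLs are hypothetical; the ping endpoint is the same
# one sketched above.
import os
import urllib.parse
import urllib.request

SITEMAP_PATH = "/var/www/example/sitemap.xml"        # hypothetical
STAMP_PATH = "/var/tmp/sitemap.last_ping"            # hypothetical
SITEMAP_URL = "http://www.example.com/sitemap.xml"   # hypothetical
ALWAYS_PING = True  # ping daily whether the file changed or not

def changed_since_last_ping():
    if not os.path.exists(STAMP_PATH):
        return True
    return os.path.getmtime(SITEMAP_PATH) > os.path.getmtime(STAMP_PATH)

if __name__ == "__main__":
    if ALWAYS_PING or changed_since_last_ping():
        url = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
               + urllib.parse.quote(SITEMAP_URL, safe=""))
        urllib.request.urlopen(url).close()
        open(STAMP_PATH, "w").close()  # record when we last pinged
```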
On one six-month-old, 1000-page site that I added it to not long ago, I noticed a slight, immediate improvement in time-to-index for new content. But now that the site is well established, I'm not too sure what good it's doing.
I think for a new site it's an outstanding tool and also for a site that is older but not well indexed. If you're totally satisfied with Google's indexing of your site then there's no real reason to use Google Sitemaps... Yet.
I have, however, found a couple of errors through the stats it shows, but not many, and I suppose I'd have found them in my logs anyway. So, all in all, I found it to be a waste of time and never bothered adding it to any of my other sites.
I don't follow the Google update threads too well; too much noise for me. But it looks like there is a BigDaddy data center at [66.249.93.104...]. I searched on site:www.mydomain.com there, and my sites are lighting up like Christmas trees.
(I have been wondering what [66.249.93.104...] was for the last two weeks - it was turning up in my logs)
The new site I mentioned earlier has all pages included.
The old screwed-up site has all pages indexed too!
My oldest widget site lost some ground in the serps, and I think I know why.
My newer widget site (3000 pages) had only its first page indexed by Google; now it has 400 and is actually showing up in the SERPs (wow). I had almost written this site off for Google.
So, if the above data center is the new standard, I would be extremely hesitant not to use Google Sitemaps.
Of course, the recent improvement may be due more to algo elements than to the sitemap tools, but am I going to take that chance? I don't think so.
(I have been wondering what [66.249.93.104...] was for the last two weeks - it was turning up in my logs)
I can't edit my previous post for some reason, but FWIW my above statement is incorrect. Those were the referring IPs.
They reference "Google English" just like the above DC. Does anybody know what Google English is?
From Webalizer Referrals:
[64.233.167.104...]
[72.14.203.104...]
[66.102.7.104...]
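One way to check whose addresses these are is a reverse DNS lookup on each. A quick sketch using the IPs listed above:

```python
# Reverse-resolve the referring IPs to see what hostnames they carry.
import socket

for ip in ["64.233.167.104", "72.14.203.104", "66.102.7.104"]:
    try:
        print(ip, socket.gethostbyaddr(ip)[0])
    except socket.herror:
        print(ip, "(no PTR record)")
```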