| 5:51 am on Jan 4, 2006 (gmt 0)|
I'm in the same boat... before I added a sitemap, new pages would be cached and indexed within 2-3 days. I added new pages just before Xmas, and to date, no dice...
| 6:59 am on Jan 4, 2006 (gmt 0)|
Cheers phish, I thought it was just me. It's odd - I've never had problems in the past getting new pages crawled, but lately it seems like hard work. I'm going to try moving the pages up a bit so they are only 2 clicks from the index rather than the 4 they currently are. The daft thing is that in the past I could just rely on my own SM and they would normally get crawled within a few days; now, nothing. I think I'm going to knock this SM on the head.
| 11:46 am on Jan 4, 2006 (gmt 0)|
I'm finding that Google seems to be checking for a new sitemap more often than I put up a new page - which is only once a day. I don't know if it has affected the crawl rate, but I can at least be sure now that it has crawled the whole site.
Not sure how one could prove that a sitemap made crawling better or worse; I figure it can only make it better. In fact, generating a sitemap and watching the stats that G gives you - search keyword hits and misses, 404s - is on its own valuable information that makes a sitemap worth building.
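For anyone building one by hand: the sitemap file itself is just XML, and a small script is enough for a static site. A sketch along these lines - the URLs and dates are placeholders, and a real generator would walk the filesystem or a database:

```python
# Minimal generator for a Sitemap-0.84-style XML file (a sketch; the
# page list below is made up for illustration).
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML for an iterable of (loc, lastmod) pairs."""
    entries = []
    for loc, lastmod in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(loc)}</loc>\n"
            f"    <lastmod>{lastmod}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

pages = [
    ("http://www.example.com/", "2006-01-04"),
    ("http://www.example.com/widgets.html", "2006-01-03"),
]
print(build_sitemap(pages))
```

Upload the output as sitemap.xml in the site root and point Google Sitemaps at it.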
| 12:36 pm on Jan 4, 2006 (gmt 0)|
I think the sitemap makes a difference only when a site is dynamic in nature and session IDs are screwing up Googlebot; otherwise it seems to be of little help.
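One reason a sitemap can help such a site is that you control which URLs get listed, so you can strip the session parameter and give Googlebot one clean address per page. A sketch - the parameter name "sessionid" is just an example; yours may differ:

```python
# Strip a session-ID query parameter from URLs before they go into the
# sitemap, so the crawler sees one canonical address per page.
# The parameter name "sessionid" is hypothetical.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url, param="sessionid"):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(strip_session_id("http://www.example.com/page.asp?id=7&sessionid=ABC123"))
# -> http://www.example.com/page.asp?id=7
```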
| 5:39 am on Jan 5, 2006 (gmt 0)|
I have a client that has a dyno site and we tried the sitemap option.
Nothing positive or negative to report, no change within the last month.
I thought we would see something different but so far it is a big yawn.
| 6:14 am on Jan 5, 2006 (gmt 0)|
I'm not sure if it's worth it. I would like to think it is.
I launched a mini-site (10 pages) on Dec 14th. It was crawled on Dec 24th and Jan 1st. It is now about 20 pages.
Googlebot visits twice daily, at precisely 1:02am and 1:03pm, but it only looks at the map file during those visits. It visits other pages on a random basis too.
I applied Google sitemaps to an older site, because Google Search refuses to find 16 pages that I added. The sitemap addition didn't help.
It crossed my mind that Google just wants to know who owns which sites, because the tools in the sitemaps interface did not help me that much, after I went to the trouble of setting it all up.
So, yes, you are left wondering: is there something in it for me? My gut feel is yes, but it is difficult for me to quantify at this time.
| 8:49 am on Jan 5, 2006 (gmt 0)|
As it was free I wanted to give it a shot, but TBH I'm not doing it again, as I really think it's a waste for static sites.
Thanks for the feedback.
| 10:49 am on Jan 5, 2006 (gmt 0)|
I'm working on the premise that it will help get new sites up and running quicker. Might be wrong but at the moment I can't think of any other use
| 11:24 am on Jan 5, 2006 (gmt 0)|
I also have mixed feelings about sitemaps.
However, I have noticed something strange: I have a standard navigation bar on the left-hand side of my page. Because the bar is shown across the site, in the root and in subfolders, all links are www.domain.co.uk/page.asp and so on. I have noticed in my errors list that G appears to be taking no notice of the www.domain.co.uk part and is just looking for /page.asp! Why is this?
Am I doing something wrong?
| 11:31 am on Jan 5, 2006 (gmt 0)|
Sorry Steve, can't help - I only work with static sites. Hopefully one of the other members will be able to help you.
| 4:01 pm on Jan 5, 2006 (gmt 0)|
How many here are resubmitting their sitemap.xml file every time they change it?
You must initially add your Sitemap to Google Sitemaps using your Google Account. When your Sitemap changes, you can resubmit it to Google to let us know. You can resubmit your Sitemap in one of two ways:
# Sign in to Google Sitemaps with your Google Account and, from the Sitemaps tab, select the checkbox for the Sitemap and click Resubmit Selected.
# Send Google an HTTP request.
It seems silly to do this considering Google keeps reading the sitemap.xml file constantly, but I'll have to start being more thorough with resubmitting.
I'm having the same problem with new pages taking forever to be indexed, and it could simply be that not resubmitting is the problem.
Next question: Is there an excessive resubmission penalty?
I'd rather just "ping" daily whether the sitemap.xml file has changed or not.
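The HTTP-request option can be scripted for exactly that kind of daily ping. A sketch - the endpoint is the one Google's help pages describe, but check the current docs, and the sitemap address below is a placeholder for your own:

```python
# Build the Google Sitemaps "ping" URL for a daily cron job.
# The sitemap address is a placeholder; swap in your own.
from urllib.parse import quote
# from urllib.request import urlopen  # only needed to actually send the ping

def ping_url(sitemap_url):
    return ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
            + quote(sitemap_url, safe=""))

url = ping_url("http://www.example.com/sitemap.xml")
print(url)
# urlopen(url)  # fire the notification
```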
| 6:26 pm on Jan 5, 2006 (gmt 0)|
I know that initially it helped me out on one site in particular. However, one thing that I'm not sure any of us are keeping in mind is that Google uses data differently on an ever-changing basis.
The six-month-old, 1000-page site that I added it to not too long ago saw a slight, immediate improvement in time-to-index for new content. But now that the site is well established, I'm not too sure what good it's doing.
I think for a new site it's an outstanding tool and also for a site that is older but not well indexed. If you're totally satisfied with Google's indexing of your site then there's no real reason to use Google Sitemaps... Yet.
| 6:27 pm on Jan 5, 2006 (gmt 0)|
I added a sitemap last September and have seen zero difference in any aspect of getting new pages indexed.
I have, however, found a couple of errors through the stats it shows - but not many, and I suppose I'd have found them in my logs anyway. So all in all I found it to be a waste of time and never bothered adding it to any other sites I have.
| 6:33 pm on Jan 5, 2006 (gmt 0)|
djmick200 did point out something that I found very helpful.
When creating the sitemap I found a few troublesome errors that I was able to correct as well as some new things I should add to my robots.txt file.
It was at least useful for that in my opinion.
| 6:37 pm on Jan 5, 2006 (gmt 0)|
For me, sitemap is a great tool.
Google, although it can, often abandons crawling threads after a few messages on my forums.
With the sitemap, all of it is picked up, at the appropriate time.
| 9:17 pm on Jan 5, 2006 (gmt 0)|
Folks - in my last post I said I was not able to quantify improvements; well, something has changed since then.
I don't follow the Google update threads too well - too much noise for me. But it looks like there is a BigDaddy data center, and on [188.8.131.52...] I searched site:www.mydomain.com and my sites are lighting up like Christmas trees.
(I have been wondering what [184.108.40.206...] was for the last two weeks - it was turning up in my logs)
The new site I mentioned earlier has all pages included.
The old screwed up site has all pages indexed too!
My oldest widget site lost some ground in the SERPs, and I think I know why.
My newer widget site (3000 pages) had only the first page indexed by Google; now it has 400, and it is actually showing up in the SERPs (wow). I had almost written this site off for Google.
So, if the above datacenter is the new standard, I would be extremely hesitant not to use Google Sitemaps.
Of course, the recent improvement may be due more to algo elements than sitemap tools, but am I going to take that chance - I don't think so.
| 10:11 pm on Jan 5, 2006 (gmt 0)|
|(I have been wondering what [220.127.116.11...] was for the last two weeks - it was turning up in my logs) |
I can't edit my previous post for some reason. But FWIW my above statement is incorrect. These were the referring IPs.
They reference "Google English" just like the above DC. Does anybody know what Google English is?
From Webalizer Referrals:
| 11:17 pm on Jan 5, 2006 (gmt 0)|
I too have noticed slow indexing of pages after putting a sitemap on one of my websites.
Should I stop using it?
The only disadvantage will be that I will no longer be able to check the error pages that Googlebot shows in Sitemaps.
What do you suggest?