Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 38-message thread spans 2 pages.
Has using a Google XML Sitemap helped you?
aaronjf

10+ Year Member



 
Msg#: 33664 posted 12:34 am on Mar 29, 2006 (gmt 0)

I am in the final phases of development/testing of a new cart/CMS for our site. Among other "new" features, I am adding an XML Sitemap for Google. Has anyone tried this, and has it really helped Google index your dynamic site better or faster?

As we have been on a static system for the last 4 years and have a 3,500+ page commerce site, I am a little concerned about the transition to a dynamic system. So I am trying to do everything I can to make it as painless as possible.
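
(For reference, a Google XML Sitemap is just a list of URLs in a small XML vocabulary. A minimal sketch is below; the URLs are placeholders, and the schema namespace shown is Google's 0.84 version from this era — check the current Sitemaps documentation for whichever version applies. Note that ampersands in dynamic URLs must be escaped as &amp;amp; to keep the file valid XML.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-03-28</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- dynamic URL: the query string's & would be written as &amp; -->
    <loc>http://www.example.com/item.php?id=42</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```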

 

The_Tank

5+ Year Member



 
Msg#: 33664 posted 5:55 pm on Mar 31, 2006 (gmt 0)

We have used the Google sitemap for about 6 months - our traffic has recently crashed - [webmasterworld.com...] - so I cannot say if it helped traffic or not. One thing I did find is that, in essence, the Google sitemap acted like a great big validation device and told me whether the site was generating bad URLs in strange locations - which is possible with large dynamic sites. As a result we implemented some new URL-rewrite technology and tidied up some of our code.

So at the very least it's a good validation tool for your site.

aaronjf

10+ Year Member



 
Msg#: 33664 posted 6:15 pm on Mar 31, 2006 (gmt 0)

Man, is it just you and me around here, or what? It doesn't seem like many people have implemented this. I would have thought everyone would have jumped on this bandwagon.

That is interesting about using it as a validation tool. Never thought of that.

Wally_Books

5+ Year Member



 
Msg#: 33664 posted 12:01 am on Apr 1, 2006 (gmt 0)

I've only been uploading a sitemap for 6 weeks and it has not helped at all. Google looks but no cigar.

cgchris99

10+ Year Member



 
Msg#: 33664 posted 4:21 am on Apr 1, 2006 (gmt 0)

I too have a Google XML sitemap on my site. The site has totally tanked since BD. Most of the indexed pages are supplemental.

aaronjf

10+ Year Member



 
Msg#: 33664 posted 6:18 am on Apr 1, 2006 (gmt 0)

The site has totally tanked since BD.

BD? Has Google been looking at the pages? Did you switch from a static to a dynamic site?

This is kind of scaring me, since we are going from a 3,500+ page, 4-5 year old site to a 99% DB-driven PHP site.

The_Tank

5+ Year Member



 
Msg#: 33664 posted 8:40 am on Apr 1, 2006 (gmt 0)

For a PHP site I presume you are going to be using the Apache web server. If so, you just need the mod_rewrite module, and your dynamic site can look just like a static site. Used carefully, you could make the whole site look static, with no .php extensions. Google does not need to know which server technology you use.
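
(A sketch of what The_Tank describes, assuming Apache with mod_rewrite enabled; the script name `product.php` and the URL pattern are hypothetical:)

```apache
# Serve static-looking URLs like /products/blue-widget.html
# from a dynamic PHP script, without exposing the .php extension.
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)\.html$ /product.php?slug=$1 [L,QSA]
```

The [QSA] flag appends any existing query string, so tracking parameters survive the rewrite.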

ZoltanTheBold

5+ Year Member



 
Msg#: 33664 posted 9:39 am on Apr 1, 2006 (gmt 0)

I'd have to say it has been no real help. I think it may be useful if you find there are many pages on your site that Google misses, since it at least gives you some method to let them know those pages exist. In all honesty, though, if this is the case you'd be better off fixing the problem on your site (lack of links, JavaScript/Flash links, or whatever the problem might be).

I have a small (<500 pages) site, all static pages, so I find the overhead of updating the sitemap not really worth the effort. It also doesn't speed up indexing or even crawling. I updated a page about 7 days ago and Google hasn't indexed it, yet MSN and Yahoo have, without the benefit of a sitemap.

Like much of what Google attempts beyond strict searching, I find the whole approach very clunky. For example, it is an XML format - why not just CSV? Also, if you use multiple sitemaps like I do, there is no indication of when the sub-sitemaps are crawled, which is frustrating.
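
(For reference, multiple sitemaps are tied together with a sitemap index file. A minimal sketch follows; the URLs are placeholders, and the namespace shown is Google's 0.84-era schema - substitute whatever the current documentation specifies:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
    <lastmod>2006-03-30</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>
```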

I've had the sitemap since my site went online, so I can't say if it has made things better. But I'm not sure I'll continue. It is their job to crawl my site and find the pages after all.

jlander

10+ Year Member



 
Msg#: 33664 posted 2:32 pm on Apr 1, 2006 (gmt 0)

I would have agreed with you guys up until this morning. A few weeks ago a customer emailed and asked if we had a specific item available but not listed on our site. We found what he was looking for and created a private listing. He did not purchase, and we forgot all about it. The only way to get to the item was with the exact URL; it was not linked from any page on our site. This morning the item sold. The only explanation I have is that it was inadvertently included in our Google sitemap. I did an allinurl: search for the item code and found it is listed in Google.

So the only way this item could have sold was because of the XML sitemap.

ahmedtheking

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 33664 posted 3:19 pm on Apr 1, 2006 (gmt 0)

Argh, sitemaps are driving me crazy! I've gone through my code soooo many times, rewriting the XML again and again and again, even copying the example from the Google Sitemaps area, but it keeps giving me errors, all different each time! Is anyone else having this much hassle?

aaronjf

10+ Year Member



 
Msg#: 33664 posted 4:07 pm on Apr 1, 2006 (gmt 0)

but it keeps giving me errors

It may not be your XML. Commonly overlooked during setup is making sure that your .htaccess file is telling the server to serve XML correctly.

While setting up a few shopping carts for friends I found this to be true. I have also heard of other developers going through this. You pore over your code again and again, and all along it was the .htaccess file.
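
(If the poster's diagnosis applies, the usual fix is making sure Apache serves .xml files with an XML MIME type rather than, say, text/plain. A one-line .htaccess sketch, assuming Apache with AllowOverride permitting AddType:)

```apache
# Ensure .xml files are sent with an XML content type
AddType application/xml .xml
```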

aaronjf

10+ Year Member



 
Msg#: 33664 posted 4:09 pm on Apr 1, 2006 (gmt 0)

It is their job to crawl my site and find the pages after all.

Yes, it is "their" job, but as webmasters it is our job to make sure they can, if we need to be in the SERPs. After all, it is an interdependent relationship.

Matt Probert

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 33664 posted 5:21 pm on Apr 1, 2006 (gmt 0)

Uploading an XML sitemap to Google offered our site no benefits. Our site is over 4,500 pages, all interlinked. Google regularly spidered the pages (we had to request a slowdown) before and after the sitemap was uploaded.

I'm still not sure why I bothered with the sitemap!

Matt

Fribble

5+ Year Member



 
Msg#: 33664 posted 5:53 pm on Apr 1, 2006 (gmt 0)

Wow, am I the only one who's seen a benefit?

For about a month G was having trouble crawling my site; only about half of the pages were indexed. Less than a week after implementing a Google XML sitemap, every page was in the index.

It hasn't helped traffic, but that's not its purpose.

aaronjf

10+ Year Member



 
Msg#: 33664 posted 6:40 pm on Apr 1, 2006 (gmt 0)

It hasn't helped traffic, but that's not its purpose.

Yeah, I don't think anyone here is under the impression it would. But since historically SEs have had such unstable results with DB-driven sites, I was really hoping that we would be able to switch safely.

Well, at least one person has had a good experience.

milanmk

5+ Year Member



 
Msg#: 33664 posted 6:53 pm on Apr 1, 2006 (gmt 0)

From the Google Sitemaps FAQ:

By placing a Sitemap-formatted file on your web server, you enable our crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly.

This collaborative crawling system will allow our crawlers to optimize the usefulness of Google's index for users by improving its coverage and freshness.

Google uses the Sitemaps it receives to add your pages to our search index.

We don't guarantee that we'll crawl or index all of your URLs. However, we use the data in your Sitemap to learn about your site's structure, which will allow us to improve our crawler schedule and do a better job crawling your site in the future. In most cases, webmasters will benefit from Sitemap submission, and in no case will you be penalized for it.

Site verification will give you statistics about your site and information about URLs we couldn't crawl so that you can make changes, if necessary.

After reading all this and maintaining my Sitemap XML file for a couple of months, I am not able to add my most important pages to Google's index. There are no errors shown in the Sitemap statistics, and I regularly see Googlebot spider those pages now and then.

Anyway, you can try your luck!

ZoltanTheBold

5+ Year Member



 
Msg#: 33664 posted 10:04 pm on Apr 1, 2006 (gmt 0)

Yes, it is "their" job, but as webmasters it is our job to make sure they can, if we need to be in the SERPs. After all, it is an interdependent relationship.

I agree, it is a two-way thing. But you have to determine whether the overhead of keeping the sitemap up to date is worth the benefit. If a site is well linked internally, it shouldn't really need a Google sitemap. The point I was making is that I'm not sure it really confers much benefit if your site is already easy to crawl. Given the delays I see with indexing, it doesn't seem worth the effort.

As for the interdependent bit, that's all well and good. But it's hard to be enthusiastic if you feel they are either ignoring the sitemap or taking too long to act upon it.

trinorthlighting

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 33664 posted 10:44 pm on Apr 1, 2006 (gmt 0)

You know, I put a link from my homepage to my Google XML sitemap. The Yahoo and MSN crawlers came along, found it, and then crawled my site very deeply. Funny thing: my sitemap has PageRank. Odd, eh?

smokeybarnable

5+ Year Member



 
Msg#: 33664 posted 10:51 pm on Apr 1, 2006 (gmt 0)

I am finding sitemaps useful because my site isn't static: I have a cron job that generates and uploads an updated sitemap based on my current catalog every day. It was working fine up until the Big Daddy update, but hopefully Google will get its act together. As far as it giving you a better chance of getting indexed, I haven't seen any evidence of that yet. However, I think it's a good idea to implement a sitemap and wait for Google to catch up.
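
(A minimal sketch of that kind of generator. The catalog list and URLs here are hypothetical placeholders — in practice the URLs would come from the site's database. Escaping matters because dynamic URLs often contain & characters, which are invalid in raw XML:)

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a Sitemap-protocol XML document from an iterable of URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for url in urls:
        # escape() turns & into &amp; so query-string URLs stay valid XML
        lines.append('  <url><loc>%s</loc></url>' % escape(url))
    lines.append('</urlset>')
    return '\n'.join(lines)

# Hypothetical catalog; a cron job would regenerate and upload this daily
catalog = ['http://www.example.com/item.php?id=1&cat=2']
print(build_sitemap(catalog))
```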

trinorthlighting

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 33664 posted 11:17 pm on Apr 1, 2006 (gmt 0)

Hurry up and wait.... Sounds like the military....

Vadim

10+ Year Member



 
Msg#: 33664 posted 1:37 am on Apr 2, 2006 (gmt 0)

A sitemap is for Google to understand your site better.

If your site is easy for Google to understand and has no errors, you probably don't need a sitemap - but you can never be absolutely sure.

So there is no harm in having a sitemap just in case, unless you are trying to hide something and would rather hope Google never notices your errors than fix them.

Also, you can implement only the features that you believe can help Google.

For example, I have a simple list of URLs (a txt sitemap), because I have no strict schedule and cannot tell in advance how often a particular page will be updated.
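
(The plain-text variant Vadim mentions is simply one fully qualified URL per line in a UTF-8 text file with nothing else in it; the URLs below are placeholders:)

```
http://www.example.com/
http://www.example.com/articles/intro.html
http://www.example.com/articles/advanced.html
```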

On the other hand, when I add a new page, I add it to the sitemap, and Google seems to find it more quickly. That does not mean Google will show it in the index immediately, because that also depends on the page content, links, etc., but the sitemap seems to help.

Also, my site gradually evolved from one topic (theme) to another. I did not include the old topic's pages in the sitemap and, again, it seems that Google now understands my site better.

The bottom line is that a sitemap might help; it is easy to implement (if you omit the features you don't need) and it does no harm (if, before adding the sitemap, you double-check your site for errors).

Vadim.

aaronjf

10+ Year Member



 
Msg#: 33664 posted 2:13 am on Apr 2, 2006 (gmt 0)

I think it's a good idea to implement a sitemap and wait for google to catch up.

Sitemaps have always been a good idea. However, we are talking specifically about the XML Sitemaps Google has, in effect, asked for to make indexing DB-driven sites easier.

dataguy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 33664 posted 2:21 am on Apr 2, 2006 (gmt 0)

I've got a site that typically grows by about 400 pages a day. Google grabs the sitemap about 10 times per day. Often we'll see new pages indexed within a day or two.

I don't have any proof that new pages get indexed faster due to the sitemap, but it sure doesn't seem to be hurting anything.

yulia

10+ Year Member



 
Msg#: 33664 posted 12:33 pm on Apr 2, 2006 (gmt 0)

I have removed XML sitemaps from all sites I own or manage. I have noticed that sites that had an XML sitemap went supplemental during the last update. The sites that did not have a sitemap are OK. I noticed a lot of HTML errors in the report. Often I had an error message saying the sitemap does not exist.

On those sites where changes and modifications are frequent (new pages created / old pages deleted), it was a total mess in the past few months. Google (and Yahoo) got totally "confused". I suspect it might be a problem with the server - maybe too slow for the Google crawl. I am not familiar with the server and cannot modify server files myself. Some of my clients host sites on cheap servers - I am not even sure we have access to the server files.

So I removed the sitemaps at the beginning of March - and I see improvement with site indexing and with fixing supplementals.

I have not noticed that having an XML sitemap helps a site to be crawled more often, or new pages to be indexed faster.

Oliver Henniges

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 33664 posted 7:00 pm on Apr 2, 2006 (gmt 0)

I launched my sitemap about two months ago and my visitor statistics have risen considerably since then (about 50%), but this may also be due to the whole Bigdaddy thing and need not be a consequence of the sitemap.

I find the statistics quite interesting, e.g. some info on broken links or the search phrases.

In the first place the sitemap helps Google crawl the site, and yes: it always takes only a few days until a new page is in the index.

I launched a huge PDF file of 70 MB and got the info that the spider has mostly swallowed it of late, but had some timeouts in the beginning. How could you know that without the sitemap?

ahmedtheking, there is/was this 2500 k-bug; your errors don't necessarily come from your XML code. Just do some research in the groups. I got very helpful answers there, and have the suspicion that they came from someone inside Google.

ronburk

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 33664 posted 10:33 pm on Apr 2, 2006 (gmt 0)

I am interested in Google sitemaps for exactly one reason: to speed up the indexing of new/changed pages.

Googlebot is highly regular and predictable at my site over the long term. When I add a new page, I can pretty much predict when Googlebot will find it. Most new pages have little PageRank, and are only picked up on a complete crawl, which happens approximately twice per month.

So, there's little guesswork here or opinion or "seems like" for my very narrow goal. My logs tell me precisely when the sitemap was picked up, and how many times Googlebot came by without fetching brand new pages that were mentioned in a sitemap that was picked up earlier.

I haven't let the current experiment run an entire month, but I can say that so far the use of a sitemap did not stimulate Googlebot to pick up new/changed pages any sooner than normal.
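
(That kind of measurement is easy to script. A sketch, assuming Apache combined-format logs; the log path in the usage comment is hypothetical. It pulls out the timestamps at which Googlebot fetched a given path, so you can compare the sitemap fetch against fetches of a brand-new page:)

```python
import re

# Apache combined log format: IP - - [date] "GET /path HTTP/1.1" status ...
LOG_RE = re.compile(r'\[([^\]]+)\] "GET (\S+) [^"]*"')

def googlebot_fetches(log_lines, path):
    """Return timestamps at which Googlebot requested the given path."""
    times = []
    for line in log_lines:
        if 'Googlebot' not in line:
            continue  # only interested in Google's crawler
        m = LOG_RE.search(line)
        if m and m.group(2) == path:
            times.append(m.group(1))
    return times

# Usage: compare when sitemap.xml was fetched vs. a brand-new page
# log = open('/var/log/apache/access.log')  # hypothetical path
# print(googlebot_fetches(log, '/sitemap.xml'))
```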

This is one of those areas where I think Google's traditional unwillingness to share any technical details hurts them. Clearly, it's in everybody's best interests for sitemaps to get widely used -- we can't have every search engine trying to crawl every page on the Internet every day, and sitemaps could be a way both to reduce that load and to increase the completeness of indexing.

But without some clues about what benefits we can predictably expect (even if they just say "this is how it works now, we might change it next month"), it's really hard to get motivated to participate. Of course, many folks will participate whether there's any benefit or not, because they take no measurements and rely on superstition/intuition for their feedback. Well, a lot of us keep pushing that elevator button too, just because it "seems" to make the car arrive sooner. Maybe that's good enough.

Sadly, Slurp has picked up all the new pages I've posted this week before Googlebot has, even though I handed Googlebot a sitemap to tell it precisely what's new.

I sure wish the Google folks would think about supplying some details about whether, or under what conditions, supplying a sitemap should provide any particular benefit.

netchicken1

5+ Year Member



 
Msg#: 33664 posted 11:45 pm on Apr 2, 2006 (gmt 0)

I have made a few posts on this topic here, only to get no replies. I figured that people here must be so clued up they didn't need any extra free traffic from Google.

Anyway, I downloaded some software to make sitemaps, built mine, and uploaded them.

That was November last year. Since then my monthly traffic has moved from about 10 gig to 28 gig. As my content has only minimally improved, I put it down to the massive spidering and indexing of the pages.

Both my HTML and PHP pages are indexed; Google hits on my PHP pages are now heaps more than in the past.

The Google Sitemaps feature also shows you the keywords you are indexed for and the placement of your page in that index. I have numerous pages now in the top 10 hits for words.

My income has also doubled (to modest levels). I have nothing but positive comments for this system, except I had to move to a bigger package to cover the bandwidth usage - but this was covered by the increase in income.

ZoltanTheBold

5+ Year Member



 
Msg#: 33664 posted 7:07 am on Apr 3, 2006 (gmt 0)

I haven't let the current experiment run an entire month, but I can say that so far the use of a sitemap did not stimulate Googlebot to pick up new/changed pages any sooner than normal.

This is one of those areas where I think Google's traditional unwillingness to share any technical details hurts them.

Sadly, Slurp has picked up all the new pages I've posted this week before Googlebot has, even though I handed Googlebot a sitemap to tell it precisely what's new.

I sure wish the Google folks would think about supplying some details about whether, or under what conditions, supplying a sitemap should provide any particular specific benefit.

I think this is it in a nutshell. Google are poor communicators and, increasingly, hide behind 'beta' software to push out technology quickly without really thinking it through.

When I first found out about sitemaps I thought it was a great idea, even though my site was crawled well without one. Naturally I expected it to confer some benefit, mainly in terms of speed. This has not manifested itself and, like others, I'm finding MSN and Yahoo picking up new material in a few days without the overhead of a sitemap.

However, there seems to be a real reluctance to criticize any aspect of Google's operation. I don't know why this is, since they do so little to instill loyalty.

ronburk

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 33664 posted 7:56 pm on Apr 3, 2006 (gmt 0)

Since then my monthly traffic has moved from about 10 gig to 28 gig.

Sounds like you were having trouble getting crawled correctly, perhaps because you failed to follow the simple guidelines Google supplies for getting crawled correctly.

I have numerous pages now in the top 10 hits for words.

If half your content had never been crawled correctly, and then suddenly got crawled and indexed, it would sure be surprising if that hurt your rankings.

My income has also doubled (to modest levels).

I bought a new pair of jeans and then my IRS refund came. I don't know why people don't buy new jeans more often.

Superstition ain't the way.

ZoltanTheBold

5+ Year Member



 
Msg#: 33664 posted 8:24 pm on Apr 3, 2006 (gmt 0)

Superstition ain't the way.

At last, another cynic.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved