|Site Map Bloat|
Is it necessary to have multiple site map pages?
A while back, after reading that Google recommends that links on a given page be limited to no more than 100, I broke the site map for my largest site out into 10 pages, all linked together, of course, for the bot.
This seems a bit unwieldy. I'm in the midst of redesigning the site, and wondering if there isn't some alternative. I have every confidence that Googlebot, and other bots, will follow all the pages. But must the proliferation of site map pages continue as the site grows?
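If it helps to picture the mechanics, here's a minimal sketch of the splitting itself: dividing a flat list of URLs into site map pages of at most 100 links each, per the ceiling discussed in this thread. The URLs and page counts are invented for illustration.

```python
# Sketch: split a flat list of site URLs into site map pages of at most
# 100 links apiece. The URL list here is made up for illustration.

def chunk_links(urls, max_links=100):
    """Return a list of site map pages, each holding up to max_links URLs."""
    return [urls[i:i + max_links] for i in range(0, len(urls), max_links)]

urls = [f"/widgets/page-{n}.html" for n in range(1, 251)]  # 250 links
pages = chunk_links(urls)

print(len(pages))      # 3 site map pages
print(len(pages[0]))   # 100 links on the first
print(len(pages[-1]))  # 50 links on the last
```

The downside, as noted below, is that mechanical chunking like this says nothing about whether the groupings make sense to a visitor.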
I've always been a firm believer that there needs to be one page, or a group of pages, that links the entire site together; it creates wholeness.
You may want to consider setting up those pages Directory Style and gain the added benefit of using breadcrumbs to help your visitor navigate their way around the site.
We've seen an increase in site map usage amongst most of our clients, especially those in the industrial sectors. We might have one mother site map and then children site maps for each sub-directory. Mom links to the children, the children link to the other children, and they all link back to mom.
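The "mom and children" layout described above can be sketched as grouping URLs by their top-level directory, so each sub-directory gets its own child site map and the mother map links to each child. The paths and file names here are invented for illustration.

```python
# Sketch of the mother/children layout: group URLs by top-level directory,
# one child site map per section. Paths are invented for illustration.
from collections import defaultdict

def group_by_section(urls):
    """Map each top-level directory to the URLs that live under it."""
    sections = defaultdict(list)
    for url in urls:
        top = url.strip("/").split("/")[0]
        sections[top].append(url)
    return dict(sections)

urls = ["/widgets/a.html", "/widgets/b.html", "/gadgets/c.html"]
children = group_by_section(urls)

# The mother site map links to one child map per section:
mother_links = [f"/{section}/sitemap.html" for section in sorted(children)]
print(mother_links)  # ['/gadgets/sitemap.html', '/widgets/sitemap.html']
```

In the layout described above, each child map would also link to its sibling maps and back to the mother page.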
Some have different perspectives on this and I can only share with you what has worked for me over the years. I utilize FP and it gives me some rather neat features to view the navigation structure of my sites. Man is that a pretty sight! ;)
P.S. No need to think about the bots. Think about your users first, the bots are a given. If you structure the pages correctly, the bots will come and follow.
P.P.S. Make sure Mom is linked from all pages of the site. She holds the key.
Thanks Pageone - I've always admired the site map you employ on one of your sites. Your suggestion of a directory style probably makes great sense as the site grows. As you point out, it'll be friendly to visitors as well as helping me to sort things out better with a sound organizational set up.
I hadn't thought about the "bread crumbs" analogy for a while, but it makes eminent good sense. I guess my work is set out for me. Sigh!
|Google recommends that links on a given page be limited to no more than 100 |
Did you read this on Google's web site? I'm curious to know where this information comes from.
|Did you read this on Google's web site? I'm curious to know where this information comes from. |
Google Information for Webmasters [google.com]:
Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
I wonder if this is a new version, or if I just scanned it and missed it.
Also, I see that they actually forbid Web Position Gold by name! I recall Google prohibiting automated queries, but never before mentioning WPG by name.
Humph. Interesting. Thanks for the heads-up.
Yeah - This kind of thing points up the immense value of participating in this board. I actually picked up that little tidbit from another thread which I can't remember now.
Since then, I've re-read that Google page several times. After all, where else can we get the word from the horse's mouth except from Googleguy. Nice horsey, too, that Googleguy.
Offer a site map to your users with links that point to the important parts of your site.
I bolded the part of the above sentence that I consider the most important. If you end up with more than a hundred links on your site map, then you may be linking to every individual page rather than just the important parts. This is probably confusing for your visitors and Googlebot alike.
If you think you need more than one page for your site map, then you should consider making it coarser-grained instead, linking to sections rather than to every page.
Of course, as someone in that other thread (sorry, can't find it either) pointed out, Google's own site map has 124 links. So presumably we shouldn't sweat too much about a few links over the 100.
But then again, do they worry about PR or spidering?
|But then again, do they worry about PR or spidering? |
Indeed. Just the other day I split my sitemap into four pages in response to GoogleGuy's suggestion. If advice like that comes from the horse's mouth, I think we ignore it at our peril.
Perhaps Google just stops spidering after 100 links, or maybe your site's relevance is subtly downgraded. Perhaps this is why my site ranks several places below where I think it ought to, according to PR and number of links? Maybe I'll find out at the next re-indexing?
Google does recognize and presumably follow links well beyond the 100-link ceiling they suggest - i.e. the links do show up with the link: command.
But it's odd that Google would make that kind of suggestion to webmasters. When this issue came up a few weeks ago, I had a mini sitemap on my homepage with over 200 links; out of paranoia I have cut that down to about 120 for the next dance.
I think Google's suggestion is from a user's perspective, not a spidering one.
I don't know about you, but when I hit a page that is full of nothing but links with little to no structure, it's useless to me. Many treat their site maps as just spider food. To me, that is the wrong way to approach them.
Site maps or site directories are to be set up and used for visitor navigation. Think of your site as one big book. Your site map is like the table of contents. Your secondary site maps are like the sub tables of contents, you know, the indented areas that fall under the table of contents headings. The bigger the book, the more pages it has listing the table of contents.
Do you actually think a user is going to find much on a page with 100 or more links? I've seen very few usable pages with that many links. In fact, they start to look like a link farm in many cases.
Careful planning and consideration need to go into the structuring of site maps and the way they are interlinked with the site. As I mentioned above, the bots will come and spider, that's a given. Think about your users, those who may navigate using the site map: is it convenient and easy for them to use?
I agree with the discussion above, and want to see if you agree with this. Suppose for a minute that site maps are not primarily spider food, but a kind of content for the users' benefit.
From the perspective of "Search Quality", I've always wondered if (and how) Google inhibits those sitemaps and long pages full of links from appearing high up in the SERPs. It's been my impression that Google rarely gives high ranking to those kinds of pages.
PageRank considerations probably explain that, because external links rarely link to a sitemap. But could there be an additional mechanism at work, to de-emphasize pages with long lists of links?
As an aside, I made this observation about my sitemap. A couple months ago, my sitemap was PR=6 and the first 34 links off the sitemap also had PR=6, but lower links on the page had PR=5. There's probably nothing special about the number 34, but maybe there's a slight progressive reduction in PR transferred, according to the number of preceding links on the page.
If that could be true, maybe having long sitemaps isn't optimal. I don't know, I'm just asking, to see if others have thought about it this way.
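For what it's worth, the published PageRank formula divides a page's vote equally among its outgoing links, so the only link-count effect it describes is dilution, not the positional fall-off speculated about above; that would have to be an extra, undocumented mechanism. A toy calculation (the PR values are invented):

```python
# Per the original PageRank formula, a page passes PR(p) * d / N through
# each of its N outgoing links (d is the damping factor, commonly 0.85).
# Under that model every link gets the same share regardless of its
# position on the page. Input numbers here are invented.

def pr_per_link(page_pr, num_links, d=0.85):
    """Equal share of PageRank passed through each outgoing link."""
    return page_pr * d / num_links

# A long site map dilutes the share each linked page receives:
print(pr_per_link(1.0, 100))  # 0.0085 per link
print(pr_per_link(1.0, 250))  # 0.0034 per link
```

So even without any positional penalty, a 250-link site map hands each destination well under half the vote a 100-link one would.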
I'm a firm believer in site maps for both the visitors and the spiders.
Though I currently use some of pageoneresults' ideas, it's all been kind of hit or miss depending on which section of the site I'm working on. This time around I'm going to take a day or two to do nothing but site maps, all at the same time, to make sure that the structure and such are all as similar as can be.
Breadcrumbs are a great idea and they will definitely show up in the next version.
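A breadcrumb trail like the one suggested above can be derived mechanically from the URL path, one crumb per directory level. This is a hypothetical sketch; the paths and labels are invented for illustration.

```python
# Sketch: derive a breadcrumb trail from a URL path, one crumb per
# directory level. Paths and label formatting are invented for illustration.

def breadcrumbs(path):
    """Return (label, href) pairs for the given path and its ancestors."""
    parts = [p for p in path.strip("/").split("/") if p]
    trail = [("Home", "/")]
    for i, part in enumerate(parts):
        href = "/" + "/".join(parts[: i + 1]) + "/"
        trail.append((part.replace("-", " ").title(), href))
    return trail

print(breadcrumbs("/widgets/blue-widgets/"))
# [('Home', '/'), ('Widgets', '/widgets/'), ('Blue Widgets', '/widgets/blue-widgets/')]
```

This only works cleanly when the directory structure mirrors the site's logical hierarchy, which is another argument for the directory-style setup discussed earlier in the thread.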
As for SERP ranking, I just read in a current thread where somebody else's site maps are pulling high positions for related but non-targeted keywords. One of my very, very long site maps, a 250-plus-link list of widgets, has always pulled a lot of traffic. All accidental of course. Go figure.
Yeah - I get a lot of "accidental" rankings for my site maps. Yet another reason to make them user friendly. Lots of work cut out here.