Forum Moderators: open
This seems a bit unwieldy. I'm in the midst of redesigning the site, and wondering if there isn't some alternative. I have every confidence that Googlebot, and other bots, will follow all the pages. But must the proliferation of site map pages continue as the site grows?
You may want to consider setting up those pages Directory Style and gain the added benefit of using breadcrumbs to help your visitor navigate their way around the site.
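To make the breadcrumb suggestion concrete, here's a minimal sketch of deriving a breadcrumb trail from a directory-style URL. The paths and the Home label are made up for illustration; this isn't from any particular site.

```python
# Hypothetical helper: turn a directory-style path into a breadcrumb trail.
def breadcrumbs(path):
    """Turn /widgets/blue/ into a Home > Widgets > Blue trail of links."""
    parts = [p for p in path.strip("/").split("/") if p]
    trail = ['<a href="/">Home</a>']
    href = ""
    for part in parts:
        href += "/" + part
        # Each crumb links to the directory it names.
        trail.append(f'<a href="{href}/">{part.title()}</a>')
    return " &gt; ".join(trail)

print(breadcrumbs("/widgets/blue/"))
# -> <a href="/">Home</a> &gt; <a href="/widgets/">Widgets</a> &gt; <a href="/widgets/blue/">Blue</a>
```

The nice side effect of a directory-style structure is that the breadcrumb falls straight out of the URL, so every page gets one for free.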
We've seen an increase in site map usage amongst most of our clients, especially those in the industrial sectors. We might have one mother site map and then children site maps for each sub-directory. Mom links to the children, the children link to the other children, and they all link back to mom.
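The mother/children linking pattern described above can be sketched roughly like this. The section names, URLs, and the `build_sitemap_page` helper are all illustrative assumptions, not a real site's structure.

```python
# Sketch of the "mother and children" sitemap pattern: the mother links to
# the children, each child links back to the mother and across to siblings.
def build_sitemap_page(title, links, related_maps):
    """Render one sitemap page as a plain HTML fragment."""
    items = "\n".join(f'  <li><a href="{url}">{text}</a></li>'
                      for url, text in links)
    cross = "\n".join(f'  <a href="{url}">{text}</a>'
                      for url, text in related_maps)
    return f"<h1>{title}</h1>\n<ul>\n{items}\n</ul>\n<p>\n{cross}\n</p>"

# Hypothetical sub-directories and their pages.
sections = {
    "widgets": [("/widgets/a.html", "Widget A")],
    "gadgets": [("/gadgets/b.html", "Gadget B")],
}

# Mother map links to each child map.
mother = build_sitemap_page(
    "Site Map",
    [(f"/{name}/sitemap.html", f"{name.title()} Map") for name in sections],
    [])

# Each child links back to the mother and across to the other children.
children = {}
for name, links in sections.items():
    others = [(f"/{o}/sitemap.html", f"{o.title()} Map")
              for o in sections if o != name]
    children[name] = build_sitemap_page(
        f"{name.title()} Map", links,
        [("/sitemap.html", "Main Site Map")] + others)
```

However you generate it, the point is the same: each sub-directory gets its own map, and every map is reachable from every other one.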
Some have different perspectives on this and I can only share with you what has worked for me over the years. I utilize FP and it gives me some rather neat features to view the navigation structure of my sites. Man is that a pretty sight! ;)
P.S. No need to think about the bots. Think about your users first, the bots are a given. If you structure the pages correctly, the bots will come and follow.
P.P.S. Make sure Mom is linked from all pages of the site. She holds the key.
I hadn't thought about the "bread crumbs" analogy for a while, but it makes eminent good sense. I guess my work is set out for me. Sigh!
Did you read this on Google's web site? I'm curious to know where this information comes from.
Google Information for Webmasters [google.com]:
Since then, I've re-read that Google page several times. After all, where else can we get the word from the horse's mouth except from GoogleGuy? Nice horsey, too, that GoogleGuy.
I bolded the part of the above sentence that I consider the most important. If you end up with more than a hundred links on your sitemap, then you may be linking to each individual page. This is probably confusing for your visitors and Googlebot alike.
If you think you need more than one page for your sitemap, then you should consider increasing its granularity instead.
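If you do end up splitting, the mechanical part is simple. Here's a rough sketch of chunking one oversized sitemap into pages of at most 100 links each, per the under-100-links guideline discussed in this thread; the page-naming scheme is an assumption.

```python
# Split a flat list of URLs into sitemap pages of at most per_page links.
def split_sitemap(urls, per_page=100):
    """Chunk a flat URL list into {filename: [urls]} sitemap pages."""
    pages = [urls[i:i + per_page] for i in range(0, len(urls), per_page)]
    return {f"sitemap{n + 1}.html": chunk
            for n, chunk in enumerate(pages)}

# 250 links -> three pages of 100, 100, and 50 links.
pages = split_sitemap([f"/page{i}.html" for i in range(250)])
```

The hard part, as the posters above note, is grouping those chunks so they make sense to a visitor rather than splitting arbitrarily.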
But then again, do they worry about PR or spidering?
Indeed. Just the other day I split my sitemap into four pages in response to GoogleGuy's suggestion. If advice like that comes from the horse's mouth, I think we ignore it at our peril.
Perhaps Google just stops spidering after 100 links, or maybe your site's relevance is subtly downgraded. Perhaps this is why my site ranks several places below where I think it ought to, according to PR and number of links? Maybe I'll find out at the next re-indexing?
But it's odd that Google would make that kind of suggestion to webmasters. When this issue came up a few weeks ago, I had a mini sitemap on my homepage with over 200 links; out of paranoia I have cut that down to about 120 for the next dance.
I don't know about you, but when I hit a page that is full of nothing but links with little to no structure, it's useless to me. Many treat their site maps as just spider food. To me, that is the wrong way to approach them.
Site maps or site directories are to be set up and used for visitor navigation. Think of your site as one big book. Your site map is like the table of contents. Your secondary site maps are like the sub table of contents, you know the indented areas that fall under the table of content headings. The bigger the book, the more pages it has listing the table of contents.
Do you actually think a user is going to find much on a page with 100 or more links? I've seen very few usable pages with that many links. In fact, they start to look like a link farm in many cases.
Careful planning and consideration needs to go into the structuring of site maps and the way they are interlinked with the site. As I mentioned above, the bots will come and spider; that's a given. Think about your users who may navigate using the site map: is it convenient and easy for them to use?
From the perspective of "Search Quality", I've always wondered if (and how) Google inhibits those sitemaps and long pages full of links, from appearing high up in the SERPs. It's been my impression that Google rarely gives high ranking to those kinds of pages.
PageRank considerations probably explain that, because external links rarely link to a sitemap. But could there be an additional mechanism at work, to de-emphasize pages with long lists of links?
As an aside, I made this observation about my sitemap. A couple months ago, my sitemap was PR=6 and the first 34 links off the sitemap also had PR=6, but lower links on the page had PR=5. There's probably nothing special about the number 34, but maybe there's a slight progressive reduction in PR transferred, according to the number of preceding links on the page.
If that could be true, maybe having long sitemaps isn't optimal. I don't know, I'm just asking, to see if others have thought about it this way.
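For what it's worth, under the classic published PageRank formula each outlink on a page gets an equal share, PR(p) × d / N, regardless of its position on the page, so the per-position reduction speculated above would be a departure from that model. A quick back-of-the-envelope calculation (the PR values here are illustrative, not real measurements):

```python
# Equal-share link value under the classic PageRank model:
# each of N outlinks receives PR(page) * d / N, position-independent.
def share_per_link(pr, n_links, d=0.85):
    return pr * d / n_links

# Same page PR, more links -> a smaller share per link, but equal for all.
print(share_per_link(6.0, 34))   # about 0.15 per link
print(share_per_link(6.0, 100))  # about 0.051 per link
```

So even without any position penalty, a very long sitemap dilutes what each listed page receives, which may explain part of what you're seeing.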
Though I currently use some of pageoneresults' ideas, it's all been kind of hit or miss depending on which section of the site I'm working on. This time around I'm going to take a day or two to do nothing but site maps, all at the same time, to make sure that the structure and such are as similar as can be.
Breadcrumbs are a great idea and they will definitely show up in the next version.
As for SERP ranking, I just read in a current thread that somebody else's site maps are pulling high positions for related but non-targeted keywords. One of my very, very long site maps, a 250-plus-link list of widgets, has always pulled a lot of traffic. All accidental, of course. Go figure.