|Optimal Sitemap Structure by Flattening?|
I'd like to create a sitemap page for our website -- not a Google Sitemap, just a normal HTML page with links to the various sections of our site, to help search engines. The problem is that we have about 17,000 products in 2,200 categories, and the category structure goes several levels deep. A typical path to a product might be:
Home -> Cat 1 -> Cat 1.A -> Cat 1.A.1 -> Cat 1.A.1.A -> Product
Because of this deep nesting, our product pages and deep category pages don't rank very well in Google. I'm thinking I could flatten the category structure with a sitemap. Any advice on how to do this? Is it true that I shouldn't have more than 100 links on a page? If I want to link to each of our 2,200 categories, should the main sitemap page link to, say, 22 sitemap sub-pages, each of which then links to 100 categories, thereby covering every category? Or is there a more optimal structure in the eyes of Google? Thanks!
Have you got breadcrumb navigation to get visitors and bots around the site?
If not, do that first.
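A breadcrumb trail is easy to generate from the category path. A minimal sketch in Python, assuming each page knows its path of (name, URL) pairs from Home down; the category names and URLs below are made-up examples, not your actual data:

```python
# Minimal breadcrumb generator: takes the path from Home down to the
# current page and emits an HTML trail of links, with the current page
# as plain text at the end.

def breadcrumb(path):
    """path is a list of (name, url) tuples from Home down to the current page."""
    links = ['<a href="{}">{}</a>'.format(url, name) for name, url in path[:-1]]
    # The current page is shown as plain text, not a link
    links.append(path[-1][0])
    return " &raquo; ".join(links)

trail = breadcrumb([
    ("Home", "/"),
    ("Cat 1", "/cat1/"),
    ("Cat 1.A", "/cat1/a/"),
    ("Product", "/cat1/a/product-123.html"),
])
```

Every crawlable link in the trail passes PR back up to the parent categories, which is part of why breadcrumbs help deep sites.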
Are you absolutely sure that each page of the site has only one URL that you can access it by?
That is, no www versus non-www problems of any sort, no extra query strings or parameters versus "bare" URLs, and so on?
Sure, that is a way to do it... 22x100 or 100x22, whatever.
Don't think about depth of directories; think about click path or click depth from the home page or nearest inbound link. It's not the number of slashes in the URL, it's the number of clicks it takes to get there that kills you.
I just added breadcrumb navigation. I haven't put it on the live site yet, but I've programmed it into the development code and am planning to push it live this week.
I also recently made sure that every page has a distinct URL -- previously we had several domains that could be used to access the site (plus a non-www address), but I used Apache redirects to send all those other URLs to the main www domain.
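For reference, a typical mod_rewrite rule for that kind of canonicalisation looks something like this (the domain here is a placeholder, not my actual setup):

```apache
# .htaccess: 301-redirect non-www and alternate hostnames to the canonical www host
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The permanent (301) status is what tells Google to consolidate the duplicate URLs onto the canonical one.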
I realize that it's not the directories, rather it is the number of clicks. I'm just wondering what the optimal structure is. I could list all 2,200 categories on the main sitemap page, and then it would be just 1 click to each subcategory from the sitemap page... however, it seems like Google wouldn't like that many links on one page. If I do 22x100 or 100x22 links, then it would be 2 clicks to each subcategory. Is that the ideal structure? Do I need to be absolutely strict about only having 100 links on each page, or is that just an approximate rule? Thanks.
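The 22x100 split is just chunking a flat list. A rough sketch of how I'd generate the sub-pages, with a made-up URL pattern standing in for the real category URLs:

```python
# Sketch: split a flat list of category URLs into sitemap sub-pages of
# at most 100 links each (2,200 categories -> 22 sub-pages), so every
# category is 2 clicks from the main sitemap page.

def paginate(urls, per_page=100):
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

categories = ["/category/{}/".format(n) for n in range(2200)]
pages = paginate(categories)
# each entry in `pages` becomes one sitemap sub-page of up to 100 links
```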
I am not sure how adding breadcrumb navigation will change the PR flow within the whole site. I have never done it with an existing site, but I think it will change the whole supplemental picture.
I'm not sure how the breadcrumb will change the PR flow either... right now our results are good for our main queries, which go right to the home page, but the results are very poor for searches that should yield specific subpages. Before the supplemental query went away, I think about 80% of the site was in the supplemental index. Will the breadcrumb help with this?
First of all, are you creating a site map for your users, or for better indexing in search engines?
If the latter, here is an idea:
Try creating a rollover site map. You could code it so the site map changes the link positions every two to three days (depending on how often Googlebot visits your site).
Example: your sitemap consists of 100 pages' worth of URLs.
First 3 days: only 50 links on the first page.
Days 3-6: the 50 from last time are moved to the end, and you display 50 new links on the first page.
Days 6-9: the same again, and so on...
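The rotation above could be driven by the date, so it needs no stored state. A rough sketch, assuming a flat list of URLs (all names and the 3-day period are illustrative):

```python
import datetime

# Rotate which block of 50 links appears at the front of the sitemap,
# shifting one block every 3 days so the crawler sees a different set
# near the top on each visit.

def rotated_links(urls, block=50, period_days=3, today=None):
    today = today or datetime.date.today()
    blocks = [urls[i:i + block] for i in range(0, len(urls), block)]
    shift = (today.toordinal() // period_days) % len(blocks)
    rotated = blocks[shift:] + blocks[:shift]
    return [u for b in rotated for u in b]

urls = ["/page-{}.html".format(n) for n in range(200)]
first_window = rotated_links(urls, today=datetime.date(2007, 1, 1))[:50]
```

All links are still present on every build; only their order changes, which is the point of the rollover idea.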
I'm also curious how a breadcrumb navigation would help get pages out of supplemental status.
I recently found that new pages I was adding to one of my sites were going supplemental. It seemed to me that the "index" page which linked to the pages was causing this. To get around the problem I have totally flattened the non-DHTML navigation of this site, so that all non-DHTML navigation now comes from the sitemap. This seemed to work in a less drastic experiment on another site.
So now, users still have the DHTML-style navigation, but Google only sees the sitemap linking to all the pages in the site. I reckon there are around 300 links from the sitemap, and I'm waiting to see if this is a problem when Google re-indexes. I suspect not; I think Google recognises it as a sitemap.
On another much larger site I have around 400 links from the sitemap (not to all pages) and that page is not supplemental. Do we have any hard evidence that Google objects to pages with large numbers of outbound links? My experience (up to 400) seems to suggest otherwise, at least for a sitemap.
"I'm also curious how a breadcrumb navigation would help get pages out of supplemental status."
Adding more links to pages, and passing some of the PageRank from higher-level pages down, will help.
Higher PR and more crawl paths battle supplemental status, but there is no guarantee for anything PR4 or under, or with ten or fewer links to a page.
|no guarantee for anything PR4 or under, or with ten or fewer links to a page. |
define: OR % Operator
Will a site below PR4 with 11+ internal links to each page be helped?
One more link will help. Will 11 more low-PR links help enough? There is no way to know that in general, but the more ways for the bot to crawl to a page and the more PR that page has, the better.