
Google SEO News and Discussion Forum

    
Optimal Sitemap Structure by Flattening?
DarkNova
msg:3405483 - 6:09 pm on Jul 26, 2007 (gmt 0)

I'd like to create a sitemap page for our website -- not a Google Sitemap, but just a normal HTML page with links to the various sections of our site, to help with search engines. The problem with our site is that we have about 17,000 products in 2,200 categories, and the category structure goes several levels deep. A typical path to a product might be:

Home -> Cat 1 -> Cat 1.A -> Cat 1.A.1 -> Cat 1.A.1.A -> Product

Because of this deep nesting, our product pages and deep category pages don't rank very well in Google. I'm thinking I could flatten the category structure with a sitemap. Any advice on how to do this? Is it true that I shouldn't have more than 100 links on a page? Say I want to link to every one of our 2,200 categories: should the main sitemap page link to 22 sitemap subpages, and each of those link to 100 categories, so that every category is reachable? Or is there a more optimal structure in the eyes of Google? Thanks!
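As an illustration of the 22x100 idea, here is a minimal sketch (Python; the file names and output directory are hypothetical) that chunks a category list into sitemap subpages of up to 100 links each, plus an index page linking to every subpage:

```python
# Minimal sketch: split a category list into HTML sitemap subpages of up to
# 100 links each, plus an index page linking to each subpage.
# File names and the output directory are hypothetical.
from pathlib import Path

PAGE_SIZE = 100  # the often-cited "100 links per page" guideline

def write_sitemap_pages(categories, out_dir="sitemap"):
    """categories: list of (title, url) tuples, e.g. 2,200 entries -> 22 subpages."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    chunks = [categories[i:i + PAGE_SIZE]
              for i in range(0, len(categories), PAGE_SIZE)]
    for n, chunk in enumerate(chunks, start=1):
        links = "\n".join(f'<li><a href="{url}">{title}</a></li>'
                          for title, url in chunk)
        (out / f"sitemap-{n}.html").write_text(
            f"<html><body><h1>Site Map, page {n}</h1><ul>\n{links}\n</ul></body></html>")
    index = "\n".join(f'<li><a href="sitemap-{n}.html">Site map page {n}</a></li>'
                      for n in range(1, len(chunks) + 1))
    (out / "sitemap.html").write_text(
        f"<html><body><h1>Site Map</h1><ul>\n{index}\n</ul></body></html>")
```

With 2,200 categories this produces a 22-link index page and 22 subpages of 100 links each, so every category is two clicks from the sitemap index.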

 

g1smd
msg:3405752 - 11:48 pm on Jul 26, 2007 (gmt 0)

Have you got breadcrumb navigation to get visitors and bots round the site?

If not, do that first.
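For concreteness, a breadcrumb trail is usually generated from the page's category chain. A minimal sketch (Python; the chain structure and example data are hypothetical):

```python
# Minimal sketch of breadcrumb generation from a category chain.
# The (title, url) chain structure and the example data are hypothetical.
def breadcrumb_html(chain):
    """chain: list of (title, url) from the home page down to the current page."""
    parts = [f'<a href="{url}">{title}</a>' for title, url in chain[:-1]]
    parts.append(chain[-1][0])  # current page: plain text, no self-link
    return " &gt; ".join(parts)

# Example: Home -> Cat 1 -> Cat 1.A -> Product
print(breadcrumb_html([
    ("Home", "/"),
    ("Cat 1", "/cat1/"),
    ("Cat 1.A", "/cat1/cat1a/"),
    ("Widget", "/cat1/cat1a/widget.html"),
]))
```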

.

Are you absolutely sure that each page of the site has only one URL that you can access it by?

That is, no www versus non-www problems of any sort, no URLs carrying extra query strings or parameters alongside the "bare" URLs, and so on?
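One way to spot-check this is to request each URL variant and confirm it ends up at the canonical URL. A sketch using Python's standard library (the hostnames and variant URLs are placeholders):

```python
# Sketch: confirm that non-canonical URL variants redirect to the canonical URL.
# Hostnames and variant URLs are placeholders; test against your own site.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

CANONICAL = "http://www.example.com/"
VARIANTS = [
    "http://example.com/",            # non-www host
    "http://www.example.com/?sid=1",  # stray query parameter
]

for url in VARIANTS:
    try:
        resp = urlopen(Request(url, method="HEAD"))  # follows redirects
        final = resp.geturl()                        # URL after any redirects
    except HTTPError as err:
        final = err.geturl()
    print(("OK   " if final == CANONICAL else "CHECK"), url, "->", final)
```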

steveb
msg:3405753 - 11:50 pm on Jul 26, 2007 (gmt 0)

Sure, that is a way to do it... 22x100 or 100x22, whatever.

tedster
msg:3405957 - 4:44 am on Jul 27, 2007 (gmt 0)

Don't think about depth of directories; think about click path, or click depth, from the home page or nearest inbound link. It's not the number of slashes in the URL, it's the number of clicks it takes to get there that kills you.
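To make click depth concrete, here is a sketch that computes the minimum number of clicks from the home page to every page by breadth-first search; the link graph below is a hypothetical stand-in for real crawl data:

```python
# Sketch: minimum click depth from the home page, via breadth-first search.
# The link graph is a hypothetical stand-in for real crawl data.
from collections import deque

def click_depths(links, start="/"):
    """links: dict mapping each URL to the list of URLs it links to."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

links = {
    "/": ["/sitemap.html", "/cat1/"],
    "/sitemap.html": ["/cat1/cat1a/"],  # sitemap shortcut flattens the path
    "/cat1/": ["/cat1/cat1a/"],
    "/cat1/cat1a/": ["/cat1/cat1a/widget.html"],
}
print(click_depths(links))  # the widget page is 3 clicks deep here
```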

DarkNova
msg:3408177 - 12:39 am on Jul 30, 2007 (gmt 0)

I just added breadcrumb navigation. It isn't live yet, but I've programmed it into the development code and plan to push it to the site this week.

I also recently made sure that every page is distinct -- previously we had several domains that could be used to access the site (plus a non-www address), but I used Apache redirects to send all of those other URLs to the main www domain.
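DarkNova did this with Apache redirects; purely as an illustration of the same canonical-host rule, here is a sketch written as Python WSGI middleware (the hostname is a placeholder):

```python
# Sketch of the canonical-host rule described above, as WSGI middleware.
# DarkNova used Apache redirects; this Python version is only an illustration,
# and the hostname is a placeholder.
CANONICAL_HOST = "www.example.com"

def canonical_host_middleware(app):
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host != CANONICAL_HOST:
            # 301 any other domain, or the non-www host, to the canonical URL
            location = "http://" + CANONICAL_HOST + environ.get("PATH_INFO", "/")
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return middleware
```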

I realize that it's not the directories, rather it is the number of clicks. I'm just wondering what is the optimal structure. I could just list all the 2,200 categories on the main sitemap page and then it would just be 1 click to each subcategory from the sitemap page...however, it seems like Google wouldn't like that many links on one page. If I do 22x100 links or 100x22 links then it would be 2 clicks to each subcategory. Is that the ideal structure? Do I need to be absolutely strict about only having 100 links on each page or is that just an approximate rule? Thanks.

SEOPTI
msg:3408186 - 1:07 am on Jul 30, 2007 (gmt 0)

I am not sure how adding breadcrumb navigation will change the PR flow within the whole site. I have never done it with an existing site, but I think it will change the whole supplemental structure.

DarkNova
msg:3409045 - 11:57 pm on Jul 30, 2007 (gmt 0)

I'm not sure how the breadcrumb will change the PR flow either. Right now our results are good for our main queries, which go right to the home page, but they are very poor for searches that should yield specific subpages. Before the supplemental results label went away, I think about 80% of the site was supplemental. Will the breadcrumb help this?

Anghus
msg:3409427 - 12:02 pm on Jul 31, 2007 (gmt 0)

First of all, are you creating a site map for your users, or for better indexing in search engines?

If the latter, here is an idea: try a rollover site map. You could code it so the site map changes the link positions every two to three days (depending on how often Googlebot visits your site).

Example: your sitemap consists of 100 pages of URLs.

First 3 days: only 50 links on the first page.

Days 3-6: the 50 from last time are moved to the end, and you display 50 new links on the first page.

Days 6-9: same again, and so on.
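A minimal sketch of the rotation Anghus describes (Python; the batch size, period, and link list are placeholders, and this is the poster's suggestion rather than an established best practice):

```python
# Sketch of the rotation described above: every few days, the batch of links
# shown first moves to the end and a new batch surfaces. Batch size, period,
# and the link list are placeholders; this is the poster's idea, not an
# established best practice.
from datetime import date

BATCH = 50
PERIOD_DAYS = 3  # roughly how often Googlebot revisits, per the post

def first_page_links(all_links, today=None):
    today = today or date.today()
    step = today.toordinal() // PERIOD_DAYS   # advances once per period
    offset = (step * BATCH) % len(all_links)  # rotate by one batch per period
    rotated = all_links[offset:] + all_links[:offset]
    return rotated[:BATCH]

links = [f"/category-{i}/" for i in range(1, 2201)]
print(first_page_links(links)[:3])  # first few links shown this period
```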

Tonearm
msg:3409575 - 2:52 pm on Jul 31, 2007 (gmt 0)

I'm also curious how breadcrumb navigation would help get pages out of supplemental status.

nomis5
msg:3409785 - 6:34 pm on Jul 31, 2007 (gmt 0)

I recently found that new pages I was adding to one of my sites were going supplemental. It seemed to me that the "index" page which linked to those pages was causing this. To get round the problem I have totally flattened the non-DHTML navigation of the site, so that all non-DHTML navigation now comes from the sitemap. This seemed to work in a less drastic experiment on another site.

So now, users still have the DHTML navigation, but Google only sees the sitemap linking to all the pages in the site. I reckon there are around 300 links from the sitemap, and I'm waiting to see if this is a problem when Google re-indexes. I suspect not; I think Google recognises it as a sitemap.

On another much larger site I have around 400 links from the sitemap (not to all pages) and that page is not supplemental. Do we have any hard evidence that Google objects to pages with large numbers of outbound links? My experience (up to 400) seems to suggest otherwise, at least for a sitemap.

steveb
msg:3409915 - 8:33 pm on Jul 31, 2007 (gmt 0)

"I'm also curious how a breadcrumb navigation would help get pages out of supplemental status."

It won't.

Adding more links to pages, and passing some of the PageRank from your higher-level pages down to them, will help.

Higher PR and more crawl paths battle supplemental status, but there is no guarantee for anything PR4 or under, or with ten or fewer links to a page.

JohnRoy
msg:3424900 - 5:13 pm on Aug 17, 2007 (gmt 0)

"no guarantee for anything PR4 or under, or with ten or fewer links to a page."

Is that "or" inclusive or exclusive? Will a site below PR4, with 11+ internal links to each page, still see a benefit?

steveb
msg:3427341 - 9:32 pm on Aug 20, 2007 (gmt 0)

One more link will help. Will 11 more low-PR links help enough? There is no way to know that in general, but the more ways for the bot to crawl to a page, and the more PR that page has, the better.
