| 11:48 pm on Jul 26, 2007 (gmt 0)|
Have you got breadcrumb navigation to get visitors and bots round the site?
If not, do that first.
Are you absolutely sure that each page of the site has only one URL that you can access it by?
That is, no www versus non-www problems of any sort, no extra query strings or parameters versus "bare" URLs, and so on?
| 11:50 pm on Jul 26, 2007 (gmt 0)|
Sure, that is a way to do it... 22x100 or 100x22, whatever.
| 4:44 am on Jul 27, 2007 (gmt 0)|
Don't think about depth of directories; think about click path, or click depth, from the home page or nearest inbound link. It's not the number of slashes in the URL, it's the number of clicks it takes to get there that kills you.
| 12:39 am on Jul 30, 2007 (gmt 0)|
I just added breadcrumb navigation. I haven't put it up on the site yet but I've programmed it into the development code and am planning on putting it up to the site this week.
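For illustration only (this is not the poster's actual code), a breadcrumb trail can be generated from the URL path; the path segments, titles, and URLs below are hypothetical:

```python
def breadcrumb(path, titles):
    """Build a breadcrumb trail as (title, url) pairs from a URL path.

    `titles` maps each path segment to a display name; both the mapping
    and the example URLs here are made up for the sketch.
    """
    crumbs = [("Home", "/")]
    url = ""
    for segment in path.strip("/").split("/"):
        if not segment:
            continue
        url += "/" + segment
        crumbs.append((titles.get(segment, segment), url + "/"))
    return crumbs

# Hypothetical category page /widgets/blue/ yields:
# Home > Widgets > Blue Widgets, each crumb linking one level up.
trail = breadcrumb("/widgets/blue/", {"widgets": "Widgets", "blue": "Blue Widgets"})
```

Rendering that trail on every page gives both visitors and bots a crawl path back to higher-level pages.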
I also recently made sure that every page is distinct -- previously we had several domains that could be used to access the site (plus a non-www address) but I used apache redirects to redirect all those other URLs to the main www domain URL.
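For reference, a minimal Apache rewrite along the lines the poster describes might look like this (the domain name is a placeholder, and this is a sketch assuming mod_rewrite is enabled, not the poster's actual config):

```apache
# Redirect the non-www host (and any alternate domains pointed at this
# server) to the canonical www host with a permanent 301 redirect,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 301 tells Google which URL is canonical, so the duplicate hosts stop competing with the main one.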
I realize that it's not the directories, rather it is the number of clicks. I'm just wondering what is the optimal structure. I could just list all the 2,200 categories on the main sitemap page and then it would just be 1 click to each subcategory from the sitemap page...however, it seems like Google wouldn't like that many links on one page. If I do 22x100 links or 100x22 links then it would be 2 clicks to each subcategory. Is that the ideal structure? Do I need to be absolutely strict about only having 100 links on each page or is that just an approximate rule? Thanks.
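To make the arithmetic concrete, here is a small sketch of the trade-off being described; the 100-links-per-page figure is treated as an adjustable guideline, not a hard limit:

```python
import math

def sitemap_pages(num_links, links_per_page):
    """Return (index pages needed, click depth from the sitemap page).

    If everything fits on one page, each category is 1 click away;
    otherwise it's 1 click to an index page plus 1 click to the category.
    """
    pages = math.ceil(num_links / links_per_page)
    depth = 1 if pages == 1 else 2
    return pages, depth

print(sitemap_pages(2200, 100))   # 22 index pages, 2 clicks per category
print(sitemap_pages(2200, 2200))  # one huge page, 1 click per category
```

Either split (22x100 or 100x22) gives the same 2-click depth; the question is only whether a single 2,200-link page is acceptable.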
| 1:07 am on Jul 30, 2007 (gmt 0)|
I am not sure how adding breadcrumb navigation will change the PR flow within the whole site. I have never done it with an existing site, but I think it will change the whole supplemental structure.
| 11:57 pm on Jul 30, 2007 (gmt 0)|
I'm not sure how the breadcrumb will change the PR flow either. Right now our results are good for our main queries, which go right to the home page, but the results are very poor for searches that should yield specific subpages. Before the supplemental query went away, I think about 80% of the site was in the supplemental index. Will the breadcrumb help with this?
| 12:02 pm on Jul 31, 2007 (gmt 0)|
First of all, are you creating a site map for your users, or for better indexing in Search engines?
If the latter, here is an idea:
try to create a rollover site map. You could code it so the site map changes the link positions every two or three days (depending on how often Googlebot visits your site).
Example: your sitemap consists of 100 page URLs.
First 3 days: show only 50 links on the first page.
Days 3-6: the 50 from last time are moved to the end, and you display 50 new links on the first page.
Days 6-9: the same again, and so on.
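The rotation above can be sketched as follows; the cycle length, window size, and URLs are all illustrative assumptions from the example, not a recommendation:

```python
def rolling_window(urls, shown, day, cycle_days=3):
    """Return the slice of `urls` to show on the sitemap's first page
    for a given day, rotating `shown` links forward every `cycle_days`
    days and wrapping around to the start of the list."""
    start = (day // cycle_days) * shown % len(urls)
    window = urls[start:start + shown]
    if len(window) < shown:  # wrap around to the front of the list
        window += urls[:shown - len(window)]
    return window

urls = [f"/page-{i}" for i in range(100)]
print(rolling_window(urls, 50, 0))  # days 0-2: pages 0-49
print(rolling_window(urls, 50, 3))  # days 3-5: pages 50-99
print(rolling_window(urls, 50, 6))  # days 6-8: back to pages 0-49
```

Whether Google would actually reward a sitemap that changes this often is the open question in this thread; the sketch only shows the mechanics.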
| 2:52 pm on Jul 31, 2007 (gmt 0)|
I'm also curious how a breadcrumb navigation would help get pages out of supplemental status.
| 6:34 pm on Jul 31, 2007 (gmt 0)|
I recently found that new pages I was adding to one of my sites were going supplemental. It seemed to me that the "index" page which linked to the pages was causing this. To get round the problem I have totally flattened the non-DHTML navigation of this site so that all non-DHTML navigation now comes from the sitemap. This seemed to work in a less drastic experiment on another site.
So now, users still have the DHTML-type navigation but Google only sees the sitemap linking to all the pages in the site. I reckon there are around 300 links from the sitemap, and I'm waiting to see if this is a problem when Google re-indexes. I suspect not; I think Google recognises it as a sitemap.
On another much larger site I have around 400 links from the sitemap (not to all pages) and that page is not supplemental. Do we have any hard evidence that Google objects to pages with large numbers of outbound links? My experience (up to 400) seems to suggest otherwise, at least for a sitemap.
| 8:33 pm on Jul 31, 2007 (gmt 0)|
"I'm also curious how a breadcrumb navigation would help get pages out of supplemental status."
Adding more links to pages, and channelling some of your PageRank down from higher-level pages, will help.
Higher PR and more crawl paths battle supplemental status, but there is no guarantee for anything PR4 or under, or with ten or fewer links to a page.
| 5:13 pm on Aug 17, 2007 (gmt 0)|
|no guarantee for anything PR4 or under, or with ten or few links to a page. |
define: OR % Operator
Will a site below PR4, with 11+ internal links to each page, be helped?
| 9:32 pm on Aug 20, 2007 (gmt 0)|
One more link will help. Will 11 more low-PR links help enough? There is no way to know that in general, but the more ways there are for the bot to crawl to a page, and the more PR that page has, the better.