
Best Site Map Structure

What is the most effective format?

stuntdubl

6:24 pm on Feb 16, 2003 (gmt 0)

I'm sure this has been discussed before, but I searched the archives and didn't find much.

I am finally going to put sitemaps on my sites. I recently checked how Gbot spiders them and saw just how random it was: it seemed to hit lots of pages that weren't important to me several times over.

Anyhow,

What is the best approach here?
Tree Structure?
List Structure?

Also,
should I put a link to the sitemap somewhere on each page?

Any help or expertise in this area would be appreciated.

born2drv

7:36 pm on Feb 16, 2003 (gmt 0)

>>>What is the best approach here?
Tree Structure?
List Structure?

I have one massive page that is just a list of every link on the site. It is dynamically generated for me, but looks like a .html file.

The main categories are bolded in larger fonts so you can see the structure of the site; the smallest fonts are the actual items in each subcategory. I don't think the format really matters, as long as all the links are there. All my reciprocal links are at the bottom of this page.
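For illustration, a minimal Python sketch of that kind of one-page sitemap (the category data and output file name are made-up examples; the real page is generated dynamically):

```python
# Hypothetical sketch of a single-page HTML sitemap as described above:
# bold category headings in a larger font, items in the smallest font.
categories = {
    "Widgets": ["/widgets/red.html", "/widgets/blue.html"],
    "Gadgets": ["/gadgets/mini.html", "/gadgets/maxi.html"],
}

def write_sitemap(categories, out_path="sitemap.html"):
    parts = ["<html><body><h1>Site Map</h1>"]
    for category, urls in categories.items():
        # Category heading: larger, bold font
        parts.append(f"<h2>{category}</h2>")
        for url in urls:
            # Individual items: smallest font
            parts.append(f'<small><a href="{url}">{url}</a></small><br>')
    parts.append("</body></html>")
    with open(out_path, "w") as f:
        f.write("\n".join(parts))

write_sitemap(categories)
```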

>>>should I put a link to the sitemap somewhere on each page?

I would. As has been discussed here, freshbot seems to take notice of pages with higher PR. The higher PR will also benefit all the links on the page, no matter how deeply buried those pages are in your site, so those 3rd-, 4th-, and deeper-level pages will all have a high-PR page pointing to them.

stuntdubl

4:09 pm on Feb 17, 2003 (gmt 0)

Have you noticed better spidering of your pages due to the sitemap and its structure?

John_Caius

4:25 pm on Feb 17, 2003 (gmt 0)

On the same subject:

We have upwards of 20,000 dynamically generated (ColdFusion) pages that Google has refused to crawl. We finally achieved some success by generating a static cache of the site and having Google crawl our sitemap.
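Roughly, a static cache like that can be built by fetching each dynamic URL and saving the response as a plain .html file a spider can crawl. A minimal Python sketch; the URLs and file-naming scheme here are placeholders, not the actual setup:

```python
# Sketch: snapshot dynamic pages into static .html files.
# The URL list and file names are placeholders for illustration.
import urllib.request
from pathlib import Path

urls = [
    "http://www.example.com/products.cfm?id=1",
    "http://www.example.com/products.cfm?id=2",
]

cache_dir = Path("static_cache")
cache_dir.mkdir(exist_ok=True)

for i, url in enumerate(urls, start=1):
    html = urllib.request.urlopen(url).read()
    # Each dynamic page becomes a crawlable static file, e.g. page-1.html
    (cache_dir / f"page-{i}.html").write_bytes(html)
```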

We currently have 20 sitemap pages, each with 1,000 links. The number of pages in the Google index fluctuates between about 7,000 and (currently) 15,000, depending on freshbot's activity on the site. Recently it dropped from 11,000 to 7,000, and our traffic fell by a similar factor.

Google clearly spiders at least 75% of the links on a 1,000-link sitemap, but I would appreciate advice on how we could better organise the sitemap structure. There are about 500 content hubs (of varying importance) among those 20,000 pages, but currently Google sees them all as identical.
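One possible reorganisation, sketched in Python: sort the URLs by content hub before splitting them into sitemap pages, so each hub's pages sit together rather than being scattered arbitrarily across the 20 files. The url_to_hub mapping and the 1,000-links-per-page figure below are assumptions for illustration:

```python
# Sketch: group URLs by content hub, then emit sitemap pages of up to
# 1,000 links each with every hub's URLs kept together.
from itertools import islice

def chunked(items, size):
    it = iter(items)
    while chunk := list(islice(it, size)):
        yield chunk

def build_sitemaps(url_to_hub, links_per_page=1000):
    # Sort by hub so related pages land on the same sitemap page
    ordered = sorted(url_to_hub, key=url_to_hub.get)
    for n, chunk in enumerate(chunked(ordered, links_per_page), start=1):
        links = "\n".join(f'<a href="{u}">{u}</a><br>' for u in chunk)
        with open(f"sitemap-{n}.html", "w") as f:
            f.write(f"<html><body>\n{links}\n</body></html>")

# Hypothetical example: two hubs, a few pages each
build_sitemaps({
    "/hub-a/page1.html": "hub-a",
    "/hub-a/page2.html": "hub-a",
    "/hub-b/page1.html": "hub-b",
})
```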