
Site structure question

Will the addition of an extra level impact the crawling of higher levels?


trimmer80

1:38 am on Sep 3, 2004 (gmt 0)

10+ Year Member



Hi,

Just wanted to get some opinions on a structural change.

I have a site with 1500 Pages.

The site structure is as follows:

Level 1 - Home Page

Level 2 - 50 Category Pages

Level 3 - 1400 Sub Category Pages

The end-level subcategory pages often have so much content that it dilutes the effectiveness of keywords on the page. I am looking at moving products with large descriptions onto their own pages.

Thus the new structure would look like this:

Level 1 - Home Page

Level 2 - 50 Category Pages

Level 3 - 1400 Sub Category Pages

Level 4 - 10000 Product Description Pages

However, it is crucial that the pages on levels 1-3 be updated regularly. Currently these pages are all updated on a rotation of about two weeks.

My question is:
Do you think the addition of the 10,000 pages will slow down the time it takes Googlebot to update the pages on levels 1-3?

Any opinions welcome.

karmov

1:55 pm on Sep 3, 2004 (gmt 0)

10+ Year Member



Do you think the addition of the 10,000 pages will slow down the time it takes Googlebot to update the pages on levels 1-3?

In my experience, no, though I've never added that many pages at once. It may take Google a long time to find those 10,000 bottom-level pages, but having lots of pages at the bottom tier should not affect how Google treats your higher-level ones.

G treats the web as a collection of pages rather than sites for most of what it does. This is why you don't see a single bot crawling your website, but rather a host of bots grabbing bunches of pages that are linked together. This means that adding a lot of pages for the bot to crawl shouldn't mean any less crawling of the pages it already knows about.

trimmer80

8:23 pm on Sep 5, 2004 (gmt 0)

10+ Year Member



Thanks karmov,

That is what I thought. I was just thinking that if PR dictates how deep the bots crawl, it might also dictate how many pages they revisit and how often. I would assume that, because of PR distribution, Google would continue to crawl the higher levels at the same rate.

Marcia

10:53 pm on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not so sure I'd add 10K pages to a site all at once at this point in time. Are you saying that it will be 10K pages of individually written, unique content?

How about breaking out some of the products with larger descriptions onto their own pages and seeing how they get crawled and indexed?

trimmer80

11:21 pm on Sep 5, 2004 (gmt 0)

10+ Year Member



Good idea, Marcia.

The content is unique in that each page will be a 'more information' page for a unique product. However, many products have similar specs (some might be exactly the same except for one spec). This is part of my motivation for moving the larger descriptions out of a page of 50 products, as the density of certain keywords can become quite high.

I will try with a few chosen categories at first.

Cheers,

ThomasB

11:30 pm on Sep 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I haven't seen any problems with adding thousands of new pages lately. On one site I added about 15k pages of genuine content; GBot has already crawled 6.2k of them, and 4,530 are showing for the site: command.

As for the linking structure, I'd try the following:
Build the linking structure as you planned, but also link from Level 2 to a few money-phrase pages in Level 4. That way you will have a bit more internal-linking power.

As I guess you already have deeplinks to Level 3, I would try to get a few deeplinks to Level 4 as well, in order to achieve faster crawling and ranking of the new pages.
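To make the Level 2 -> Level 4 idea concrete, here is a rough sketch (all URLs and page names below are invented for illustration, not taken from the actual site): with one extra link from a category page straight to a chosen product page, that product sits only two clicks from the home page instead of three, which is roughly what the extra internal-linking power amounts to.

```python
from collections import deque

# Hypothetical internal link graph (page names are invented for illustration).
# Level 2 category pages keep their normal links to Level 3 sub-categories,
# but also link straight to one or two important Level 4 product pages.
links = {
    "/": ["/widgets/"],
    "/widgets/": [
        "/widgets/blue-widgets/",                  # normal Level 2 -> Level 3 link
        "/widgets/blue-widgets/acme-widget-xl/",   # extra Level 2 -> Level 4 "money" link
    ],
    "/widgets/blue-widgets/": ["/widgets/blue-widgets/acme-widget-xl/"],
    "/widgets/blue-widgets/acme-widget-xl/": [],
}

def click_depth(target, start="/"):
    """Minimum number of clicks from the home page to a given URL."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        url, depth = queue.popleft()
        if url == target:
            return depth
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# With the extra link the product page is 2 clicks deep; without it, it would be 3.
print(click_depth("/widgets/blue-widgets/acme-widget-xl/"))
```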

Marcia

2:46 am on Sep 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I have a problem with, looking at it from what I assume could be Google's point of view, is what value a bunch of widget pages have when nothing is different on them but size 6, size 8, size 10, etc. and color: blue, color: red, color: green, etc. That does sometimes happen.

If I lived 100 times longer than my whole normal lifetime, I'd never be able to write original, individual content for each of tens of thousands of pages. I could, however, generate thousands of pages from feeds.

A few questions cross my mind: how much of each page is identical, and to what percentage? Does Google care whether duplicate or near-duplicate content fills their index? Are they capable of detecting it, and if they can detect it, do they care? If they do care, will they do something about it, or is the algo so broken that they just throw their hands up about it?

trimmer80

8:50 pm on Sep 6, 2004 (gmt 0)

10+ Year Member



If I lived 100 times longer than my whole normal lifetime, I'd never be able to write original, individual content for each of tens of thousands of pages. I could, however, generate thousands of pages from feeds.

This from a user with 6870 posts....... :D

By my calculations, each product has one full paragraph. I could easily write 30 product paragraphs a day and thus get through all 10,000 in under a year (10,000 / 30 ≈ 333 days) :p

Anyway, we have 30 wholesalers and we select the best products, prices and descriptions from each of their price lists. Then we take these names and descriptions and standardize them into our naming format.
Much time has been spent creating product names and descriptions for these products.

Yes, some of them are similar, as many products have similar specs. But the spec differences (brand, size) are very important to the customer and are likely to be the criteria that a surfer searches for.

No one else has this content online... thus it is unique, non-duplicate content.