Just wanted to get some opinions on a structural change.
I have a site with 1500 Pages.
The site structure is as follows:
Level 1 - Home Page
Level 2 - 50 Category Pages
Level 3 - 1400 Sub Category Pages
The end-level subcategory pages often contain so much content that it dilutes the effectiveness of keywords on the page. I am looking at moving products with long descriptions to their own pages.
Thus the new structure would look like this.
Level 1 - Home Page
Level 2 - 50 Category Pages
Level 3 - 1400 Sub Category Pages
Level 4 - 10000 Product Description Pages
However, it is crucial that the pages on levels 1-3 be updated regularly. Currently these pages are all updated on a rotation of about two weeks.
My question is:
Do you think the addition of the 10,000 pages will slow down the time it takes Googlebot to update the pages on levels 1-3?
Any opinions welcome.
Do you think the addition of the 10,000 pages will slow down the time it takes Googlebot to update the pages on levels 1-3?
In my experience, no, though I've never added that many pages at once. It may take Google a long time to find those 10,000 bottom-level pages, but having lots of pages at the bottom tier should not affect how Google treats your higher-level ones.
G treats the web as a collection of pages rather than sites for most of what it does. This is why you don't see a single bot crawling your website, but rather a host of bots grabbing bunches of pages that are linked together. This means that adding a lot of pages for the bot to crawl shouldn't mean any less crawling for the pages it already knows about.
The content is unique in that it will be a 'more information' page for each product. However, many products can have similar specs (they might be exactly the same except for one spec). This is part of my motivation for moving the larger descriptions out of a page of 50 products, as the density of certain keywords can become quite high.
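To make the density point concrete, here's a rough sketch of how you might measure what share of a page's words a given phrase takes up. The function name and the sample text are made up for illustration, not anything from the site in question:

```python
import re

def keyword_density(text, phrase):
    """Rough fraction of the words in `text` taken up by `phrase` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count non-overlapping-agnostic occurrences of the phrase as a word sequence.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

# A page listing 50 near-identical products repeats the same phrase constantly:
page = "Blue Widget 10mm. Blue Widget 12mm. Blue Widget 14mm."
print(round(keyword_density(page, "blue widget"), 2))  # → 0.67
```

On a real 50-product subcategory page the effect is the same: the repeated product-name phrase dominates the word count, which is exactly the dilution being described.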
I will try with a few chosen categories at first.
Cheers,
As for the linking structure, I'd try the following:
Do the linking structure as you planned, and also link from Level 2 to a few money-phrase pages in Level 4. That way you will have a bit more internal-linking power.
As I guess you already have deep links to Level 3, I would try to get a few deep links to Level 4 in order to achieve faster crawling and ranking of the new pages.
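The Level 2 → Level 4 shortcut links could be as simple as a small generated block of anchors on each category page. A sketch, with hypothetical field names and URLs (the real site's templates and data will obviously differ):

```python
def category_footer_links(products, max_links=5):
    """Build anchor tags for the highest-priority products in a category.

    `products` is a list of dicts with hypothetical keys:
    url, anchor (the money phrase), and priority (an editorial weight).
    """
    chosen = sorted(products, key=lambda p: p["priority"], reverse=True)[:max_links]
    return "\n".join(
        f'<a href="{p["url"]}">{p["anchor"]}</a>' for p in chosen
    )

products = [
    {"url": "/widgets/blue-widget-10mm.html", "anchor": "Blue Widget 10mm", "priority": 3},
    {"url": "/widgets/red-widget-12mm.html", "anchor": "Red Widget 12mm", "priority": 9},
]
print(category_footer_links(products))
```

Keeping the list short (a handful per category page) concentrates the internal-link power on the money phrases instead of spreading it across all 10,000 product pages.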
If I lived 100 times longer than what my whole normal lifetime would be I'd never be able to write original, individual content for each of tens of thousands of pages. I could, however, generate thousands of pages from feeds.
A few questions cross my mind: what percentage of the pages are identical or near-identical, does Google care whether duplicate or near-duplicate content fills their index, are they capable of detecting it, and if they can detect it, do they care? If they do care, will they do something about it, or is the algo so broken that they just throw their hands up about it?
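On "are they capable of detecting it": nobody outside Google knows what they actually run, but the textbook approach to near-duplicate detection is word-shingle Jaccard similarity. A minimal sketch, with made-up product descriptions that differ in a single spec:

```python
def shingles(text, k=3):
    """Set of k-word shingles (overlapping word windows) from `text`."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

page_a = "acme widget 10mm stainless steel with two year warranty"
page_b = "acme widget 12mm stainless steel with two year warranty"
print(round(jaccard(page_a, page_b), 2))  # → 0.4
```

Two descriptions that differ by one spec still share most of their shingles, so pages like that are trivially detectable as near-duplicates; whether the engine acts on that is the open question.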
If I lived 100 times longer than what my whole normal lifetime would be I'd never be able to write original, individual content for each of tens of thousands of pages. I could, however, generate thousands of pages from feeds.
This from a user with 6870 posts....... :D
By my calculations, each product has one full paragraph. I could easily write 30 product paragraphs a day and thus get 10,000 done in under a year :p
Anyway, we have 30 wholesalers, and we select the best products, prices and descriptions from each of their price lists. Then we take those names and descriptions and standardize them into our naming format.
Much time has been spent creating product names and descriptions for these products.
Yes, some of them are similar, as many products have similar specs. But the spec differences are very important to the customer (brand, size) and are likely to be criteria that a surfer searches for.
No one else has this content online... so it is unique, non-duplicate content.