A new site with 8000 pages of content right off the bat?
Not as hard as you might imagine... Just a database
Yep, if you have solid incoming links to root. So long as there is an easy path from root to your internal pages, Googlebot will find them. I've even had the opposite problem: Google indexing orphan pages I wanted to abandon. Even though I dropped the links to these pages on my site, Googlebot found them via a link from a minor blog. Googlebot is voracious.
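That "easy path from root" point can be checked mechanically: walk your own internal link graph starting from the homepage and see which pages are reachable. A minimal sketch, assuming a hypothetical dict stands in for the link graph (a real check would crawl your site or read it from your database):

```python
from collections import deque

def reachable_from_root(links, root="/"):
    """Breadth-first walk of an internal link graph.

    links maps each URL to the URLs it links to;
    returns the set of pages a crawler following links
    from root could ever reach.
    """
    seen = {root}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

# Toy site: /orphan links out but nothing links to it.
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets"],
    "/products/widgets": [],
    "/about": [],
    "/orphan": ["/"],
}
print(reachable_from_root(site))
# /orphan is missing from the result, so a crawler
# following links from "/" would never find it.
```

Any page missing from the returned set is exactly the kind of orphan described above: it only gets indexed if some external link points at it.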
For about two weeks.
Haven't seen him since.
I also use script-generated pages for my product catalogue, but with a very inhomogeneous (i.e. "natural") hierarchical structure: three to four levels of product groups, groups of groups of products, and so forth.
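A group hierarchy like that is easy to generate from a database. A sketch of the idea, with a hypothetical nested dict standing in for the catalogue tables:

```python
def category_paths(tree, prefix=()):
    """Yield the breadcrumb path to every node of a nested
    category tree (groups, groups of groups, and so on)."""
    for name, children in tree.items():
        path = prefix + (name,)
        yield path
        if isinstance(children, dict):
            yield from category_paths(children, path)

# Hypothetical three-level catalogue.
catalogue = {
    "hardware": {
        "fasteners": {"screws": {}, "bolts": {}},
        "tools": {},
    },
}

for path in category_paths(catalogue):
    print("/".join(path))
# hardware
# hardware/fasteners
# hardware/fasteners/screws
# hardware/fasteners/bolts
# hardware/tools
```

Each yielded path becomes one category page URL, which is what gives the crawler a natural route down from root to every product.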
I also put considerable effort into making these groups sensible, and thus follow Google's first law: concentrate on the user.
If your site is new and you have nothing to lose, you may give it a try. If you just want traffic, rest assured the porn industry had this genius idea about ten years ago.
Let me estimate that in order to convince Google's algorithm it's worth indexing all these pages, you need four to twelve unique lexical entries (= words) on each of them. If you typed more than 40,000 words into your database by hand, I'd also bet Google will..
Google will work their way out from each of these entry points.
You should still understand that it can take a couple of months for Google to work their way to all your pages if you do not have links coming in from a variety of sites.
If the pages have unique, useful content, there is no reason they should not be indexed assuming they have good incoming links.