Msg#: 4496631 posted 11:54 am on Sep 18, 2012 (gmt 0)
- make sure you have a good information architecture in place - the URL structure will be critical at that scale, and you want to avoid serving the same content from multiple URLs (you don't want 100,000 indexed URLs)
- don't publish any of the "stub" pages
- consider carefully what you do with your high-value pages - whether they are well-linked, good sources of traffic, important to the structure, etc.
- have a redirect plan in place
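For the redirect plan, the usual approach is permanent (301) redirects from every old URL to its new home, so existing links and rankings carry over. A minimal sketch in Apache mod_rewrite - the URL patterns here are hypothetical, just to show the shape:

```apache
# Hypothetical example: old flat pages moving into a category hierarchy.
# Permanently redirects /page123.html to /widgets/page123/
RewriteEngine On
RewriteRule ^page(\d+)\.html$ /widgets/page$1/ [R=301,L]
```

Map old URLs to new ones one-to-one where you can; blanket-redirecting everything to the homepage throws away the link value you've built up.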
Lots of pages isn't inherently bad. My current site has about 900,000 pages, and I've worked with sites up to 200 million pages.
In addition to the content creation and social benefits you mentioned, moving to multiple pages allows you to target a lot more keywords with a lot more title tags.
I second what phranque says. Just to add a little more detail around the information architecture bit:
When you get a lot of pages you need to pay more attention to your Googlebot crawl budget. There will be pages that Googlebot may crawl only once *ever*, or maybe not at all, depending on your PageRank. You need to make sure that the pages that get the most traffic and change most frequently get crawled more often. You can set the priority and last-modified date for each URL in your XML sitemap to encourage that. And you can "sculpt" your PageRank so that your important pages don't end up with PR 0.
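As a sketch, the sitemap hints look like this. The URLs, dates, and values are hypothetical, and keep in mind search engines treat <priority> and <changefreq> as hints, not commands:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical high-traffic page that changes daily: signal it for frequent crawling -->
  <url>
    <loc>http://www.example.com/widgets/popular-widget/</loc>
    <lastmod>2012-09-17</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>
  <!-- Hypothetical backwater page that rarely changes -->
  <url>
    <loc>http://www.example.com/widgets/obscure-widget/</loc>
    <lastmod>2012-01-05</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.2</priority>
  </url>
</urlset>
```

At hundreds of thousands of URLs you'll need to split this into multiple sitemap files (the protocol caps each at 50,000 URLs) tied together with a sitemap index file.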
You also need to pay more attention to user experience. Google doesn't like it when a page ranks for something popular, but has little content: users end up unsatisfied. If you have pages with little content, make sure they are in the backwaters of your site, multiple clicks from the homepage. Conversely, pages that are most fleshed out and most helpful should be linked from the home page itself.
You will probably need a great navigational structure in place - maybe even multiple navigation methods. Think related items, and items listed in multiple categories/hierarchies. Plus site search for users (although that's useless to Googlebot).