A site I work with adds a new sub-category every once in a while. That usually amounts to 10K-50K pages in one go.
Releasing the pages at the same time has never been a problem, even a few years back when the site was very small and had few links incoming.
Getting those fancy, sought-after deep links - that's the problem... As long as you are publishing good, relevant and, above all, unique content, you should be fine.
Are the 5K old pages also going to stay online, or do they have a "new" home in the new version? How you choose to handle this may well have an effect on your rankings - that is, if you are tinkering with pages and URLs that are already indexed and ranking.
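If the old pages are getting a "new" home, 301 redirects from the old URLs are the usual way to carry their indexed value across. A minimal sketch of generating Apache `Redirect` rules from an old-to-new mapping - the paths and filenames here are purely illustrative, not from the site under discussion:

```python
import csv
import io

def build_redirects(mapping_csv: str) -> str:
    """Turn 'old_path,new_path' rows into Apache 301 Redirect rules,
    one per line, suitable for an .htaccess or vhost config."""
    rules = []
    for old, new in csv.reader(io.StringIO(mapping_csv)):
        rules.append(f"Redirect 301 {old} {new}")
    return "\n".join(rules)

# Hypothetical mapping: old directory URLs -> their new home
mapping = (
    "/old-dir/widget-1.html,/catalog/widgets/widget-1/\n"
    "/old-dir/widget-2.html,/catalog/widgets/widget-2/"
)
print(build_redirects(mapping))
```

For thousands of moved pages you would feed the same script a full CSV export of the URL mapping rather than an inline string.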
I released 40K pages all at once about a year ago with no penalty.
Google indexed them all in about 3 days.
launch them and pray
you will not get a penalty. Just be advised that these pages probably will not rank very high, since I am betting they do not offer any kind of unique, quality content. What I like to do with scraper/search-result-type scripts that generate a ton of pages is implement AJAX instant edit and hire a content writer to create content for many of the pages.
Two months ago we launched a site with 25k pages. No penalty, indexed under a week.
Let me give a scenario. If you release 10K pages in a day, and those pages are URLs built around keywords - keywords collected from the search queries that already bring you traffic, or from keyword research - and the content on each page is just recycled from your existing content, then I feel there is an issue. Make sense?
If these 10K pages are genuine content pages (original content is even better), then let me tell you: you will be loved by Google and the other search engines.
I thought MC talked about this a couple of years ago and admitted that releasing too many pages too quickly can trigger a penalty. I forget the threshold amount he mentioned...
There was that situation in 2006 where Microsoft migrated millions of pages from spaces.msn.com to spaces.live.com. Millions. Here's a tidbit from Matt that seemed to kick off a "too many new URLs" penalty fear:
|We saw so many urls suddenly showing up on spaces.live.com that it triggered a flag in our system which requires more trust in individual urls in order for them to rank... |
But it's a matter of scale here. Millions of new URLs is pretty far beyond 70,000.
|But it's a matter of scale here. Millions of new URLs is pretty far beyond 70,000. |
tedster - thanks for finding that reference. I knew the number was pretty high but I didn't want to throw out a guess. I spent about ten minutes trying to find the reference myself. I couldn't seem to figure out what term to search for... :(
|The new version we are soon to release has a big directory that has been added to it but I am unsure how best to release this? |
My concern is what these pages are. I've never heard of a "directory" of 65,000 pages created instantly unless it was a Dmoz mirror.
Adding 65,000 pages is really a game of internal structure, PageRank and TrustRank. Of course, if that content is thin, then no rolling release of any kind will save you from getting tanked in the SERPs.
Has the 5,000 pages a day limit from a few years ago been lifted?
I didn't check tedster's reference to know if there's something basic and new I'm missing, but I do remember, back in the day when we had GoogleGuy posting, there was a statement about 5,000 a day being about the max you wanted to add.
70,000+ pages is a lot of pages to add ... my concern would be whether the site has enough PageRank strength to support that many pages effectively.
My opinion would be to add the new pages whenever they're ready, but move cautiously in terms of how you link to them from the rest of the site. You want to give the new sections enough links to make sure the spiders can find them, but you wouldn't want to spread your PageRank too thin too fast.
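One way to implement that kind of cautious, rolling exposure is to publish the new URLs in staged daily sitemap batches rather than all at once. A minimal sketch - the 5,000/day batch size echoes the figure mentioned earlier in this thread, and the URLs and start date are hypothetical, not any official limit:

```python
from datetime import date, timedelta

def staged_sitemaps(urls, per_day=5000, start=date(2009, 1, 1)):
    """Split a large URL list into daily batches so new pages can be
    exposed to crawlers gradually instead of in a single drop.
    Returns a list of (iso_date, batch_of_urls) tuples."""
    batches = []
    for i in range(0, len(urls), per_day):
        day = start + timedelta(days=i // per_day)
        batches.append((day.isoformat(), urls[i:i + per_day]))
    return batches

# 70,000 hypothetical new URLs at ~5K/day -> 14 daily batches
urls = [f"/new-dir/page-{n}" for n in range(70000)]
plan = staged_sitemaps(urls)
print(len(plan))                     # number of daily batches
print(plan[0][0], len(plan[0][1]))   # first batch date and size
```

Each day's batch would then be written out as its own sitemap file and referenced from a sitemap index as it goes live.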
Be sure to support the new content with some new link development.
In June 2008 I released 2 million new pages that were not previously open to spiders (large directory with a PR8 home page). No problems.
I think it is not an issue at all. G is intelligent enough to understand the situation.