

Google SEO News and Discussion Forum

How best to release 70,000 pages?

 2:00 pm on Mar 25, 2010 (gmt 0)

Hi folks!

We run a job site that currently has around 5,000 pages. The new version we are about to release adds a large directory, but I am unsure how best to release it. Should we publish around 2,000 pages per week, or per month, or all 70,000 pages in one go?

What I don't want is for Google or any other search engine to suddenly find all of these new pages and penalise us for it.

Can anyone advise on this please?

Many thanks,




 3:26 pm on Mar 25, 2010 (gmt 0)

A site I work with adds a new sub-category every once in a while. That usually amounts to 10K-50K pages in one go.

Releasing the pages at the same time has never been a problem, even a few years back when the site was very small and had few links incoming.

Getting those fancy and sought-after deep links, that's the problem... As long as you are publishing good, relevant and above all unique content you should be fine.

Are the 5K old pages also going to stay online, or do they have a "new" home in the new version? How you choose to handle this may well have an effect on your rankings - that is, if you are tinkering with pages and URLs that are already indexed and ranking.


 3:38 pm on Mar 25, 2010 (gmt 0)

I released 40K pages all at once about a year ago with no penalty.

Google indexed them all in about 3 days.


 4:06 pm on Mar 25, 2010 (gmt 0)

launch them and pray


 5:12 pm on Mar 25, 2010 (gmt 0)

You will not get a penalty. Just be advised that these pages probably will not rank very high, since I am betting they do not offer any kind of unique, quality content. What I like to do with scraper/search-result-type scripts that generate a ton of pages is implement Ajax instant editing and hire a content writer to create content for many of the pages.


 5:22 pm on Mar 25, 2010 (gmt 0)

Two months ago we launched a site with 25k pages. No penalty, and it was indexed in under a week.


 1:44 pm on Mar 26, 2010 (gmt 0)

Let me give a scenario. If you are releasing 10K pages in a day, and these pages are URLs built around keywords - keywords collected from the search queries you already get traffic for, or from keyword research - with the content on each page simply pulled from your existing content, then I feel there is an issue. Make sense?

If these 10K pages are pure content pages (original is even better), then let me tell you... you will be loved by Google and the other search engines.


 1:57 am on Mar 27, 2010 (gmt 0)

I thought MC talked about this a couple of years ago and admitted that releasing too many pages too quickly can trigger a penalty. I forget the threshold amount he mentioned...


 2:25 am on Mar 27, 2010 (gmt 0)

There was that situation in 2006 where Microsoft migrated millions of pages from spaces.msn.com to spaces.live.com. Millions. Here's a tidbit from Matt that seemed to kick off a "too many new URLs" penalty fear:

We saw so many urls suddenly showing up on spaces.live.com that it triggered a flag in our system which requires more trust in individual urls in order for them to rank...


But it's a matter of scale here. Millions of new URLs is pretty far beyond 70,000.


 10:43 am on Mar 27, 2010 (gmt 0)

But it's a matter of scale here. Millions of new URLs is pretty far beyond 70,000.

tedster - thanks for finding that reference. I knew the number was pretty high but I didn't want to throw out a guess. I spent about ten minutes trying to find the reference myself. I couldn't seem to figure out what term to search for... :(


 11:35 am on Mar 27, 2010 (gmt 0)

The new version we are soon to release has a big directory that has been added to it but I am unsure how best to release this?

My concern is what these pages are. I've never heard of a "directory" of 65,000 pages created instantly unless it was a Dmoz mirror.

Adding 65,000 pages is really a game of internal structure, PageRank and TrustRank. Of course, if that stuff is thin, then no rolling release of any kind will save you from getting tanked in the SERPs.


 5:31 pm on Mar 27, 2010 (gmt 0)

Has the 5,000 pages a day limit from a few years ago been lifted?

I didn't check tedster's reference, so there may be something basic and new I'm missing, but I do remember 'back-in-the-day', when we had GoogleGuy posting here, there was a statement about 5,000 a day being about the max you wanted to add.


 6:22 pm on Mar 27, 2010 (gmt 0)

70,000+ pages is a lot of pages to add ... my concern would be whether the site has enough PageRank strength to support that many pages effectively.

My opinion would be to add the new pages whenever they're ready, but move cautiously in terms of how you link to them from the rest of the site. You want to give the new sections enough links to make sure the spiders can find them, but you wouldn't want to spread your PageRank too thin too fast.

Be sure to support the new content with some new link development.
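If you do want a staged rollout along the lines the original poster suggests (roughly 2,000 pages at a time), one practical way to control discovery is to expose the new URLs in sitemap batches and add each batch to your sitemap index on whatever schedule you choose. A minimal sketch in Python - the URL pattern, file names, and batch size here are hypothetical examples, not from the thread:

```python
# Split a large URL list into sitemap files of at most 2,000 URLs each,
# so new sections can be exposed to crawlers in weekly batches.
# The domain, URL pattern, and file naming below are hypothetical.
from xml.sax.saxutils import escape

def build_sitemaps(urls, batch_size=2000):
    """Return a list of (filename, xml_string) sitemap batches."""
    sitemaps = []
    for i in range(0, len(urls), batch_size):
        batch = urls[i:i + batch_size]
        entries = "\n".join(
            "  <url><loc>%s</loc></url>" % escape(u) for u in batch
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            "%s\n</urlset>" % entries
        )
        sitemaps.append(("sitemap-%03d.xml" % (i // batch_size + 1), xml))
    return sitemaps

# Example: 70,000 job pages become 35 sitemap files of 2,000 URLs each.
urls = ["https://www.example.com/jobs/%d" % n for n in range(70000)]
batches = build_sitemaps(urls)
print(len(batches))  # 35
```

Publishing one batch file per week (and keeping earlier batches live) gives crawlers a gradual discovery path without hiding any page that's already linked internally.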


 5:49 am on Apr 1, 2010 (gmt 0)

In June 2008 I released 2 million new pages that were not previously open to spiders (large directory with a PR8 home page). No problems.


 6:29 am on Apr 1, 2010 (gmt 0)

I think it is not an issue at all. G is intelligent enough to understand the situation.
