Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

How does one start a new large site?

say, a widget directory with 10,000 pages


walkman

8:14 pm on Jan 1, 2008 (gmt 0)



OK, ideally one starts with 10-20 pages and adds to them daily. But suppose I have a database of widgets and want people to rate/comment on them. I will also show related widgets, widget centers near them, etc. Suppose that I will have 10,000 pages to start... all database-driven.

How does one introduce them to Google? I have thought of banning certain categories in robots.txt and letting Google see one a month or so, so it doesn't flip out. Any other ideas? Starting with very few pages makes no sense for my users, so I have to adjust for Google.

Thanks.

tedster

9:51 pm on Jan 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've helped launch several sites of this size or even larger. As you point out, such a site often only "makes sense" when presented as a whole. We always launched allowing Googlebot full access, and we did not have any trouble.

Except, of course, the sandbox-like troubles that any new site faces. To get rolling on Google in an acceptable period of time, it is essential that you move ahead immediately at launch with some solid marketing plans, so you begin to garner a balanced backlink profile over time.

One issue that can sink a new site is launching with duplicate url troubles, so make sure you test your url scheme and ensure that it's solid as a rock. But 10,000 urls of unique content at launch has not caused troubles for the sites I worked with. Google just chewed through the new urls according to its own logic, and I always allowed that to happen naturally, without any overt control.
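One common source of duplicate-URL trouble is the same page resolving under both the www and bare hostnames. A minimal sketch of one fix, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder, not from the thread):

```apache
# Sketch: 301-redirect the bare hostname to www so every page
# has exactly one canonical URL. Place in .htaccess or the vhost config.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The same idea applies to trailing slashes, default index files, and session IDs: pick one form and permanently redirect the rest.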

If your database queries allow for different kinds of "sorting" of the data, or other kinds of AJAX-y bells and whistles, it's good to restrict that type of url from spidering, both at launch and into the future. It can turn 10,000 intended pages into millions of near duplicates - and that can spell trouble.
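For example, a robots.txt sketch that keeps parameterized sort/view URLs out of the index (the parameter names are hypothetical; note that the `*` wildcard in Disallow is a Googlebot extension, not part of the original robots.txt standard):

```
User-agent: Googlebot
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?view=
```

Only the base URL of each widget page stays crawlable; every sorted or re-filtered variant of it is blocked.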

walkman

2:28 am on Jan 2, 2008 (gmt 0)



Thanks tedster.
I will be generous in using /robots.txt. I guess a large number of pages will result in a manual review.

SEOPTI

3:10 am on Jan 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you are using MySQL, use 'GROUP BY' (or SELECT DISTINCT) on your results rather than 'ORDER BY' alone; ORDER BY only sorts, it won't get rid of identical rows.
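To illustrate the difference, here is a small sketch using SQLite from Python (table and column names are made up for the example; MySQL behaves the same way for DISTINCT):

```python
import sqlite3

# Hypothetical widgets table with one duplicate row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE widgets (name TEXT, category TEXT)")
conn.executemany(
    "INSERT INTO widgets VALUES (?, ?)",
    [("alpha", "a"), ("alpha", "a"), ("beta", "b")],
)

# A plain SELECT with ORDER BY keeps the duplicate row:
rows = conn.execute(
    "SELECT name, category FROM widgets ORDER BY name"
).fetchall()
print(len(rows))  # 3 rows, duplicate included

# SELECT DISTINCT (or GROUP BY every selected column) collapses it:
deduped = conn.execute(
    "SELECT DISTINCT name, category FROM widgets ORDER BY name"
).fetchall()
print(len(deduped))  # 2 rows
```

You can still combine DISTINCT with ORDER BY, so deduplication doesn't cost you the sorting.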

Whitey

8:46 am on Jan 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've seen two new sites launched concurrently on separate regional SERPs. Both were affiliates and shared content with a provider. They were both around 2,500 pages.

One launched and ranked in a couple of months, both on regional and global search. The other is still "sandboxed" on both. The only difference that I can see is that one had a few established links that were 301-redirected from a pre-existing site.

From this inconclusive report, all I can suggest is that established links can play a part in Google extending trust to a new site launch and could potentially accelerate the time to produce results; also, different regional SERPs may have less aggressive filtering than others.

[edited by: Whitey at 8:49 am (utc) on Jan. 7, 2008]