disjointed thinking, ad-hoc development and labyrinthine navigation
A place for everything, everything in its place.
<meta name="robots" content="noarchive">
<meta name="robots" content="noindex">
<meta name="robots" content="noindex, nofollow">
Are there really that many differences when dealing with 1,000 documents or 1,000,000,000?
[edited by: Whitey at 11:31 pm (utc) on Dec 6, 2010]
> <meta name="robots" content="noarchive">
> The above is mandatory for all sites that we do.
Why?
> Are there really that many differences when dealing with 1,000 documents or 1,000,000,000?
p1r, sounds like you've never tried to move a billion documents before.
Trust me, there is a big difference.
How about checking for link-rot on 1,000 pages vs. 1,000,000,000 pages? Most methods that work on a 1,000-page site would take a full year on a billion-page site.
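One common workaround at that scale (not described in the thread, just a sketch) is to estimate the link-rot rate from a random sample of pages instead of crawling all of them. The `check_url` below is a hypothetical stand-in that would issue a real HTTP request in practice; here it is simulated so the math is visible:

```python
import random

def estimate_dead_fraction(check_url, urls, sample_size, seed=0):
    """Estimate the fraction of dead links from a random sample,
    with a ~95% normal-approximation confidence interval."""
    rng = random.Random(seed)
    sample = rng.sample(urls, sample_size)
    dead = sum(1 for u in sample if not check_url(u))
    p = dead / sample_size
    margin = 1.96 * (p * (1 - p) / sample_size) ** 0.5
    return p, margin

# Simulated "site": 1,000,000 URLs, of which the first 5% are dead.
# In real use, check_url would be an HTTP HEAD request with a timeout.
urls = list(range(1_000_000))
dead_set = set(range(50_000))
check_url = lambda u: u not in dead_set

p, margin = estimate_dead_fraction(check_url, urls, sample_size=10_000)
print(f"estimated dead rate: {p:.1%} +/- {margin:.1%}")
```

A 10,000-URL sample gives a margin of error under half a percentage point regardless of whether the site has a thousand pages or a billion, which is the whole point: the cost of the estimate does not grow with site size.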
Analytics only allows ads on 20,000 pages to be tracked. That's only 0.002% of a billion page site.
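The arithmetic in that post checks out; a quick way to verify the coverage percentage (the 20,000-page figure is the poster's, not something I can confirm):

```python
# Coverage of a billion-page site if only 20,000 pages can be tracked.
pages_tracked = 20_000
total_pages = 1_000_000_000
coverage = pages_tracked / total_pages * 100  # as a percentage
print(f"coverage: {coverage:.3f}% of the site")
```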
My guess is that most mega sites (like mine) consist mostly of user-generated content. This introduces a whole host of issues that can easily be monitored on a 1,000-page site, but not so easily on a million-page site, much less a billion-page site.
With mega SEO it's about making small strides over time that, when grouped together, have a really big impact.
You're the first person to mention 1000000000 in this thread, pageone.
Have you dealt with a site that big, or were you just typing in lots of zeroes?
My experience thus far is that a huge number of indexed URLs, without serious link power and search-engine-recognised authority, = really poor rankings
The index page contains... links only to these specific departments.