Does it matter to G how much new content you add at one time, like it does with backlinks, or does content not work the same way? We've been very careful about how much we add day to day, just in case too much all at once is a problem.
The site is a few years old and about 95% of its results are in the supplemental index (or were, back when we could see them listed as such). We think most of this had to do with the original shopping cart script, which by default generated duplicate Titles and META Descriptions throughout until we caught it. We have been slowly working our way out of the hole by manually changing all the duplicates to unique Titles and METAs. Using the site: /* operator shows a few of the previous supplemental pages slowly coming back to the main index. We don't want to do anything to jeopardize this, hence the above questions.
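For anyone digging out of a similar hole, here is a rough sketch of the kind of script that could find those duplicates automatically instead of by hand. It's only a sketch: urls.txt is a made-up file name for a plain list of your page URLs, and it uses Python with the requests and beautifulsoup4 libraries.

    # Rough duplicate finder: reads one URL per line from urls.txt (a
    # hypothetical file) and groups pages that share a <title> or a meta
    # description. Needs: pip install requests beautifulsoup4
    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    titles = defaultdict(list)        # title text  -> [urls]
    descriptions = defaultdict(list)  # description -> [urls]

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that don't respond
        soup = BeautifulSoup(html, "html.parser")
        if soup.title and soup.title.string:
            titles[soup.title.string.strip()].append(url)
        meta = soup.find("meta", attrs={"name": "description"})
        if meta and meta.get("content"):
            descriptions[meta["content"].strip()].append(url)

    for label, groups in (("TITLE", titles), ("DESCRIPTION", descriptions)):
        for text, pages in groups.items():
            if len(pages) > 1:  # shared by more than one page = duplicate
                print(f"DUPLICATE {label}: {text!r}")
                for page in pages:
                    print("   ", page)

Anything shared by more than one URL is a candidate for a rewrite, which beats eyeballing a few thousand product pages one at a time.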
Thanks.
"So this is not something that a typical site owner needs to think about or worry about if they're not adding hundreds of thousands or millions of URLs very quickly." [seroundtable.com...]
So according to this, adding hundreds of new URLs at once is not a problem - and especially not for ecommerce sites that routinely add new product lines.
We are currently redoing our site and making it more complete. So far we have been presenting widgets in 500 locations in one language.
The new version will present widgets in 7000 locations and in 10 languages.
So the site will get roughly 140 times larger (7000 locations x 10 languages = 70,000 location pages, versus 500 today).
Apart from the widgets, during the last 15 months we have developed several thousand pieces of pretty cool content that should be useful to our visitors and bring us some more traffic.
Although I am pretty convinced the SEs will be happy to eat our articles, the concern is the widgets section, which will grow enormously.
What do you all think about that?
In other words, Matt Cutts was wrong. I looked up what he said when I was doing it; it was in one of his video blogs, where he was answering a question and said "I wouldn't add more than, say, 10,000 pages at a time" (paraphrased). I don't think he was speaking from actual knowledge of how Googlebot has been coded to respond when it sees a lot of new pages - just giving his guess about best practices. (And he guessed wrong.)
I should probably add that the site I added the pages to had a clean history and had been around since 1999 - that might have helped.
Yahoo! is more conservative than Google about picking up new pages. At the last SES NY, one of the guys from Yahoo! said they allocate each site a maximum number of pages that site can have in the Yahoo! index, and that number goes up very slowly. So far they have indexed 22,000 of our 160,000 pages after about 7 months.
Hope that helps...
But -- this has happened to me a couple of times -- my site had only 200 pages and I added about 50, and the drop in traffic from the original 200 was obvious. The gain I made from the other 50 hardly dented it.
But after a month or two it all went back to normal.
That new feature introduced about 900,000 URLs at one time to a domain that already had about 1,200,000 URLs indexed. They did have ranking problems for a while because of that URL flood, so there is a limit to what Google will digest at one time without going "burp". I agree that it's not a limit the ordinary site needs to worry about, unless they are playing auto-generated database games.
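If you do have a big batch to release and want to avoid the "burp", one low-tech way to stage the rollout yourself is to split the new URLs into numbered sitemap files and publish them a chunk at a time. A minimal sketch, assuming a plain new_urls.txt list; the file names and the 5,000-per-batch size are my own invention, not anything Google has published:

    # Split a big list of new URLs into numbered sitemap files so they can
    # be published a batch at a time. new_urls.txt and BATCH_SIZE are
    # assumptions, not published guidance.
    from xml.sax.saxutils import escape

    BATCH_SIZE = 5000

    with open("new_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for i in range(0, len(urls), BATCH_SIZE):
        batch = urls[i:i + BATCH_SIZE]
        name = f"sitemap-batch-{i // BATCH_SIZE + 1}.xml"
        with open(name, "w") as out:
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in batch:
                out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            out.write("</urlset>\n")
        print(f"wrote {name} ({len(batch)} urls)")

Publish one file a week, or whatever pace you're comfortable with, and you control the flood instead of dumping 900,000 URLs on the crawler in one go.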
The point is that adding lots of pages doesn't automatically make you look spammy. I was actually pretty impressed that Google could tell the difference and their spam filters weren't so simplistic as to say lots of new pages = spam.