Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

New content - how much is too much too fast?

or does it matter?


StaceyJ

5:33 pm on Nov 26, 2007 (gmt 0)

10+ Year Member



I have an ecom site that gets a few new products added almost daily. Every time we add a product, an additional thumbnail (sub-category) page gets added, plus approximately 20 or so new product pages (since the products come in various styles and each style needs its own product page).

Does it matter to G how much new content you add at one time, like with backlinks, or does content not work the same way? We've been very careful about how much we add day to day, just in case too much all at once is a problem.

The site is a few years old, and about 95% of the results are in the supplemental index (or were, back when we could see them listed as such). We think most of this had to do with the original shopping cart script, which by default had duplicate Titles and META Descriptions throughout until we caught it. We have been slowly working our way out of the hole by manually changing all the duplicates to unique Titles and METAs. Using the site: /* operator shows a few of the previous supplemental pages slowly coming back to the main index. We don't want to do anything to jeopardize this, hence the above questions.
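For anyone digging out of the same hole, hunting the duplicates by hand is the slow part; a small script can flag them in bulk. A minimal sketch (it assumes you have a local mirror of the site's HTML in a `site_mirror` directory - that path and the regex-based parsing are just illustrative, not anything Google requires):

```python
import re
from collections import defaultdict
from pathlib import Path

TITLE_RE = re.compile(r"<title>(.*?)</title>", re.I | re.S)
DESC_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']', re.I | re.S
)

def find_duplicates(pages):
    """Group page paths by their <title> and meta description text,
    keeping only values that are shared by more than one page."""
    titles, descs = defaultdict(list), defaultdict(list)
    for path, html in pages:
        m = TITLE_RE.search(html)
        if m:
            titles[m.group(1).strip()].append(path)
        m = DESC_RE.search(html)
        if m:
            descs[m.group(1).strip()].append(path)
    dup_titles = {t: p for t, p in titles.items() if len(p) > 1}
    dup_descs = {d: p for d, p in descs.items() if len(p) > 1}
    return dup_titles, dup_descs

if __name__ == "__main__":
    # Hypothetical local mirror of the site; adjust the path to suit
    pages = [(str(f), f.read_text(errors="ignore"))
             for f in Path("site_mirror").rglob("*.html")]
    dup_titles, dup_descs = find_duplicates(pages)
    for title, paths in dup_titles.items():
        print(f"Duplicate title {title!r} on {len(paths)} pages")
```

Running it against the mirror gives you a worklist of pages to retitle, instead of eyeballing every product page one by one.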

Thanks.

AlchemyV

5:53 pm on Nov 26, 2007 (gmt 0)

10+ Year Member



I would say it's better not to push too many pages up at once, as it would not look natural. Some may have different experiences, but hundreds of pages out of nowhere could look weird to a search engine, like the work of deliberate SEO, and cause you issues. If anyone can add a more technical explanation, I'd welcome that.

BillyS

5:57 pm on Nov 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Matt Cutts has discussed this exact topic on his website. You can trip an alarm by adding too much content too fast (though "too much" is always relative when you've got billions of pages in your index). Matt explains this concept in more detail.

tedster

6:15 pm on Nov 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



To get some perspective on the scale of new urls that Matt was talking about:

"So this is not something that a typical site owner needs to think about or worry about if they're not adding hundreds of thousands or millions of URLs very quickly."

[seroundtable.com...]

So according to this, hundreds of new urls at once is not a problem - and especially not for ecommerce sites that routinely add new product lines.

StaceyJ

9:22 pm on Nov 26, 2007 (gmt 0)

10+ Year Member



Thanks for the info and the link (and the links within :) ). Good reading and info.

idolw

9:23 pm on Nov 26, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How about a situation like this:

We are currently redoing our site and making it more complete. So far we have been presenting widgets in 500 locations in one language.
The new version will present widgets in 7,000 locations and in 10 languages.
So the site will get 140 times larger.

Apart from the widgets, during the last 15 months we have developed several thousand pages of pretty cool content that should be of use to our visitors and bring us some more.

Although I am pretty convinced the SEs will be happy to eat our articles, my concern is the widgets section, which will grow enormously.

What do you all think about that?

ecmedia

2:18 pm on Nov 27, 2007 (gmt 0)

10+ Year Member



I think the trick is to start gradually. For instance, if your website is only 100 pages (and has not changed in months), you can start by adding 2-3 pages daily and wait for G to index them. Then you can increase that number to 5-10, then 20-30, and so on. What would be disastrous is if one fine day you dropped 200 pages onto a 100-page website that has not been touched in years.
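That kind of ramp-up is easy to automate if you queue new pages and publish a growing daily batch. A minimal sketch (the starting batch, doubling rate, and daily cap are arbitrary illustrative numbers, not anything the engines have published):

```python
def ramp_schedule(total_pages, start=3, growth=2.0, cap=50):
    """Split a backlog of new pages into gradually growing daily batches:
    start small, multiply the batch size each day, never exceed `cap`."""
    schedule, remaining, batch = [], total_pages, float(start)
    while remaining > 0:
        todays = min(int(batch), cap, remaining)  # today's publish count
        schedule.append(todays)
        remaining -= todays
        batch *= growth
    return schedule

# e.g. ramp_schedule(200) -> [3, 6, 12, 24, 48, 50, 50, 7]
```

So a 200-page backlog gets spread over about a week instead of landing all at once, which is roughly the "2-3, then 5-10, then 20-30" pattern described above.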

jay5r

3:41 pm on Nov 27, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



I added over 160,000 pages to a site that had about 150 pages and didn't encounter any problems. Google indexed the pages quickly, and has been consistently sending traffic to the pages ever since (they are long-tail pages).

In other words, Matt Cutts was wrong. I looked up what he said when I was doing it, and it was in one of his video blogs - he was answering a question and said "I wouldn't add more than say 10,000 pages at a time" (paraphrased). Put another way, I don't think he was speaking from actual knowledge of how Googlebot has been coded to respond when it sees a lot of new pages - just what his guess was about best practices. (And he guessed wrong.)

I should probably add that the site I added the pages to had a clean history and had been around since 1999 - that might have helped.

Yahoo! is more conservative than Google on picking up new pages. At the last SES NY one of the guys from Yahoo! said they allocate each site a maximum number of pages the site can have in the Yahoo! index and that number goes up very slowly. So far they have 22,000 of the 160,000 indexed after about 7 months.

Hope that helps...

londrum

8:38 pm on Nov 27, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think when you're adding 160,000 pages, there's no way you're going to notice the drop in traffic from the original 150, as you've increased it by so much.

But --this has happened to me a couple of times-- my site had only 200 pages and I added about 50, and the drop in traffic from the original 200 was obvious. The gain I made from the other 50 hardly dented it.

But after a month or two it all went back to normal.

BillyS

10:33 pm on Nov 27, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"I don't think he was speaking from actual knowledge of how Googlebot..."

Matt's in charge of spam - he knows what he's talking about, since he's helping write the rules.

The OP is worried about tripping a spam filter, not indexing.

tedster

10:51 pm on Nov 27, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree - googlebot's spidering is in a separate compartment from the algos' behind-the-scenes number crunching for ranking purposes. Even totally banned pages still see googlebot coming around.

jay5r

11:52 pm on Nov 27, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



But my point was that I added a LOT of pages and didn't trip the spam filter... Matt Cutts didn't say don't add more than 10,000 or else you'll trip the spam filter. Instead he said it might be best to add fewer than 10,000 at a time. Big difference...

SEOPTI

2:47 am on Nov 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



jay5r, now they will read this and add it for sure :)

tedster

3:00 am on Nov 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have worked with sites that add many thousands of products every season - and they never see an issue. However, one site I know of added a new feature that created new urls for different ways of sorting products on different criteria.

That new feature introduced about 900,000 urls at one time to a domain that already had about 1,200,000 urls indexed. They did have ranking problems for a while because of that url flood, so there is a limit to what Google will digest at one time without going "burp". I agree that it's not a limit that the ordinary site needs to worry about, unless they are playing auto-generated database games.
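One common way to keep that kind of parameter-driven url flood out of the index is to collapse all the sort variations back to a single canonical url (e.g. for a rel="canonical" link element). A minimal sketch (the parameter names are hypothetical - use whatever your cart actually emits for sorting and display options):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical sort/display parameters that only re-order existing products
NON_CANONICAL_PARAMS = {"sort", "order", "view", "per_page"}

def canonical_url(url):
    """Drop sort/display parameters so every sorted variation of a
    category page maps back to one canonical url."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query)
            if k not in NON_CANONICAL_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))
```

That way the 900,000 sort permutations all point at the same handful of category pages, and filter parameters that actually change the product set (like a color facet) are left alone.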

jay5r

3:43 am on Nov 28, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



It's probably worth mentioning that the 160,000 pages I added weren't spam, they were legitimate, informative pages based on data from a relational database that couldn't be added in smaller chunks (everything was interconnected). And the site's clean and relatively long history may have helped as well.

The point is that adding lots of pages doesn't automatically make you look spammy. I was actually pretty impressed that Google could tell the difference and their spam filters weren't so simplistic as to say lots of new pages = spam.

4crests

6:37 pm on Nov 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I had a similar situation to jay5r's. I added 300,000 pages to a 50,000-page retail site. So far all the pages are ranking very well - but it's only been about a month, and I'm hoping they don't start dropping out. Each page did have unique content, and my site has been around since 1994. At least they are in there for the Christmas rush; I'm very happy about that.

Crush

7:25 am on Nov 29, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Old domain, no worries, even dupe content will work. New domain might index but never will rank. It is pretty binary, 1 or 0.