Forum Moderators: Robert Charlton & goodroi
I run a 700-page site with a great deal of content written by me. It is a niche site and doing fairly well. It is currently a PR5 with about 1,300 backlinks from other sites. It is about one year old.
Recently I finally came out of Google's "sandbox". I moved from #150 to #4 for my top targeted keyword.
I have found a way to add several thousand pages of content to my site. The content makes logical sense, and is not spam.
I have read on here before (and of course I can't find it now) that there is worry that adding too many pages at once can get you penalized. Can anyone share their thoughts on this?
I am looking at around 15,000 pages here. Should I break them up into segments as I add them? I really, really do not want to visit that stupid sandbox again.
Thanks!
Matt
Well, I don't consider it spam. I'll explain a bit more:
I have a research-oriented site, and what I am working on is a researchable database so my visitors are able to get further professional references (many people find my site because they are looking for references; I just want to offer more).
Of course I realize this is available on some other sites, but since I am already getting people coming to my site, I'd like them to stay for a while ;)
Matt
FWIW, between our own experience and what I've been told and/or seen from others, there's less of a problem with adding pages per se than with triggering dup filters and related filters.
We've also repeatedly seen instances of sites changing their templates or their filenames and being sandboxed as a result.
Personally, I would:
- Keep the site template intact when adding lots of new pages (in other words, don't launch a new redesign during this time)
- Phase in the addition of the new pages, if at all possible.
Phasing in the pages not only looks more natural to the SE's, it provides added reason to keep them 'seeing' the site as vibrant and active, which is always a good thing WRT the bots and their visiting patterns.
Make sense?
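To make the phasing concrete, here is a minimal sketch (Python, purely for illustration; the batch size and page names are made-up assumptions, not anything the engines publish):

```python
# Split a large set of new pages into fixed-size batches so they can
# be uploaded gradually (e.g., one batch per week) instead of all at once.

def batches(pages, per_batch):
    """Yield successive lists of at most `per_batch` pages."""
    for start in range(0, len(pages), per_batch):
        yield pages[start:start + per_batch]

# Example: 15,000 new pages phased in at 500 per week = 30 weekly uploads.
new_pages = ["page-%d.html" % i for i in range(15000)]
schedule = list(batches(new_pages, 500))
print(len(schedule))  # → 30
```

The per-batch number is the knob: smaller batches look more gradual to the bots, at the cost of a longer rollout.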
I am now trying to do things slowly ... new data coming in is produced in small batches. I find 10 pages a week is all that can be managed without seriously affecting the SERPs. With updates to given products and new data on given products (not new pages), I still have problems.
The truth of the matter, in my mind, seems to be: keep the information static and you will be fine. This is a pity, as prices of products change and reviews of products are constantly being produced. Updating and producing new content seems not to be what the spiders like.
So tell me .. if a spider can't be bothered with it .. does it just sandbox it so it has less work to do? Sheesh, and I thought some people were lazy :P
[webmasterworld.com...]
A real mixed bag. Some of the comments make me afraid to add more than 10 pages a week. What in the world! That is ridiculous.
Well, as I mentioned my site is around 700 pages. I uploaded an additional 110 yesterday. We'll see what happens. I'll let you know!
Matt
It sounds so stupid to me, but I know that things don't always make sense.
As I mentioned, I have several thousand pages I'd love to add, even have 1,000 all ready to go. But honestly I got a bit spooked by some of the comments in the thread. I worked so hard, and waited so long to get to where I am today. I really don't want to lose that.
I appreciate everyone's thoughts/experiences
Matt
Before: www.domain.com/default.asp?pg=products&specific=shdsa9d
Now: www.domain.com/default~pg~products~specific~shdsa9d.asp
Previously, the pages might have been indexed, but would be assigned no descriptions in the Google SERPs and ranked poorly. I have seen a huge spike in traffic since applying the rewrite last weekend.
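For what it's worth, the query-string-to-filename mapping above can be expressed as a small function (a sketch in Python for illustration only; the live site presumably does this with a server-side rewrite rule, and the tilde convention is taken from the example URLs):

```python
def to_static(url):
    """Turn default.asp?pg=products&specific=x into default~pg~products~specific~x.asp."""
    path, _, query = url.partition("?")
    if not query:
        return url  # nothing to rewrite
    base = path[:-4] if path.endswith(".asp") else path
    parts = [base]
    for pair in query.split("&"):
        key, _, value = pair.partition("=")
        parts.extend([key, value])
    return "~".join(parts) + ".asp"

print(to_static("default.asp?pg=products&specific=shdsa9d"))
# → default~pg~products~specific~shdsa9d.asp
```

The point of the rewrite is that the spider sees one static-looking filename instead of a parameterized URL.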
The site has been sandboxed or something since Dec.17, and with this change seems to have come out of it.
If I get dumped back into a sandbox for this I will let y'all know.
The main issue I saw with the 2,500 pages was the length of time it took to get them indexed. However, as caveman notes, there appears to be a statistically relevant probability that your results will vary. Personally, I'd like to know exactly what changes were made by the users who added large numbers of pages and then had their sites drop from the rankings, in terms of everything that was done to the site. I haven't yet heard a convincing argument that the same algo component is at work here as in what we term the sandbox, although I'm also not saying it's impossible.
My suspicion is that what is happening is caused by some other penalty trigger, which creates what appears to be a sandboxing.
I added my new section by creating one link from the main page to it. From there the page further breaks down (doesn't get too deep). I wanted to make sure it was two steps down from my main page, so I wouldn't lose much pr.
Of course I'm always pursuing links, to help balance things out.
You guys are encouraging me. I may add a bit more to my site.
Matt
You could always add all of the content and then robots.txt it out to the spiders over time. That way the users have full access but you don't run the risk of incurring a sandbox-like penalty.
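For example, the new sections could go live behind robots.txt blocks that are lifted in stages (the directory names below are hypothetical, just to show the idea):

```
# robots.txt - phase 1: only batch1 is open to crawlers
User-agent: *
Disallow: /newsection/batch2/
Disallow: /newsection/batch3/
# Remove one Disallow line per phase to open the next batch.
```

Users can browse everything from day one; the spiders only discover one batch per phase.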
AL.
Net result: in years of doing this, not one single problem with the serps. We have only been penalized once (Florida update) and then it was only one site (and we're still trying to figure out *why* we were penalized -- there was no black hat anywhere on the site).
We're about to re-launch a very high profile site, one I'd bet 30-50% of adult Americans have seen at some point or another. With this, we're introducing a new navigation system, changing all URLs, introducing new content, and even rewriting existing content. As usual, and key to success, we'll be using 301s to redirect a few thousand existing pages of content to the new pages, and as usual we expect zero problems.
I hear all these nightmare stories on this board and others -- and I'm always wondering if it's the industry or what, because we just never have those issues.
The reason for this? These content pages were taken from another site that offered the use of free articles, and recently they showed no title when using the google site:mydomain.com command. When these pages were dropped from the index, we saw a major drop in traffic.
However, in the last week these pages are now cached, showing titles etc., and have seen a major move back to their original positions in the SERPs. However, we don't receive a lot of entry traffic from these pages; they are purely on topic with the site.
My site has approx. 200 pages; the total for all these pages is 50 (25% of my site). Do you feel changing the pages all at once is a good idea, or would it be better to phase these in also?
BTW: Happy Easter!
greg
The result was losing a top 5 position on a keyword with 30 million results, which we had held for nearly two years (now #94), and most of the site disappearing.
I do expect it to come back
fingers crossed!
I would say we changed about 2000 pages over 2 months.
Ack..! There are a bunch of ways you can do this and avoid losing your ranking. Permanent Redirects in the header, URL rewrite, etc.
You do not have to lose your position and if you act quickly you can probably maintain your old SERP's.
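A minimal sketch of the redirect idea (Python stands in for the real server here; the mapping table is hypothetical, and on an ASP/IIS site this would live in the page header or the server's rewrite config):

```python
# Old URL -> new URL; every request to an old address answers with a
# 301 (moved permanently) so engines transfer the old page's standing.
REDIRECTS = {
    "/old-products.asp": "/products/",
    "/old-about.asp": "/about/",
}

def handle(path):
    """Return (status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path  # serve the page normally

print(handle("/old-products.asp"))  # → (301, '/products/')
```

The key is that the redirect is a 301 (permanent), not a 302, so the old URL's SERP position carries over to the new one.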
add several thousand pages of content to my site + not spam = oxymoron.
I can't stand overly spam-zealous know-it-alls making comments like that; it's only harmful for us real SEOs and webmasters providing real content to the net.
To answer the original question, I would say it all depends on several factors. The first is the age of the site: if you have a new site, I highly suggest adding the pages SLOOOWLY, possibly at a rate of 1k-3k per month. However, if you have an aged site (3 years or so) with thousands of pages already indexed, you have a lot more leg room to add bulk to your site and get it properly indexed.

I think Google may have a predetermined ratio of current pages indexed vs. the number of new pages you wish to have indexed. In other words, if you already have 10k pages indexed and Google allows you to grow your site at 10% each, let's say, month, then adding 1k pages would not trip their spam filter. It's just a matter of determining what that threshold is and not venturing over that limit (and getting flagged for spam).

We have well over 150,000 pages indexed so far and could easily add 25,000 pages or so a month without fear of a penalty (at least that has been our experience thus far). However, we have also built brand new sites, immediately added 25,000 pages, and seen them dropped and immediately penalized from the index. Keep this philosophy in mind when adding pages, and ignore those small guys who consider a site larger than 100 pages to be spam... go by Google's own demonstrated definition of spam.
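If such a growth-ratio threshold exists, the arithmetic in that post would look like this (a sketch only; the 10% figure is the poster's guess, not a known Google number):

```python
def safe_additions(indexed_pages, growth_ratio=0.10):
    """Hypothetical monthly cap on new pages under a growth-ratio filter."""
    return int(indexed_pages * growth_ratio)

print(safe_additions(10000))   # → 1000 pages/month on a 10k-page site
print(safe_additions(150000))  # → 15000
```

Under that guess, the safe rate scales with the site: the same 25,000-page drop that sinks a new site would be within range for a 150k-page one.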
I can't stand overly spam-zealous know-it-alls making comments like that; it's only harmful for us real SEOs and webmasters providing real content to the net.
Real content? Gimme a break.
So how many writers would you employ to create 25,000 pages of "real" content per month? I am far from being a "know it all". That's why I frequent this forum, but I can do arithmetic.
It takes me at least a couple of hours to produce a page of real content. Let's say that since you are a "real" webmaster you can do it in one quarter of that time (30 minutes). That's 12,500 hours, or on an eight-hour shift, over 1,560 days, which equates to more than four years.
You can add 25,000 pages "of real content" every month and you accuse me of being a know it all?
If I were Google I would index no more than 100 pages on any website. Who needs any more than this to tell people what their site is about?
No one can debate that it takes time to create content; however, the world is awash in documents, and not all of them have necessarily been published online before.