add several thousand pages of content to my site + not spam = oxymoron.
Well, I don't consider it spam. I'll explain a bit more:
I have a research-oriented site, and what I am working on is a searchable database so my visitors are able to get further professional references (many people find my site because they are looking for references; I just want to offer more).
Of course I realize this is available on some other sites, but since I am already getting people coming to my site, I'd like them to stay for a while ;)
mbush27, you ask good questions, and of course it's possible to add large numbers of valuable pages that are not spam.
FWIW, between our own experience and what I've been told and/or seen from others, there's less of a problem with adding pages per se than with triggering dup filters and related filters.
We've also repeatedly seen instances of sites changing their templates or their filenames and being sandboxed as a result.
Personally, I would:
- Keep the site template intact when adding lots of new pages (in other words, don't launch a redesign during this time)
- Phase in the addition of the new pages, if at all possible.
Phasing in the pages not only looks more natural to the SE's, it provides added reason to keep them 'seeing' the site as vibrant and active, which is always a good thing WRT the bots and their visiting patterns.
Ugh! The sandbox nasty. I have had this problem myself with adding many painstakingly produced pages at one time. It is hard to know what to do tbh.
I am now trying to do things slowly ... new data coming in is produced in small batches. I find 10 pages a week is all that can be managed without seriously affecting SERPs. With updates to existing products and new data on existing products (not new pages) I still have problems.
The truth of the matter, in my mind, seems to be: keep the information static and you will be fine. This is a pity, as prices of products change and reviews of products are constantly being produced. Updating and producing new content seems not to be what the spiders like.
So tell me ... if a spider can't be bothered with it, does it just sandbox it so it has less work to do? Sheesh, and I thought some people were lazy :P
If you phase in the pages, as Caveman suggested, you might want to announce the coming pages/sections to pique your readers' interest and give them a reason to come back again later.
"Coming next week/month:....."
I was able to eventually find this thread:
A real mixed bag. Some of the comments make me afraid to add more than 10 pages a week. What in the world! That is ridiculous.
Well, as I mentioned my site is around 700 pages. I uploaded an additional 110 yesterday. We'll see what happens. I'll let you know!
A month ago I added an extra 1,000 pages to a site that had only 25 pages before. A week later, traffic had quadrupled. Google certainly didn't penalise me for it!
Somehow it just makes no sense to me that new real content would be penalized. Especially since the SE's seem to prefer content that is updated fairly often.
Yes, I completely agree. What about sites that make a change that is sitewide? Changing 700 pages at once, would that push you into the sandbox?
It sounds so stupid to me, but I know that things don't always make sense.
As I mentioned, I have several thousand pages I'd love to add, even have 1,000 all ready to go. But honestly I got a bit spooked by some of the comments in the thread. I worked so hard, and waited so long to get to where I am today. I really don't want to lose that.
I appreciate everyone's thoughts/experiences
I don't know what the threshold is (or even if there is one), but I add 10-15 new pages every day with no ill effects.
I just added a URL rewrite a week ago to my ASP storefront's dynamic pages, so now my 750 product pages are getting gobbled up and indexed by Google.
Previously, the pages might have been indexed, but would be assigned no descriptions in the G SERP's and ranked poorly. I have seen a huge spike in traffic since applying the rewrite last weekend.
The site has been sandboxed or something since Dec.17, and with this change seems to have come out of it.
If I get dumped back into a sandbox for this I will let y'all know.
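For anyone wondering what that kind of rewrite looks like: on IIS with ASP, one common option of that era was ISAPI_Rewrite, configured in an httpd.ini file. The paths, filenames, and parameter name below are made up for illustration, not the poster's actual setup:

```apache
[ISAPI_Rewrite]
# Serve static-looking product URLs from the real dynamic ASP page.
# e.g.  /products/123.htm  is handled by  /product.asp?id=123
RewriteRule /products/(\d+)\.htm /product.asp\?id=$1
```

The idea is that the spider only ever sees the clean `.htm` URLs, so each product page gets indexed as a distinct static-looking page instead of a query-string URL.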
My experience is the same as mrMister's, one site added about 2500 pages, 10 times the original number, site was ranking for those terms within a few weeks. Sorry, beedee, in this case not spam, just a new section. Another site we've added 60k pages, also ranks highly for its terms, competitive terms. Both sites are in highly competitive industries.
The main issue I saw with the 2500 pages was the length of time it took to get them indexed. However, as caveman notes, there appears to be a statistically relevant probability that your results will vary. Personally, I'd like to know what the exact changes that users who added large numbers of pages and then had their site drop from the rankings made, in terms of everything that was done to the site. I haven't yet heard a convincing argument that the same algo component is at work here as is in what we term the sandbox, although I also am not saying it's not possible.
My suspicion is that what is happening is caused by some other penalty trigger, which creates what appears to be a sandboxing.
My main page is a strong pr5, and from that page, I link to 8 other main categories. Now I could see if someone were to add another 100 links to that main page how the pr would get sucked out rather quickly. Maybe this is happening to people?
I added my new section by creating one link from the main page to it. From there the page further breaks down (doesn't get too deep). I wanted to make sure it was two steps down from my main page, so I wouldn't lose much pr.
Of course I'm always pursuing links, to help balance things out.
You guys are encouraging me. I may add a bit more to my site.
We add 20+ pages and modify probably close to a hundred every single day and haven't been penalized once in the 5 years we've been up and running. I do see how you could be concerned though. In fact, we're actually looking at adding probably 3k pages of new content all in one big tweak (also not spam incidentally) and I'm not too worried about it.
You could always add all of the content and then robots.txt it out to the spiders over time. That way the users have full access but you don't run the risk of incurring a sandbox-like penalty.
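One way to sketch that approach: put the new content in its own directories, disallow most of them at launch, then delete a Disallow line every week or two. Directory names here are hypothetical:

```apache
# robots.txt at launch: users can browse everything,
# but spiders only see the first batch of new pages
User-agent: *
Disallow: /newsection/batch2/
Disallow: /newsection/batch3/

# Later, remove one Disallow line per week to phase
# the remaining batches into the index gradually
```

Worth noting that robots.txt only controls crawling, not user access, which is exactly why this works for the scenario described above.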
Our company regularly takes over existing, old, sites. First thing we do is completely change the linking structure and site layout. First thing we notice is a significant increase in spidering within the first 2-3 weeks and pretty significant increases in traffic. Our next phase is to pump up the site with hundreds of pages of hand-written content (which we pay a pretty penny for) and put that up. Last phase is to plug in other content databases that are updated on a regular basis (which overnight can quadruple the number of pages on the site).
Net result: in years of doing this, not one single problem with the serps. We have only been penalized once (Florida update) and then it was only one site (and we're still trying to figure out *why* we were penalized -- there was no black hat anywhere on the site).
We're about to re-launch a very high profile site, one I'd bet 30-50% of adult Americans have seen at some point or another. With this, we're introducing a new navigation system, changing all url's, introducing new content and even re-writing existing content. As usual, and key to success, we'll be using 301's to redirect a few thousand existing pages of content to the new pages and as usual, expect zero problems.
I hear all these nightmare stories on this board and others -- and I'm always wondering if its the industry or what, because we just never have those issues.
The only problem I have seen with adding pages in volume is that if you are not very careful, mistakes can occur in volume. Make sure ALL pages are valid code and will not trip dup/spam or other filters. When you add so many pages it is easy to miss something. If you add 1,000s of pages with "errors" they will be spidered within the hour. If you upload the same pages without errors they may take weeks to be spidered. LOL
I am in the process of updating my main site and have employed someone to rewrite pages that were added when the site was first launched, about 18 months ago.
The reason for this? These content pages were taken from another site that offered free articles, and recently they showed no title when using the Google site: mydomain.com command. When these pages were dropped from the index we saw a major drop in traffic.
However, in the last week these pages have been cached again, showing titles etc., and we have seen a major recovery back to their original positions in the SERPs. That said, we don't receive a lot of entry traffic from these pages; they are purely on-topic with the site.
My site has approx 200 pages, and the total being rewritten is 50 (25% of my site). Do you feel changing the pages all at once is a good idea, or would it be better to phase these in also?
|Our company regularly takes over existing, old, sites. First thing we do is completely change the linking structure and site layout |
Do you then remove all the old content and replace it with useless tat from your main site?
If so... you work for Yahoo! and I claim my five pounds ;-)
I think it depends on the mass of content which was on the site before you start adding new content.
For example, I added about 7,000 pages to a website which had about 300 before. The site is no longer shown anywhere, so I think that was too much.
On the other hand I added 2000 pages to a site consisting of about 25000 pages and it is still ranking very well.
BTW: Happy Easter!
We have just changed the major part of our site from php to html and we used the same urls, titles and text.
The result was losing a top 5 position on a 30 million rated keyword which we had for nearly 2 years (now no 94) and most of the site disappearing.
I do expect it to come back
I would say we changed about 2000 pages over 2 months.
"We have just changed the major part of our site from php to html and we used the same urls, titles and text. "
Ack..! There are a bunch of ways you can do this and avoid losing your ranking. Permanent Redirects in the header, URL rewrite, etc.
You do not have to lose your position and if you act quickly you can probably maintain your old SERP's.
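For the record, on Apache a permanent redirect for a .php to .html move can be done in .htaccess with a couple of lines like these (a sketch only; test on a copy of the site first):

```apache
# 301-redirect every old .php URL to its .html equivalent,
# so engines transfer the old pages' standing to the new URLs
RewriteEngine On
RewriteRule ^(.*)\.php$ /$1.html [R=301,L]
```

The R=301 flag is the important part: it tells the spider the move is permanent rather than leaving it to guess.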
There is nothing to rewrite, I never changed my URL's
I used .htaccess so my pages were '.htm' and not '.php' unless I am missing something. Explain more...
Then I would rule out your change as the cause of the drop. Gbot is not psychic and cannot know you have done a rewrite. Some other factor must be involved, or you are simply another victim of the random axe of Google.
|"We have just changed the major part of our site from php to html and we used the same urls, titles and text. " |
If you changed the extension from .php to .html then you DID change the URLS. All of them.
|add several thousand pages of content to my site + not spam = oxymoron. |
I can't stand overly spam-zealous know-it-alls making comments like that; they're only harmful for us real SEOs and webmasters providing real content to the net.
To answer the original question, I would say it all depends on several factors. The first is the age of the site: if you have a new site I highly suggest adding the pages SLOOOWLY, possibly at a rate of 1k-3k per month. However, if you have an aged site (3 years or so) with thousands of pages already indexed, you have a lot more leg room to add bulk to your site and to get it properly indexed.

I think Google may have a predetermined ratio of current pages indexed vs. the number of new pages you wish to be indexed. In other words, if you already have 10k pages indexed and Google allows you to grow your site at 10% each, let's say, month, then adding 1k pages would not trip their spam filter. It's just a matter of determining what that threshold is and not venturing over that limit (and getting flagged for spam).

We have well over 150,000 pages indexed so far and could easily add 25,000 pages or so a month without fear of a penalty (at least that has been our experience thus far). However, we have also built brand new sites, immediately added 25,000 pages, and seen them dropped and immediately penalized from the index. Keep this philosophy in mind when adding pages, and ignore those small guys who consider a site larger than 100 pages to be spam... go by Google's own demonstrated definition of spam.
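To be clear, the ratio idea above is pure speculation, but purely as an illustration of the guess, the "safe batch size" reasoning works out like this (the 10% figure is made up):

```python
# Illustrative only: IF the index capped monthly growth at some
# ratio of pages already indexed (a guess, not a documented rule),
# a "safe" batch size could be estimated from the current count.

def safe_monthly_batch(pages_indexed, growth_ratio=0.10):
    """Return a guessed 'safe' number of new pages per month."""
    return int(pages_indexed * growth_ratio)

# A 10k-page site under a guessed 10% cap: 1,000 new pages a month
print(safe_monthly_batch(10000))
# A 150k-page site could add 15,000 a month under the same guess
print(safe_monthly_batch(150000))
```

Which is consistent with the pattern described: big established sites shrugging off large additions while brand-new sites get hammered for the same batch.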
The BBC has 2,670,000 pages listed and its good stuff. They have a lot of editorial writers. I agree that when I hear about one person and his dog generating 1000's of pages it must be suspect, but some topics and some companies can generate good original content so we should not pre judge.
|I cant stand overly spam zealous know-it-alls making comments like that, only harmful for us real seo's and webmasters providing real content to the net. |
Real content? Gimme a break.
So how many writers would you employ to create 25,000 pages of "real" content per month? I am far from being a "know it all". That's why I frequent this forum, but I can do arithmetic.
It takes me at least a couple of hours to produce a page of real content. Let's say that since you are a "real" webmaster you can do it in 15 minutes. That's 6,250 hours, or 781 eight-hour days, which equates to more than two years of full-time writing.
You can add 25,000 pages "of real content" every month and you accuse me of being a know it all?
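The back-of-the-envelope math above checks out (the per-page rate is of course an assumption):

```python
# Sanity check on the content-production arithmetic
pages_per_month = 25000
hours_per_page = 0.25          # a generous 15 minutes per page

total_hours = pages_per_month * hours_per_page
workdays = total_hours / 8     # eight-hour shifts

print(total_hours)  # 6250.0 hours of writing
print(workdays)     # 781.25 working days, i.e. years of work
```

And that is for a single person; scale the rate up and you are still talking about a sizeable team of full-time writers every month.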
If I were Google I would index no more than 100 pages on any website. Who needs any more than this to tell people what their site is about?
You could be a firm that has thousands of documents, and decides to publish them all on the net for the first time. Papers on 'thermodynamics of widget lubrication' or something. It's important to someone out there.
No one can debate it takes time to create content, however the world is awash in documents, not necessarily all of them have been published online before.
It's not "what sites are about"; that's for beginner mom-and-pop sites. The real power of the internet (and Google) is the ability to index massive amounts of INFORMATION. If happy spam crusaders like yourself wrote a dictionary it would likely be the size of a magazine. Honestly, I don't care enough to entertain this discussion further, as I've made a real attempt to answer the question posed rather than making a whimsical ethical appeal.