
Google SEO News and Discussion Forum

    
Can Adding a Large Number of Pages Make Your Rankings Suffer?
aok88




msg:3658821
 10:34 pm on May 25, 2008 (gmt 0)

We've been launching numerous sites lately, and a few of them had lots of pages that we added all at the same time. For one site, for example, we had a domain that we decided to expand from a few hundred pages to 12,000 new pages, and we added them all at once.

I'm fairly sure that Google likes to see links and pages added in a natural way, which would mean spread out. So posting 12k new pages to a 200-page site probably doesn't look natural to Google, and rankings may take a while to come or simply suffer for a while. I am waiting to see, since we just did this last week.

But I was wondering if anyone else has had experience with this. Has anyone done something similar and been loved by Google out of the gate? Or did anyone who added lots of pages find that their rankings for the new pages suffered for a long time?

 

SEOPTI




msg:3658880
 1:02 am on May 26, 2008 (gmt 0)

Rankings may suffer due to the large number of phrases causing a -950 penalty (co-occurrence), but I'm quite sure rankings will not suffer due to the number of URLs.

[edited by: SEOPTI at 1:03 am (utc) on May 26, 2008]

tedster




msg:3658886
 1:30 am on May 26, 2008 (gmt 0)

I'm fairly sure that Google likes to see links and pages added in a natural way, which would mean spread out.

Yes, that's one kind of "natural". But it's also a common practice to develop new site areas in a test environment and launch the new area all at once.

Depending on Google's previous spidering priorities for a domain, going from 200 pages to 12K in one jump could mean relatively slow spidering, too. Google's crawl team has its own logic for how to allocate spidering resources.

kidder




msg:3658901
 2:28 am on May 26, 2008 (gmt 0)

Interesting topic, as we've just opened up a previously "closed" section of a forum, which amounted to a few thousand threads. It's a pretty strong, trusted domain, and the new pages started to get indexed and ranked within hours.

aok88




msg:3659093
 11:03 am on May 26, 2008 (gmt 0)

I noticed a few things with this new 12k-page site. A couple of days after launch, a few hundred pages were indexed by Google and they all ranked unbelievably well. Since then Google has indexed about 900 pages over the last 9 days, and the rankings are dropping like a rock.

This reminds me of when you launch a new site and get great rankings for a bit, then lose the rankings after a little while. But this site of mine is not new - it's over a year old and has a PR 3. Any ideas what's going on?

pageoneresults




msg:3659118
 12:19 pm on May 26, 2008 (gmt 0)

For one site, for example, we had a domain that we decided to expand from a few hundred pages to 12,000 new pages, and we added them all at once.

Let's think about this. If the domain was a PR4 to start with at 200 pages and you add 11,800 more to the mix, what happens to that PR? Would you think that it gets "sucked" from the pages where it resides? And if so, how much of it gets "sucked" into those new pages would, I think, be a determining factor.

It's not common for a site to go from 200 to 12,000 pages. When that happens, it's usually because someone has found a way to automate the generation of content, or they've added a store, or any number of things. In Google's eyes, I think that would raise a big flag. Any site doubling its page count would, I think, be on the radar, and in your case you've done far more than double.

And if those pages can be classified as stubs, that makes it worse. Google will let them perform the first time around. But after that second pass, they are typically purged, never to be seen again until such time as they break the "stub" barrier.

I'm going to say YES, adding this many pages to a site's existing profile is going to cause challenges in the beginning. You're now splitting PR across 12,000 pages instead of 200. I would think that how you've "funneled" that PR to those new pages will also determine how they perform out of the gate and into the future.
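
To put some rough numbers on that splitting, here's a minimal toy sketch in Python (my own simplification using the textbook power-iteration formula and an assumed star-shaped site graph, not anything Google actually runs) comparing the share a single inner page gets at 200 pages versus 12,000:

# Toy power-iteration PageRank, purely illustrative; the star-shaped site
# graph, damping factor, and page counts are assumptions for illustration.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}
        for page, outs in links.items():
            if not outs:
                continue
            share = d * pr[page] / len(outs)
            for target in outs:
                new[target] += share
        pr = new
    return pr

def star_site(n_children):
    """Home page links to every child page; every child links back to home."""
    links = {"home": [f"page{i}" for i in range(n_children)]}
    for i in range(n_children):
        links[f"page{i}"] = ["home"]
    return links

small = pagerank(star_site(200))
large = pagerank(star_site(12000))
print(f"PR of one inner page, 200-page site:    {small['page0']:.6f}")
print(f"PR of one inner page, 12,000-page site: {large['page0']:.6f}")

In this toy model the per-page share drops by roughly a factor of 60. That's the dilution being described, ignoring everything Google layers on top of raw PR.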

aok88




msg:3659129
 1:05 pm on May 26, 2008 (gmt 0)

What's a stub? And is there a way to determine if Google considers some of your pages as stubs?

pageoneresults




msg:3659135
 1:13 pm on May 26, 2008 (gmt 0)

Stubs are pages that are typically devoid of any real content. They contain mostly navigational and advertising elements. A common example of stubs found online is directory pages. Many directory owners will dump an untold number of stubs into the index (pages that contain only one listing, or none) in hopes of those pages performing for something, somewhere. Google caught on to this years ago and will typically "let" a stub perform out of the gate. But once it starts to really calculate that page (on the second pass), it will typically end up in the depths of The Gorg, not to be seen again.

Stubs are basically "very thin" pages. They have little to no original content. You know, like 95% template and 5% content, or less. Don't quote me on those numbers. ;)

Robert Charlton




msg:3659554
 3:15 am on May 27, 2008 (gmt 0)

I know it's unfashionable to think that PageRank matters, so I'm glad that pageone has already mentioned it. Frankly, I'm surprised that a PR3 site could have supported 200 pages, let alone 12,000.

But a lot depends on the navigation structure of the site. For a given root-level PR value, not factoring in deep links, you can only go so wide or so deep. If, in adding your new pages, you went very wide... i.e., added many links from the home page... the negative effects on the existing site are going to be worse than if you'd kept things narrow and deep. With narrow and deep, you could probably just forget the new pages... but they'd at least be less likely to dissipate the link juice you had flowing to your old pages.

Also, I'd bet that pageone is right in thinking that your new pages are most likely thin on original content. Look at it this way... if your site is about a year old, you'd need to have been creating new content at the rate of 1,000 pages a month. I realize you can create many thousands of pages at the push of a button, but it's hard to generate compelling content to fill those pages at anything close to that rate.

Even if this were a catalog and you were selling parts by number, you'd have a hard time filling it with unique content.

We've been launching numerous sites lately...

With numerous sites, the problem is proportionately harder.

kidder




msg:3659629
 6:58 am on May 27, 2008 (gmt 0)

In this case, more content can in fact mean less traffic; even if you have the unique content, you don't have the juice to float the pages deep or wide - correct? In the short term, loads of good content on its own is not enough.

idolw




msg:3660141
 6:27 pm on May 27, 2008 (gmt 0)

Here's my example:
Currently the site features 500 locations on my continent, in English.

We've been working on a new version of this site since mid-2006. The work includes <a lot of new content> as well as an internal CMS that we built for it. The new version will feature 6,600 locations and will be presented in 8-12 languages on Day 1.

So we are at a point where we will increase the website size by 100 times (we've been working on it for almost 2 years, and that's with a team of people!). Should we be scared?

What'd be your strategy to not lose the great rankings that we have for the current version of the website?

[edited by: tedster at 6:35 pm (utc) on May 27, 2008]

pageoneresults




msg:3660157
 6:52 pm on May 27, 2008 (gmt 0)

What'd be your strategy to not lose the great rankings that we have for the current version of the website?

Some people make big bucks answering that type of question. :)

One of the first areas I'd be looking at is how you've set up the taxonomy and drill-down of the categories. I'd be looking at how links from the "existing pages" are being used to "funnel the juice," as they say. For example, if you take the home page and it has had 20 primary category links for most of its history, I'd be very careful not to upset the balance here. And I'd carry that same caution to those 20 primary category index pages. Those are typically your "top level" pages. That's where the PageRank™ typically starts. If you add 30 more links to your original 20, that is a total of 50 links that the PR now needs to be distributed across.

If the PR of the top-level categories is sufficient, you may have some breathing room. But if those top-level pages don't have the PR to support the trickle-down to the volume of pages being added, I believe that is where the challenges really begin. You only have so much PR; how you "spread it" will be a determining factor.
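
As a back-of-the-envelope illustration of that 20-versus-50 split, under the classic "PR divided by outbound link count" simplification (the home-page equity value of 1.0 is just a placeholder, and real link weighting is certainly more involved):

# Rough per-link dilution; the home-page equity value is an arbitrary placeholder.
home_equity = 1.0
per_link_before = home_equity / 20   # 20 original primary category links
per_link_after = home_equity / 50    # 20 original + 30 new links
print(f"before: {per_link_before:.3f} of the home page's equity per link")
print(f"after:  {per_link_after:.3f} per link ({per_link_after / per_link_before:.0%} of the original)")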

I might consider noindexing those pages that are not considered a major part of the process. Do not "nofollow" them as this will upset the click paths. Take a look at all those pages where the visitor "needs to be" and figure out a way to get those to the upper levels of the click path so they are the ones attracting the available PR, not the intermediary pages. Don't mess with anything that is out of the norm such as rel="nofollow", etc. You can use noindex (meta robots) to your liking or you can use other methods such as IP based delivery.
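
Here's a minimal sketch of that idea in Python, just to pin down which tag goes where: intermediary drill-down pages get noindex but stay fully followable, while destination pages stay indexable. The role names and this helper are made up for illustration; they're not from any real CMS.

# Illustrative only: mark intermediary click-path pages noindex while leaving
# every link followable, so crawl paths and link flow are not disturbed.
INTERMEDIARY_ROLES = {"pagination", "filter", "drill-down"}

def robots_meta_value(role):
    """Return the meta robots value to emit for a page, based on its assumed role."""
    if role in INTERMEDIARY_ROLES:
        return "noindex, follow"   # crawlable and link-passing, but kept out of the index
    return "index, follow"         # destination pages attract and hold the PR

print(robots_meta_value("drill-down"))  # noindex, follow
print(robots_meta_value("location"))    # index, follow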

Just remember, in Google's case, it really is all about PageRank™. I typically try not to use that word in my writing; the hair on my neck rises. A website with a PR3 for its home page is going to have a very difficult time funneling PR horizontally, vertically and diagonally. With low PR, you have to think more about the horizontal/vertical and leave the diagonal for later. :)

idolw




msg:3660263
 8:25 pm on May 27, 2008 (gmt 0)

Thanks for your effort, pageoneresults.

The new version of the site will include several new sections. I might consider using noindex in META for parts of these and slowly remove the noindex tags every X days/weeks.
The site is currently PR6, with around 500,000 backlinks to multiple pages within it, as Yahoo! Site Explorer reports. In order to increase PR, I can think of 301ing a few domains to the new sections' home pages to provide some juice flow. Is that a good idea?

And maybe both actions would be a good idea, actually?

pageoneresults




msg:3660270
 8:33 pm on May 27, 2008 (gmt 0)

The new version of the site will include several new sections. I might consider using noindex in META for parts of these and slowly remove the noindex tags every X days/weeks.

What you may want to consider is noindexing the middle parts of the click path. That is usually where the PR waste factor occurs. It's also all relative to how the taxonomy of the site is set up. In your case though, PR6 represents health to me. Actually, anything with visible PR these days represents health. The page has been indexed, it has been calculated, and it has passed all the "normal" routines to "join the club".

The site is currently PR6, with around 500,000 backlinks to multiple pages within it, as Yahoo! Site Explorer reports.

Ah, a larger site. A half million backlinks is nothing to sneeze at, and that's a lot of power to have at your disposal. Think of yourself as the conductor of a finely tuned symphony. Take that incoming "juice" and "direct" it to where it needs to be. Skip all the bouncing and drill-down routines and just put it directly at the target. Am I making sense?

In order to increase PR I can think of 301ing a few domains to new sections' home pages in order to provide some juice flow. Is that a good idea?

Yikes! Karma. You've got a PR6 with half a million backlinks; don't even think about it. ;)

idolw




msg:3660278
 8:44 pm on May 27, 2008 (gmt 0)

So again, we end up in a situation where we should not worry about changes as long as we look after the site.
Sites are like kids: if they're brought up well, parents don't need to worry when the kids change their plans, dreams, etc. ;-)
This is actually the very good thing Google is forcing us to do. Work on the site and it will pay off in the long run.

Anyway, thanks for all your great help, pageoneresults.

aok88




msg:3660292
 9:03 pm on May 27, 2008 (gmt 0)

I love this forum; I usually get really good info, and people are very up front, which is good. Good luck with your site, idolw.

So I have another site that I am about to launch with new pages. I have 12k pages waiting in the wings, but this time I am not going to post them all at once. This site is also a PR3, and I am not going to keep many of the existing pages; there weren't that many anyway. I am keeping the homepage, site map, and a handful of others that have a PR of 1. The site is 5 years old and has a decent backlink profile.

So, let's say I want to make things look as natural as possible and post pages at a slower rate, so as not to alert Google at all. Meanwhile, I'll try to build new links to the site, including deep links.

What would you say is a good drip rate? I would think maybe 1-4 a day would be ideal, but I would rather batch them together. If I did 400 a week, for example, what do people here think that would do? Would that garner more of G's love and rankings? Any predictions?
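
Just as plain arithmetic on those two rates (no claim about which one Google prefers), here's how long a 12,000-page backlog would take to clear:

# Pure arithmetic on the drip rates mentioned above; no SEO claim implied.
total_pages = 12_000
for label, pages_per_week in [("4 per day (high end)", 4 * 7), ("400 per week", 400)]:
    weeks = total_pages / pages_per_week
    print(f"{label}: about {weeks:.0f} weeks (~{weeks / 52:.1f} years)")

Even at the high end of 1-4 a day, the backlog would take the better part of a decade to clear, while 400 a week finishes in about 30 weeks.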

idolw




msg:3660302
 9:12 pm on May 27, 2008 (gmt 0)

I am not going to keep any of the existing pages

You want to erase them? Delete them? Why?

And why not follow the advice from above?
I might consider noindexing those pages that are not considered a major part of the process. Do not "nofollow" them as this will upset the click paths. Take a look at all those pages where the visitor "needs to be" and figure out a way to get those to the upper levels of the click path so they are the ones attracting the available PR, not the intermediary pages. Don't mess with anything that is out of the norm such as rel="nofollow", etc. You can use noindex (meta robots) to your liking or you can use other methods such as IP based delivery.

So how about doing this and starting a link campaign? Spend a few bucks on press releases and get some links from news sites in your niche. If there are new sections on the new site, add links to those new sections in the press release. That will give you even a few hundred healthy links within a week, and thus some more PR, which should help you get more pages indexed (if I understood pageoneresults's advice correctly).
If you see those pages getting indexed, unlock some more pages and get links to them. Make sure you get links to subpages and not to the homepage only.
