|Removing 5000 pages of unique content - Good or bad?|
| 2:54 pm on Dec 18, 2010 (gmt 0)|
I'm about to take my 5,000 pages of unique content down to about 20 core pages. This is being done for technical reasons; my infrastructure is groaning under the weight.
I've maintained that the content helps my rankings. My wife maintains that having 5K secondary pages hurts them.
Who's right? Will my rankings increase, decrease, or stay the same? How long until I see any changes?
A marriage hangs in the balance! (not really :) ).
| 5:32 pm on Dec 18, 2010 (gmt 0)|
I guess the first question that has to be asked is: what kind of search traffic do those 5000 pages attract? Even if they only average a couple of referrals each, I might be reluctant to take them down.
Then, would the 20 core pages stand on their own to attract referrals? Are they somehow supported by the 5K?
Drastic step, big decision.
| 6:00 pm on Dec 18, 2010 (gmt 0)|
That's a big change! I have always thought in terms of "the bigger your net, the more fish you catch." More pages at a small number of referrals per page adds up to pretty large combined traffic.
Are you thinking that having a lot of pages that are similar in topic, but unique in content, is making it harder for any one page to gain really high rankings?
I think you have a dilemma: a few pages that rank well, or lots of pages that rank low. High-ranking pages gain more referrals each; lower-ranked pages gain fewer, but there are lots of them.
I wouldn't like to have to make that call.
| 6:00 pm on Dec 18, 2010 (gmt 0)|
I'd also check for backlinks to those 5000 pages. You might be giving up a lot of link juice.
| 5:55 am on Dec 19, 2010 (gmt 0)|
Wheel, I guess the 5000 you are talking about are among the millions of pages you have... and I also assume that you get minimal or no traffic to those pages. But if they do have substantial links of their own, I would not lose them, and they cannot be hurting your rankings.
Even otherwise, I still don't see why they should hurt your rankings, even though they are secondary.
Moreover, if you are getting minimal traffic to those pages, I am not sure why they should contribute much to the load.
As for options, I can think of two, though I would like other experts to comment on them.
1) 301 all of them to the 20-odd core pages. There will be some loss of link juice, but in my opinion a 301 of so many pages to a few shouldn't cause a problem, since they are all on the same domain, so Google shouldn't be suspicious of them.
2) Spin off a subdomain for those pages and host them on another server.
| 7:46 am on Dec 19, 2010 (gmt 0)|
I say it will help. Fewer pages means more link juice to spread among the pages that stay.
I have never downsized like this, so don't put too much stock in my theory. My feeling is that the site will put more emphasis on the remaining pages, so they will get a boost from not having to share the spotlight with thousands of other pages.
Good luck, and let us know your findings.
| 9:57 am on Dec 19, 2010 (gmt 0)|
|I also assume that you get minimal or no traffic to those pages. |
That's a reasonable assumption. Either that or they don't earn income.
If it's an issue of traffic, could it be a matter of how the hub structure was initially set up? The following is a simplification to illustrate the concept. The point I'm trying to make is that it's important to break the site into segments and build links to the upper segments and the sub-segments. If necessary, create sub-sub-segments and build links down to those. The idea is to create a structure that is friendly toward building links to the increasingly specific hubs.
|Red Widgets - 1,000 pages |
Orange Widgets - 1,000 pages
Yellow Widgets - 1,000 pages
Green Widgets - 1,000 pages
Blue Widgets - 1,000 pages
The above structure limits link building to five hubs: Red, Orange, Yellow, Green, and Blue. The link juice trickles down from there (plus the home page and scattered pages). That structure makes sense when a site is smaller, and for presenting the fewest choices to a site visitor. But as a site grows, it may reach a point where it becomes important to construct a new taxonomy with finer classifications.
This means treating each hub/section (or in this case, color) as a separate website and building the architecture as if each section were its own mini-site. Then build links to each of the hubs, with the primary color being the main hub and the sub-sections being mini-hubs to which you can build more links.
|Red Widgets |
Red A Widgets - 200
Red B Widgets - 200
Red C Widgets - 200
Red D Widgets - 200
Red E Widgets - 200
Orange Widgets
Orange A Widgets - 200
Orange B Widgets - 200
Orange C Widgets - 200
Orange D Widgets - 200
Orange E Widgets - 200
Yellow Widgets
Yellow A Widgets - 200
Yellow B Widgets - 200
Yellow C Widgets - 200
Yellow D Widgets - 200
Yellow E Widgets - 200
Green Widgets
Green A Widgets - 200
Green B Widgets - 200
Green C Widgets - 200
Green D Widgets - 200
Green E Widgets - 200
Blue Widgets
Blue A Widgets - 200
Blue B Widgets - 200
Blue C Widgets - 200
Blue D Widgets - 200
Blue E Widgets - 200
The above site structure creates the opportunity to build links to thirty hubs. That's like having thirty websites working for you, except instead of thirty sites working independently, they're working together.
The distribution of PageRank is more precise, focused, and efficient. The home-page PageRank might be lower, but the long-tail ranking ability and the short-keyword ranking ability should both increase. As one of my sites grows, I think about how I can segment it into smaller hubs. At a certain point there are enough pages to support a distinct hub; that's when I split a section off and create a new hub. It works for the site visitor because it makes finding information easier, and it works for link building and ranking, too. What interests people can usually be drilled down to a narrower niche focus.
So in the case of Red Widgets, you build links to the master hub, Red Widgets, but then also build links to the sub-sections: A, B, C, D, and E. All the Red sub-sections interlink with each of the other Red sub-sections, as well as with the master hub (Red Widgets). All pages throughout the site link to the other master hubs, the home page, plus other ancillary pages (about, contact, etc.).
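The interlinking rules described above can be sketched as a small link graph. This is a minimal illustration using hypothetical page names (red-a, home, etc.); the home and ancillary pages appear only as link targets here, for brevity:

```python
# Sketch of the hub/mini-hub interlinking scheme: each color is a master hub,
# each master hub has five sub-hubs, sub-hubs interlink with their siblings
# and their master hub, and every page links to home plus all master hubs.
colors = ["red", "orange", "yellow", "green", "blue"]
subs = ["a", "b", "c", "d", "e"]

links = {}  # page -> set of pages it links to
for color in colors:
    sub_hubs = [f"{color}-{s}" for s in subs]
    # Master hub links down to each of its sub-hubs.
    links[color] = set(sub_hubs)
    for hub in sub_hubs:
        # Each sub-hub links to its sibling sub-hubs and up to the master hub.
        links[hub] = (set(sub_hubs) - {hub}) | {color}

# Every page also links to the home page and to all master hubs (but not itself).
for page in links:
    links[page] |= {"home"} | (set(colors) - {page})

# 5 master hubs + 25 sub-hubs = the thirty hubs mentioned above.
```

The point of building it programmatically is that the rules, not the page lists, define the architecture: adding a sixth color or a sixth sub-section rewires the whole graph consistently.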
| 4:23 pm on Dec 19, 2010 (gmt 0)|
I'm doing this because WordPress won't accommodate a site with this many pages - technical issues with the CMS. I was going to move to another CMS so I could still admin my site, then figured, what the heck, for now I'll dump the content and stick to my core money pages.
The pages are part of a linkbait project to drive links to my homepage, and I'm not looking for more links right now. There's likely no links to those 5K pages.
The pages do get traffic, but nothing that monetizes. In fact, the traffic to those pages generates plenty of service calls that are related to my niche but come from non-buying customers. I try to look after those folks, but if they're not calling me anymore, that's fine too. At one time I had believed that 5000 pages of long tail terms would help - but it hasn't from a buying traffic perspective. I still get almost all of my money traffic from a short list of search terms.
I think the test will be if the more focused page rank matters more, or the additional content matters more. I don't know enough to say either way...but it should be interesting.
| 5:14 pm on Dec 19, 2010 (gmt 0)|
|Fewer pages means more link juice to spread among the pages that stay |
It depends on the linking patterns (internal and external) and not on the number of pages. Sometimes dropping a lot of pages can disrupt circulation of link juice and cause ranking problems that way.
|At one time I had believed that 5000 pages of long tail terms would help - but it hasn't from a buying traffic perspective. |
I understand that motive. Hope you'll keep us all in the loop on your results.
| 11:12 pm on Jan 8, 2011 (gmt 0)|
Wheel, any update on how the experiment's going?
| 11:28 pm on Jan 8, 2011 (gmt 0)|
Just dropped it this week. Will keep you posted.
| 2:48 am on Jan 9, 2011 (gmt 0)|
Likely a loss, in my experience. Those pages have weight now, and the loss of internal linking alone is going to create, after some time, drops in both short- and long-tail phrases.
| 5:58 pm on Jan 9, 2011 (gmt 0)|
Perhaps. I've never noticed a big 'pop' when adding these pages, though. In fact, there's some concern that since these are non-commercial pages, they are dragging down the commercial pages. That's what I'm wondering. I suspect it'll actually make little difference - but we're going to see.
Loss of traffic to those pages means little. The traffic they brought in was non-buying, service-related stuff: people calling me looking for manufacturers' phone numbers, or worse, assuming I am the manufacturer and looking for service (for products they didn't buy from me). I won't miss that.
One of the things about this project is that it's hell to get all that content online in a clean, spiderable format. Since I've removed it from my main site, I think I'm going to put this and similar content on a separate domain. Let it draw links naturally, or not, and just use those other sites as feeder links.
| 10:54 pm on Jan 14, 2011 (gmt 0)|
Well, my rankings have inched up slightly but noticeably.
Of course there's no telling whether it's actually correlated. But so far, at least, it looks like it didn't hurt.
I've got a friend who's got a similar site. He said he'd wait and see, but if this sticks, in a few months he may consider doing the same thing. If he sees the same results, that'd be really interesting.
Too bad if it's true: tens of thousands of pages of unique (and reasonably valuable) content potentially removed, and Google reacts positively. Smells like a flaw in the algo right there. Maybe there should be a bonus if you've got large volumes of unique content.
| 11:12 pm on Jan 14, 2011 (gmt 0)|
|I say it will help. Fewer pages means more link juice to spread among the pages that stay. |
I'm not commenting on whether this will hurt or help overall in the long run (it'll be very interesting to see, and of course there are several components to link juice now), but just in terms of pure PR the math doesn't necessarily work out that way.
Suppose that all of these supporting pages link back either to the homepage or to the main 20 pages or both. Then yes, by linking to all of those subpages, the main pages are losing juice, but by having links _from_ all of those subpages they're gaining juice back. Also, each page is a source of a small amount of PR just for existing, and with 5K pages that adds up. On the other hand, each link loses juice because of the damping factor, so having all of those links results in some loss.
I don't know exactly what the damping factor or the per-page source value are, but my guess is that the math is balanced so that, all other things being equal, it works out roughly the same as far as the PR of the main pages is concerned.
More than likely, any difference in the ranking of the main pages will be caused more by theming and/or other link-graph-related factors than by a significant change in raw PR.
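The give-and-take described above can be made concrete with the classic PageRank iteration. This is a toy model only (a hub plus five supporting pages, and the commonly cited damping factor d = 0.85 is an assumption), not Google's actual algorithm:

```python
# Toy PageRank via power iteration, to illustrate how supporting pages
# both drain juice (the hub's outlinks) and return it (their backlinks).
def pagerank(links, d=0.85, iters=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = dict.fromkeys(pages, 1.0 / n)
    for _ in range(iters):
        pr = {
            p: (1 - d) / n
               + d * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return pr

# One hub linking to 5 supporting pages, each linking back to the hub.
site = {"hub": ["s1", "s2", "s3", "s4", "s5"]}
for s in list(site["hub"]):
    site[s] = ["hub"]

pr = pagerank(site)
# In this toy graph the hub ends up with roughly 47% of the site's total
# PageRank, and each supporting page with roughly 10.5%.
```

So the hub's share of the pie shrinks, but the pie itself grows with page count (each page contributes its own (1-d)/n source), which is why the net effect of dropping 5K pages on the main pages' raw PR is hard to call in advance.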
| 7:11 am on Jan 15, 2011 (gmt 0)|
Eh, I think it was a bad idea from a marketing POV...
You just threw out all the visitors to that section, and along with them went the brand-name recognition, the conversation about the site and/or services offered, the new people who would have found out about it through that section and told someone else, the people they might tell who actually are buying customers, etc.
You may not think it's worth much because you had to answer the phones and tell people you didn't do what they thought you did, but if it's done politely, you leave people with a 'good taste' in their mouth, and that's a good thing: when the topic comes up in conversation, they might tell a buying customer about how great you, your site, and your service are...
I think sometimes webmasters think too much about search engines and not enough about branding and brand building... The more people who know you and your site the better, and it doesn't get much cheaper than having to answer a phone, IMO.
That's my opinion only, but I would have left the pages right where they were.
| 5:00 pm on Jan 18, 2011 (gmt 0)|
removing 5000 pages of bad unique content = good.
removing 5000 pages of good unique content = bad.
5000 pages is more than enough for a standalone site. Are all the pages related?
| 6:05 pm on Jan 18, 2011 (gmt 0)|
This was good unique content, better than what most sites have. I did it for linking, and I got links from .edu's and governing bodies from the content; that's how good it was. But frankly, Google can't tell the difference between good and bad content.
The content was all on topic. However, none of the visitors to the content were buying customers, and none of the buying customers cared about the content.
I may eventually put the content back up on another domain and, if it gathers links, use that site as a link feeder to my main site. But back on my main site? Nope - removal apparently hasn't harmed my rankings and may have helped them.
| 6:42 pm on Jan 18, 2011 (gmt 0)|
I'm rereading this and wondering if I shouldn't force-feed this content, one page at a time, through a blog, i.e. post 5 pages a day, every day.
The content doesn't fit a blog, but what the heck - maybe Google wants to see fresh new content every day in sequential format instead of just a splat of thousands of pages in a flat structure.
| 6:52 pm on Jan 18, 2011 (gmt 0)|
|I'm rereading this and wondering if I shouldn't force-feed this content, one page at a time, through a blog, i.e. post 5 pages a day, every day. |
What about posting them on article submission sites so as to get the link back to your main site?
| 7:16 pm on Jan 18, 2011 (gmt 0)|
Because they're not quite individual articles; they're more like pages from a book. Together they make sense; individually, not so much.
| 8:27 pm on Jan 18, 2011 (gmt 0)|
Not sure if this is related, but several of my sites seem to have hit a wall: 5,000 pages indexed out of 20,000 or so. They are forums, so the sites slowly grow over time, but they seem to max out at about 5,000 indexed pages. Pages will drop from the index as newer pages are added. I have tried everything to get all the pages indexed, but there seems to be a wall where G will only index so many and no more. What I found interesting is that I programmatically changed the number of posts per page from 10 to 20, which cut the number of forum pages roughly in half... same thing: about 5,000 in the index out of 10,000 pages, each now with more content.