I'm curious if anyone has any case studies or experiences with this:
Will adding a moderate number of pages to your site (say, a 10-15% increase) dilute your site's overall PageRank/link equity (not toolbar PR, but algorithmic PR)?
Will this potential dilution be linear in relation to the number of pages added? By which I mean, if I add 10% more pages to my site, will my existing pages garner 10% less link juice?
I tend to think that it cannot be as simple as that, and that there must be a more complex relationship between domain authority/PageRank and site saturation.
Any insights would be most helpful.
You're right, I think - it won't be as simple as a direct 10% to 10% relationship. A lot is going to depend on internal linking structure. Also, since PR is only one ranking factor, there's no predicting rankings based purely on PR considerations.
In my opinion, adding more content to a site is usually a good thing. As Tedster said, you can still control the PR flow within the site. Any PR you give to the new pages can be redistributed back to the old pages again. Also, according to the original PR formula, each new page will be given a small initial PageRank value of its own.
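To make that last point concrete, here's a minimal sketch of the original PageRank formula, PR(A) = (1-d)/N + d * sum(PR(T)/C(T)) over pages T linking to A. The damping factor d = 0.85 and the tiny four-page site are my own hypothetical assumptions, just to show that a new page gets its own baseline PR from the (1-d)/N term and can pass it back:

```python
# Minimal PageRank sketch (hypothetical 4-page site, assumed d = 0.85).
# PR(A) = (1 - d)/N + d * sum(PR(T)/C(T)) for pages T linking to A,
# where C(T) is the number of outbound links on T.

def pagerank(links, d=0.85, iterations=100):
    """links maps each page to the list of pages it links to."""
    n = len(links)
    pr = {page: 1.0 / n for page in links}  # uniform starting values
    for _ in range(iterations):
        pr = {page: (1 - d) / n
                    + d * sum(pr[t] / len(links[t])
                              for t in links if page in links[t])
              for page in links}
    return pr

# Hypothetical site: home links to two old pages plus a new page,
# and the new page links back, returning the PR it receives.
site = {
    "home": ["old1", "old2", "new"],
    "old1": ["home"],
    "old2": ["home"],
    "new":  ["home"],
}
print(pagerank(site))
```

Even the new page starts with the (1-d)/N share, and because it links back to home, that value is recycled into the site rather than lost.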
I'd bet my bottom dollar that Google moved on from pagerank long ago. They probably have "zonerank" active (where a link's location on the page determines what percentage of value it passes), and I'd be surprised if they didn't use "typerank" (the type of page incoming links are on) and "social rank" (how active is that social circle?) too.
If Google didn't keep refining its rank systems, they'd eventually throw them out, and that's a possibility too. For longtail keywords, perhaps internal link structure has been reduced as a weighting factor and the number of incoming anchor text variations increased. That's not a typo: mashup sites saw massive increases in traffic for some reason, which means something has to account for it, such as covering every conceivable anchor text combination a bazillion times internally.
My point is... the SEO game has been turned on its head by Google over the past 60 days, and something major has changed. Evidence: good sites like WW lost traffic and MAJOR mashup sites gained.
I would be interested to consider the opposite as well. What if you are removing content that is rarely visited or low in quality, and removing internal links that are somewhat repetitive and not necessary for navigating the site?
|I would be interested to consider the opposite as well. What if you are removing content that is rarely visited or low in quality, and removing internal links that are somewhat repetitive and not necessary for navigating the site? |
It's true that removing some pages would allow you to focus the available pagerank "juice" on the remaining pages, thereby increasing their PR.
But personally I don't like the idea of removing established pages that are already indexed. So if it were me, I would look at the possibility of revamping those pages, especially by adding more content (relevant content!) to them and keeping them (or some of them) on the site. Also, instead of completely removing them, you can "go halfway" by reducing the number of links to them, still keeping them on the site, but freeing up some Pagerank juice to focus on the most important pages.
I would like to add that one technique that seems to have worked well on my sites is to add an extra page just for the purpose of bolstering the rankings of an existing page. What I do is put related content on the new page, including the targeted keywords, then link it to the main page. Sometimes I've even been able to bolster several existing pages with a single new page.
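The supporting-page technique described above can be sketched with a simple PageRank simulation. The page names, the tiny site layout, and the damping factor d = 0.85 are all hypothetical assumptions of mine; the point is just that a new page linked from home that links on to a target page raises the target's PR:

```python
# Sketch of the "supporting page" idea: add a related page that links
# to an existing main page, and compare the main page's PR before and
# after. Hypothetical 3- and 4-page sites; assumed damping d = 0.85.

def pagerank(links, d=0.85, iterations=100):
    """links maps each page to the list of pages it links to."""
    n = len(links)
    pr = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        pr = {page: (1 - d) / n
                    + d * sum(pr[t] / len(links[t])
                              for t in links if page in links[t])
              for page in links}
    return pr

# Before: home links to the target page and one other page.
before = pagerank({
    "home":   ["target", "other"],
    "target": ["home"],
    "other":  ["home"],
})

# After: a new supporting page, linked from home, links to the target,
# funneling extra PR into it.
after = pagerank({
    "home":    ["target", "other", "support"],
    "target":  ["home"],
    "other":   ["home"],
    "support": ["target"],
})

print("target PR:", before["target"], "->", after["target"])
```

In this toy setup the target's PR goes up even though the site got bigger, because the supporting page concentrates its outbound link equity onto that one page.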
When we discuss PR transfer, we often think about it as a one-time vote from page A to page B. But in reality PageRank is an iterated calculation, around and around the web. A lot of the "big effects" we tend to look for when using the simplified mental model get mellowed out in reality by the iteration.
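Running the iterated calculation on a toy site shows why the dilution isn't a simple 10%-for-10% trade. The star-shaped site, the page counts, and the damping factor d = 0.85 below are hypothetical assumptions, but the pattern they illustrate holds: where the new page sits in the link graph matters far more than the raw page count.

```python
# Does adding ~10% more pages dilute existing pages' PR by ~10%?
# Hypothetical star-shaped site; assumed damping d = 0.85.

def pagerank(links, d=0.85, iterations=100):
    """links maps each page to the list of pages it links to."""
    n = len(links)
    pr = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        pr = {page: (1 - d) / n
                    + d * sum(pr[t] / len(links[t])
                              for t in links if page in links[t])
              for page in links}
    return pr

def star_site(leaves):
    """A home page linking to `leaves` content pages, each linking back."""
    site = {"home": [f"page{i}" for i in range(leaves)]}
    for i in range(leaves):
        site[f"page{i}"] = ["home"]
    return site

before = pagerank(star_site(10))  # 11 pages total
after = pagerank(star_site(11))   # 12 pages: roughly 10% more

print("home:", before["home"], "->", after["home"])
print("leaf:", before["page0"], "->", after["page0"])
```

In this topology the hub's PR barely moves while each leaf page drops roughly in proportion to the page count, so "10% more pages" hits different pages very differently depending on internal linking.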
I think that the "more pages of targeted content" approach is currently showing even better returns. One of the targets of Mayday was individual pages that happened to rank, in contrast with "sites" that were relevant (they hoped). Seeing more on-topic pages for a given search could well be one of those site-wide factors.
When I'm assessing how easy it will be for a site to compete, I often look at who's ranking right now and do a search like [site:example.com keyword]. If a site has a section of 1,000 targeted pages, then competing with it can take a lot of content development!
|I'd bet my bottom dollar that Google moved on from pagerank long ago. |
PageRank is now just one factor of many, but I think it's still an important factor. I believe they've changed a bit how exactly the value is distributed out to links...based on the link location on the page, css visibility, etc...but I've not seen anything that indicates that it's not still a major part of the overall algorithm.
I think the value of having the potential to rank for more different keywords, the increased relevance gained by having more content on the keywords you already have, and the potential to generate links to new content far outweighs any PR dilution you may experience.
When I have added large amounts of content all at once, it has not noticeably decreased my rankings for main category keywords.