Google SEO News and Discussion Forum

    
99% of traffic coming from 20% of pages, post Panda gameplan?
Sgt_Kickaxe
msg:4441109
8:06 pm on Apr 15, 2012 (gmt 0)

In the past I haven't worried about pages not being picked up well by search as long as they had value for visitors, but I'm revisiting that notion since those pages are not being seen anyway. 99% of search traffic is landing on just 20% of pages.

Question:
- Is there a glass ceiling in which only a set % of a site's pages can be top ranked?

Possible changes:
- Remove the ignored pages completely (ignored being 1 visitor per day or less).
- Special internal link setup ensuring non-trafficked pages get buried deep in the archives.
- Stop creating newer content in favor of re-working older content to get it more traffic.
- other?

Considerations:
- non-trafficked pages that have solid incoming links
- use removed pages to create a new site? (just a thought)

Identification:
- Using Google Analytics, you can select a period of time (e.g. this month), compare it with another set of data (e.g. the previous two or three months), then switch to the search traffic view and look for pages that previously got traffic but now get none (perhaps a Panda downgrade), or, worse, spot pages that got no search traffic at all by comparing the URLs to those in your sitemap.

While not a perfect way to judge a page's traffic volume, it's not horribly inaccurate either. Just how much attention is warranted by pages that receive NO search traffic in a post-Panda internet? What's your gameplan or take on this?
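For anyone who wants to automate that comparison, here's a rough sketch in Python. It assumes a standard sitemap.xml and landing-page paths exported from Analytics as CSV files; the file names and CSV layout are just placeholders, adjust them to your own export.

import csv
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

def sitemap_paths(sitemap_file):
    # Collect the URL paths listed in a standard sitemap.xml
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    tree = ET.parse(sitemap_file)
    return {urlparse(loc.text.strip()).path for loc in tree.findall(".//sm:loc", ns)}

def landing_paths(csv_file):
    # Collect landing-page paths from an Analytics CSV export (path in first column)
    with open(csv_file, newline="") as f:
        return {row[0].strip() for row in csv.reader(f) if row and row[0].startswith("/")}

all_pages = sitemap_paths("sitemap.xml")
this_month = landing_paths("search_landing_this_month.csv")
prior_months = landing_paths("search_landing_prior_3_months.csv")

lost_traffic = (prior_months - this_month) & all_pages  # had search traffic before, none now
never_seen = all_pages - this_month - prior_months      # no search traffic in either period

print(len(lost_traffic), "pages lost search traffic this period")
print(len(never_seen), "pages got no search traffic at all")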

 

Pjman
msg:4441167
11:48 pm on Apr 15, 2012 (gmt 0)

@Sgt_Kickaxe

Glad you brought this up. I have been thinking the same thing.

I have been planning to review pages that have low visit counts, "noindex" them for the short term, and rework them over time; once they have better content than any of the competitors ranking much higher, I'll remove the "noindex" tag.
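A toy sketch of that quarantine idea, assuming your pages are rendered from templates where you control what goes in the head; the URL list and function name here are just placeholders:

QUARANTINED = {"/old-thin-page/", "/another-weak-page/"}  # low-visit pages being reworked

def robots_meta(path):
    # Emit a noindex robots meta tag while a page is quarantined,
    # and a normal index tag once it has been reworked and released.
    if path in QUARANTINED:
        return '<meta name="robots" content="noindex,follow">'
    return '<meta name="robots" content="index,follow">'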

I do find that post-Panda (with a Pandalized site), it doesn't matter how much better those pages get; I can't seem to gain rank on them.

I've been running a test for 4 months now. I took a page that used to be #1 pre-Panda and is now #22 post-Panda. I looked at the top ten results and could honestly say 4 of the top 10 ranking pages were better.

I beefed up the page something outrageous. The content blows everybody out of the water, seriously. 20 human reviewers looked at my page and the other top 20 pages and clearly, unanimously said mine is superior in all facets. But it still only ranks #19.

I guess, as @RustyBrick points out, you have to improve all indexed pages before you can gain any headway at all.

aristotle
msg:4441179
12:44 am on Apr 16, 2012 (gmt 0)

In my opinion, on a well-designed site the pages will tend to support each other. For example, even if a particular page doesn't get much direct traffic from Google, it still may be helping the rankings and traffic of some of the other pages.

I believe that the Google algorithm looks at the whole website when determining which keywords and/or key phrases to associate with it. I also think the algorithm includes both indexed and non-indexed pages in its overall evaluation, including pages that have a noindex tag in the header.

canuckseo
msg:4442096
6:03 pm on Apr 17, 2012 (gmt 0)

I have also noticed post-Panda that the number and quality of links affect the number of pages indexed more than they used to. I've been testing on a few sites using different strategies.

On some sites I'm manually updating pages, providing better (more useful) on-page text. On other sites I'm bulk-updating content (auto-generating random unique page content; I know it's against the rules, but how else can I rewrite tens of thousands of pages efficiently?), and for my third group I'm just doing link building.

So far the most effective short-term strategy is the bulk content updating, though I don't see it as a long-term strategy. Links are working but take longer, and manually rewriting pages also seems to have an impact, but it isn't as great as links and takes the longest to be recognized by Google.

Just my 2 cents.

econman
msg:4442121
7:01 pm on Apr 17, 2012 (gmt 0)

In another thread, this set of comments caught my eye:

A thin page may not be what you think.

Huge article websites lost massively due to PANDA where article pages have lots of text. The quantity of pages compared to the available link juice is the problem.

...we have all been PANDAized; some just feel the effects more than others because (for lack of better words) regurgitated pages targeting every longtail phrase were the model of choice... but now it does not work.

That said... all the pages are still there (indexed) and if they deserved ranks they would each have a few links to them. Granted you can't develop 3 million links for 1 million pages but you don't need to... just pick your ponies wisely.


Source: [webmasterworld.com...]

I think the author of this comment was suggesting that in the post-Panda environment, pages that have zero external links pointed at them are not as valuable for SEO purposes as those same pages were in years past.

Perhaps I'm misinterpreting him, but I think he's arguing (or at least someone could plausibly argue) that if your site has 5,000 pages but only 50 of them have strong enough content to attract external links, then as a result of Panda your site no longer has an advantage over a 200-page site whose 50 strongest pages have equally good content attracting an equivalent set of inbound links.
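To make the dilution idea concrete, here's a toy back-of-envelope calculation; this is emphatically not how Google scores anything, just arithmetic restating the 5,000-page vs 200-page example under the assumption that a fixed pool of inbound link equity gets spread across every indexed page.

def equity_per_page(linked_pages, total_pages, equity_per_link=1.0):
    # Average share of inbound link equity per indexed page in this toy model
    total_equity = linked_pages * equity_per_link
    return total_equity / total_pages

print(equity_per_page(linked_pages=50, total_pages=5000))  # 0.01 equity per page
print(equity_per_page(linked_pages=50, total_pages=200))   # 0.25 equity per page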

I have no idea if this hypothesis is true but I think there is enough merit to it to justify some deep thought.

What's intriguing is that, as a working hypothesis, it provides some insight into the tactical issues we all face concerning where we should invest our efforts.

Should we improve our weak pages?

Kill them off entirely?

Focus our efforts on trying to attract links to just the 50 strongest pages, or a much wider set of pages?

The OP in this thread touches on one aspect of this thought process -- identifying pages that receive no inbound referrals from search engines.

But, clearly this is not the only thing to consider. For instance, are those pages being read by users reaching them from other pages on the site? Is the content on those pages helping support the overall "theme" of the site, and providing indirect support for the pages that are ranking?

garyr_h
msg:4442386
11:12 am on Apr 18, 2012 (gmt 0)

I've been wondering *something* similar.

If you have a user content area, say a forum added onto the website, obviously many of the threads are not going to have external links pointing to them. Does that in itself hurt the website?

For example, I have a site with 1,000 pages of content created by me, most of which have links pointing to them. Then there are around 50,000 pages created by users, with only maybe 1,000 of those having external links pointing to them.

Does that mean the user content area is hurting the entire website? If so, that's ridiculous. And if it is, would the person be better off putting the user content area on a subdomain and not linking to it from the main website?
