

Pagination - KW cannibalisation and duplication

     
11:04 am on Jul 11, 2008 (gmt 0)

10+ Year Member



One site I am working on has many paginated result sets. They show up with the same titles/meta tags, and have broadly the same semantic content. The first page is linked using the target keywords, the rest use "page 2" etc.

I had always assumed that when it comes to Google, the more pages the better. Rationale being:
- more pages = more PageRank generated
- more internal backlinks to the first page with the KW in the link text
- Google likes bigger sites

Now I'm not so sure... WMT tells me that we have many pages with duplicate titles/metas, and rankings have been suffering for a while :-(

Pagination is needed due to the size of the results sets. Also, I get long-tail searches thanks to some of the content on paginated pages.

So...what's the best strategy here? Options I've considered so far:
- Somehow make metas distinct (add "page 2" into each meta? add something unique to generated content?)
- "noindex, follow" the pages - this would prevent duplicate metas as well as keep the PR flowing, but I'd lose the long-tail searches.

Any thoughts on the best option to pursue when you have paginated results to display?
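The first option above (making the metas distinct) usually comes down to generating the title and description per page rather than reusing one static string. A minimal sketch of that idea in Python - the function names and formats here are purely illustrative, not from any particular CMS:

```python
# Hypothetical sketch: make each paginated page's title and meta
# description unique by appending the page number. `page_title` and
# `page_description` are illustrative names, not a real API.

def page_title(base_title: str, page: int) -> str:
    """Leave page 1 alone; append ' - Page N' to deeper pages."""
    if page <= 1:
        return base_title
    return f"{base_title} - Page {page}"

def page_description(base_desc: str, page: int) -> str:
    """Same idea for the meta description."""
    if page <= 1:
        return base_desc
    return f"{base_desc} (page {page} of results)"
```

Appending the page number is the cheapest fix for the duplicate-title reports in WMT, though (as discussed further down the thread) a single digit may not be much variation on its own.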

9:41 pm on Jul 26, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



The Toolbar PR that you now see is several months old.

Recent changes will take many months to show up.

10:18 pm on Jul 26, 2008 (gmt 0)

10+ Year Member



Well, that may be the case, but cleaning up my dupe titles and descriptions seems to be having no positive effect on traffic so far. Has anyone seen an improvement in traffic after cleaning up the duplicates reported in WMT?
10:33 pm on Jul 26, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



How long has it been since the fixes were applied?

I'd allow at least a month for things to start moving, maybe several.

2:34 pm on Jul 27, 2008 (gmt 0)

10+ Year Member



About three weeks... after about a week I saw WMT report a minor drop in dupes, and within the last few days the number of dupes reported has dropped lower every day. Upon checking WMT this morning, I see Google has dropped the number of IBLs coming into my site again. It's not reporting quite a few decent links that I know exist, but it does show many nofollowed links on pages with no PR whatsoever... go figure!
3:27 pm on Jul 27, 2008 (gmt 0)

5+ Year Member



How did I miss this thread?!

Pagination is a major issue for me as 70 percent of my traffic comes to forum content, much of which is paginated (obviously). I've spent time thinking up ways to vary the title somehow but I don't think a single digit (page 1/2/3) is sufficient variation.

I'm now wondering if I should either block everything but the first page, or perhaps change the forum display settings so that every post is on the first page, i.e. one very long page in some cases.

Both of these options seem very extreme. I have to say that some of the most useful information I've found over the years when looking for stuff has been on deep forum pages.
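A middle ground between those two extremes, if the concern is that "Page 2" alone isn't enough variation, would be to pull a short excerpt from the first post on each page into the title. A rough sketch under that assumption - `forum_page_title` and its parameters are hypothetical names for illustration only:

```python
# Hypothetical sketch: vary deep forum page titles with an excerpt of
# the first post on that page, not just the page number.

def forum_page_title(thread_title: str, page: int, first_post_on_page: str) -> str:
    """Page 1 keeps the plain thread title; deeper pages add the page
    number plus up to 50 characters of that page's first post."""
    if page <= 1:
        return thread_title
    # Collapse whitespace, then truncate to keep the title a sane length.
    excerpt = " ".join(first_post_on_page.split())[:50].rstrip()
    return f"{thread_title} - Page {page} - {excerpt}"
```

This keeps deep pages indexable (and findable for those unique strings) while giving each one a genuinely distinct title.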

[edited by: Asia_Expat at 3:28 pm (utc) on July 27, 2008]

5:30 pm on Jul 31, 2008 (gmt 0)

WebmasterWorld Senior Member quadrille is a WebmasterWorld Top Contributor of All Time 10+ Year Member



With forums, rather than, say, product pages, I think the problem is a little different.

With forums, I think there's a greater chance of people looking for very specific - often unique - strings.

"Did someone really say that before me?!" - that kind of thing!

So you want the pages in Google even if they do not have a unique title or description.

And, in most cases, the *worst* that can happen is a poorer listing than unique titles and descriptions would offer - but there's no reason why the pages should not be indexed, and therefore findable for unique strings.

With product pages, you need to decide whether you want people to go to page one - or the page that contains their exact search terms.

For me, despite the changes in Google's 'supplementary' entries, I'd usually go for letting the punters have what they want - and letting good local links help those who wanted something slightly different.

9:16 pm on Oct 9, 2008 (gmt 0)

5+ Year Member



Would it be OK to just use "index, nofollow", so that Google indexes the page but doesn't follow the links (including the pagination links)? Or is it best to use "noindex, nofollow"?
12:00 am on Oct 10, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



It's got to be your choice, based on your goals for the indexing of your urls. There's no right answer - but using noindex,follow is certainly OK.

It would keep all those deep pages out of the index, but link juice of all kinds would still flow through to them and their link targets. That might help keep confusing duplicate titles and meta descriptions out of the index, so page 1 would not get filtered out. So this choice would require that the most important items were listed on page 1, or that there was another click path to reach the deeper items.
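The choice tedster describes can be sketched as a small helper that emits the robots meta tag based on the page number - a minimal sketch only, with `robots_meta` as an illustrative name; the tag values themselves ("index,follow" / "noindex,follow") are the standard robots meta values:

```python
# Hypothetical sketch of the noindex,follow approach: page 1 stays
# indexable; deeper pages are kept out of the index but their links
# are still followed, so link value continues to flow through them.

def robots_meta(page: int) -> str:
    """Return the robots meta tag for page `page` of a paginated set."""
    content = "index,follow" if page == 1 else "noindex,follow"
    return f'<meta name="robots" content="{content}">'
```

As the post above notes, this trades away long-tail visibility on the deep pages, so it only makes sense when the important items are on page 1 or reachable by another click path.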

[edited by: tedster at 6:33 am (utc) on Oct. 12, 2008]

6:52 am on Oct 10, 2008 (gmt 0)

5+ Year Member



When I look at what Google does on its own content site, knol.google, I see that they use "index, nofollow" on all pages except the home page and the "browse" page (the sitemap page), which leads to the deeper pages. Everything else has nofollow.