Pagination - Best Approach for Google?

6:30 pm on Mar 11, 2010 (gmt 0)

Junior Member

5+ Year Member

joined:Feb 21, 2010
posts: 47
votes: 0


What tips or advice can people give for paginating large categories so they remain Google-friendly and don't cause duplicate content penalties?
8:19 pm on Mar 11, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 6, 2002
posts:1826
votes: 21


For MySQL, a GROUP BY in the query will stop duplicate content appearing across your paginated pages.
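
A minimal sketch of that idea, assuming a hypothetical products table and using sqlite3 as a stand-in for MySQL: GROUP BY collapses duplicate rows before LIMIT/OFFSET slices the page, so the same item can't surface on two different pages.

import sqlite3

PAGE_SIZE = 10

def fetch_page(conn, category_id, page):
    # Return one page of distinct item names for a category.
    offset = (page - 1) * PAGE_SIZE
    cur = conn.execute(
        """
        SELECT name
        FROM products
        WHERE category_id = ?
        GROUP BY name              -- collapse duplicate rows into one
        ORDER BY name
        LIMIT ? OFFSET ?
        """,
        (category_id, PAGE_SIZE, offset),
    )
    return [row[0] for row in cur.fetchall()]

# conn = sqlite3.connect("catalog.db")   # a MySQL connection would use %s placeholders instead of ?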
10:24 pm on Mar 11, 2010 (gmt 0)

Junior Member

5+ Year Member

joined:Feb 21, 2010
posts: 47
votes: 0


Thanks SEOPTI, that's a good tip :) But the categories aren't duplicating in that way.

I was thinking more about duplicate titles and meta descriptions across paginated pages possibly causing an issue.

Has anyone experienced that?
10:44 pm on Mar 11, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 6, 2002
posts:1826
votes: 21


Just use the first element from your body in your title + description for each pagination URL.
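
A rough sketch of that suggestion; the items structure and the 155-character description cut-off are illustrative assumptions, not anything stated in the thread:

import html

def page_meta(items, category_name, page):
    # items: the records actually rendered on this page, in display order.
    if not items:
        return f"{category_name} (page {page})", ""
    first = items[0]
    title = f"{first['title']} - {category_name} (page {page})"
    description = html.escape(first["summary"])[:155]
    return title, description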
11:30 pm on Mar 11, 2010 (gmt 0)

Junior Member

5+ Year Member

joined:Feb 21, 2010
posts:47
votes: 0


Thanks SEOPTI, I have now done that. Good to hear that is enough :)
11:30 pm on Mar 11, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


The biggest problem with pagination is one that I rarely - if ever - see addressed.

What happens to content in categories that are growing, where something is on page 2 this week, is moved to page 3 next week, and then moved to page 4 the week after?
11:42 pm on Mar 11, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 6, 2002
posts:1826
votes: 21


g1smd, in that case the <title> of each pagination URL would change, since it is taken from the nth element of the body; this approach is therefore better suited to static sites.
2:03 am on Mar 12, 2010 (gmt 0)

Junior Member

5+ Year Member

joined:Feb 21, 2010
posts:47
votes: 0


I think g1smd makes a good point: content can move about as pagination shifts. Content can also shrink, so where there were once 33 items there may now be only 25. On a static site you would be left with a page 3 of a category with no content, and on many dynamic sites you could end up with an orphaned page 3 that is still served but has no content, yet keeps getting indexed because it was previously spidered. To try to solve this I serve 404s in that circumstance, and end up with a supplemental result. Ahhh! Pagination is hard.
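
One hedged sketch of that 404 approach: work out how many pages actually exist and refuse anything beyond that, instead of serving an empty page 3. The count_items helper is hypothetical (e.g. a COUNT(*) query) and the page size is arbitrary.

import math

PAGE_SIZE = 10

def page_or_404(category_id, requested_page, count_items):
    # count_items(category_id) is a hypothetical helper returning the current item count.
    total_pages = max(1, math.ceil(count_items(category_id) / PAGE_SIZE))
    if requested_page < 1 or requested_page > total_pages:
        return None   # caller answers with a 404 (or 410) for pages that no longer exist
    return requested_page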

Thanks for the replies.
3:04 am on Mar 12, 2010 (gmt 0)

Full Member

5+ Year Member

joined:Sept 30, 2006
posts:332
votes: 0


Alternatively, serve a version of your site to Googlebot which has no pagination - just one long page with all the items. This is OK if you have fewer than, say, 10 pages of 10 items each. Then there are no pagination issues, no dupe content issues, and no worries about adding to or deleting from the list.
3:24 am on Mar 12, 2010 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 16, 2004
posts:91
votes: 0


@helpnow so you're recommending cloaking? :D

You have to paginate, there's no way around it. Solve the duplicate titles issue and find somewhere to place the "most important articles" in your sidebar, at least on the first of the paginated pages, so these do not get lost.

Also, deep-link from one article to another, and don't forget the old ones. That way you're -almost- done.
3:32 am on Mar 12, 2010 (gmt 0)

Full Member

5+ Year Member

joined:Sept 30, 2006
posts:332
votes: 0


Cloaking is serving different information to the real world than to a bot, typically in an attempt to deceive the bot. I don't consider it cloaking if you serve the same info from 3 pages on just one page for a bot, instead of spread out over 3 pages. The bot sometimes treats page 1 as the most important anyway, and sometimes ignores pages as you go down the row - by page 10, the bot usually walks away. We rarely got a paginated page beyond page 20 indexed anyway. Might as well make its job easier and just give it everything (or up to 100 results or so) on page 1, and nofollow/noindex the rest of the pages.

Anyway, this has worked fine for us. Better, in fact. But, -shrugging-... each to his/her own! ; )
9:10 am on Mar 12, 2010 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 16, 2009
posts:995
votes: 48


If we're talking about paginating items in a list (articles, widgets, whatever) then I find that just inserting 'Page 1', 'Page 2', etc. at the end of the title tag will make it sufficiently unique. I'd not have a description tag for these pages though.

Watch the URLs, though, to avoid dupes:

/pagination.php
/pagination.php?page=1
/pagination.php?page=first

all the same.

So either set up 301s, use the canonical tag, or (best solution in my view) sort out your pagination function.
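
A possible sketch of the "sort out your pagination function" option, using the URLs above; the redirect mechanics depend on your framework, so only the canonical-URL decision is shown:

def canonical_pagination_url(path, page_param):
    # Collapse ?page=1, ?page=first and a missing parameter onto the bare URL.
    if page_param in (None, "", "1", "first"):
        return path
    if not str(page_param).isdigit():
        return path   # unrecognised values fall back to the bare URL too
    return f"{path}?page={int(page_param)}"

# canonical_pagination_url("/pagination.php", "first") -> "/pagination.php"
# If the requested URL differs from this, 301 to the canonical form
# (or at least emit it in a rel="canonical" link element).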

Another thing I see is pagination classes that don't show enough page links, i.e. there are 20 pages of content but the pagination goes

1 | 2 | 3 | 4 | 5

then

2 | 3 | 4 | 5 | 6

I'd always try to show the maximum number of page links in the pagination.
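
For instance, a window that always shows as many page links as it can (the 10-link width here is an arbitrary illustration), rather than a sliding 5-link strip:

def page_window(current, total_pages, window=10):
    # Return up to `window` page numbers, keeping the window as full as possible.
    if total_pages <= window:
        return list(range(1, total_pages + 1))
    start = max(1, min(current - window // 2, total_pages - window + 1))
    return list(range(start, start + window))

# page_window(3, 20)  -> [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
# page_window(18, 20) -> [11, 12, 13, 14, 15, 16, 17, 18, 19, 20]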

Finally, if you have more than 20 pages for anything other than search results, I personally would say you need to look at your site hierarchy again in most cases.
12:41 pm on Mar 12, 2010 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 10, 2007
posts:145
votes: 0


I use dynamic description tags and page numbers in the title, which Google seems to understand pretty well because it often indents multiple pages from a thread when there are no other results on that topic...
11:42 pm on May 9, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member

joined:Apr 14, 2010
posts:3169
votes: 0


Has anyone with a very large category of page after page of the same TYPE of articles/items/whatever tried to use the canonical tag to point all pages to page 1? All of the actual articles being linked to from pages past page 1 could be linked to via other relevant articles to avoid them being lost.

~ This wouldn't cut off any articles but would make for an interesting internal link structure.
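
What that would look like as a sketch (the category URL is a made-up example); note the reply below, which suggests Google may ignore the hint when the pages aren't near-duplicates:

def canonical_link(category_url, page):
    # Every page after the first declares page 1 (the bare category URL) as canonical.
    if page <= 1:
        return ""   # page 1 needs no hint
    return f'<link rel="canonical" href="{category_url}">'

# canonical_link("https://example.com/widgets/", 4)
# -> '<link rel="canonical" href="https://example.com/widgets/">'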
11:45 pm on May 9, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


I do know one business that tried this - and it seemed that Google ignored the tag. They do reserve that right if the suggested canonical page isn't a very close match to the originally requested URL.

It sounds like you would get the effect you're after with a noindex,follow on the deep pages.
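
A minimal sketch of that noindex,follow idea, assuming the template knows the current page number:

def robots_meta(page):
    # Pages beyond page 1 are not indexed, but their links are still followed.
    return '<meta name="robots" content="noindex,follow">' if page > 1 else ""

# Page 1 stays indexable; every deeper page emits the noindex,follow tag.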
1:20 am on May 10, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 27, 2003
posts:732
votes: 0


Great tips, FranticFish.