|Is pagination a good SE practice?|
Dealing with +100k pages
| 2:32 pm on Feb 10, 2006 (gmt 0)|
Redesigning my "blue widgets" site to fit Google's guidelines, I find that my "New York Blue Widgets" page contains 350 links and is too heavy (300k+), so the Google cache truncates it, and the guidelines recommend making smaller pages.
So I am planning to split the heavy pages. My problem is that if I split one into 10 pages, all of them will have almost the same title, "New York Blue Widgets", and Google may not know which is the best one for that query.
I am thinking I should put a meta "noindex,follow" on every pagination page but the first, but I am not sure if this is what I am supposed to do. Google's guidelines can't help me with this problem. Do you use pagination with 100k+ results pages?
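For reference, the tag being proposed here would sit in the `<head>` of every paginated page after the first. This is just a sketch of the standard robots metatag, not anything specific to this site:

```html
<!-- On pages 2..N of the "New York Blue Widgets" set only -->
<!-- noindex: ask search engines to keep this page out of the index -->
<!-- follow:  still let the bot crawl the links the page contains -->
<meta name="robots" content="noindex,follow">
```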
| 5:08 pm on Feb 10, 2006 (gmt 0)|
Are you sure 100k+ pages with 350 links aren't SE friendly? Wow, I should change several parts of my site...
| 5:21 pm on Feb 10, 2006 (gmt 0)|
I too have a site with pages devoted to variously colored widgets in many cities. For the past 5 years or so they have been paginated in chunks of 10-15 widgets per page, and I am not sure whether that is good or bad, only logical. One thing I can tell you is that the occurrence of Google referrals to a "NewYorkBlueWidgets2.html" page which shares the same title with the pages before or after it is rare.
I would be curious to hear whether anyone has experienced any (possibly duplicate-content) penalties from repeating Title tags in the above context.
| 5:27 pm on Feb 10, 2006 (gmt 0)|
I would paginate and let googlebot have access to all the pages. I'd also suggest putting the differing portion of the title in the first position -- for example "A-F New York Blue Widgets" rather than "New York Blue Widgets A-F". This accomplishes a few things:
1) it enhances the usability of the pages for tabbed browsing
2) the prominence of the different part of the title may also help Google
3) when several page titles appear in any context at all, it makes the list easier to scan
You also can, and should, give each page its own dedicated description and a unique H1 element (even though the variation between the pages will be slight).
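Putting the suggestions above together, one page of the set might look roughly like this. The "A-F" split and the wording are just placeholders for whatever divisions the site actually uses:

```html
<head>
  <!-- differing portion of the title first, as suggested above -->
  <title>A-F New York Blue Widgets</title>
  <!-- a dedicated description, unique to this page of the set -->
  <meta name="description"
        content="Blue widgets in New York, makers A through F.">
</head>
<body>
  <!-- a unique H1, varied slightly from the other pages -->
  <h1>New York Blue Widgets: A-F</h1>
  ...
</body>
```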
Even if the Google algo doesn't sort it all out well now, it may in the future, or it may do OK for some searches already. If you use noindex in the robots metatag, then there's NO chance for Google to return the exact page.
| 5:29 pm on Feb 10, 2006 (gmt 0)|
I've never seen a duplicate content problem coming from page titles alone -- it is, after all, a "content" issue most of all.
| 5:41 pm on Feb 10, 2006 (gmt 0)|
|If you use no-index in the robots metatag, then there's NO chance for Google to return the exact page. |
My first idea was putting 'noindex,follow' on all pages of every set but the first, because I only want Google to link to page 1 of "New York Blue Widgets". But maybe it's better to do what you say, tedster, and let G index everything.