So I am planning to split the heavy pages. My problem is that if I split one into 10 pages, all of them will have almost the same title, "New York Blue Widgets", and Google may not know which one is best for the query.
I am thinking I should put a meta "noindex,follow" on every page of the pagination except the first, but I am not sure if this is what I am supposed to do. The Google guidelines don't help me with this problem. Do you use pagination on pages with 100k+ results?
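A minimal sketch of that "noindex,follow on everything but page 1" idea, assuming a simple template helper (the function name is illustrative, not anything Google prescribes):

```python
def robots_meta(page_number):
    """Return a robots meta tag for a paginated result page.

    Sketch only: page 1 stays indexable, deeper pages are kept out of
    the index while their links remain crawlable.
    """
    if page_number == 1:
        # First page of the set: let Google index it normally.
        return '<meta name="robots" content="index,follow">'
    # Pages 2..N: noindex, but still follow the links on the page.
    return '<meta name="robots" content="noindex,follow">'

print(robots_meta(1))   # index,follow
print(robots_meta(5))   # noindex,follow
```

Whether this is the right call is exactly what the thread debates; the snippet only shows the mechanics.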
I would be curious to hear if anyone has experienced any (possibly duplicate-content) penalties through repeating title tags in the above context.
1) it enhances the usability of the pages for tabbed browsing
2) the prominence of the different parts of the title may also help Google
3) when several page titles appear in any context at all, it makes the list easier to scan
You also can and should give each page its own dedicated description and a unique H1 element (even though the variation between pages will be slight).
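One way to produce that slight per-page variation is a "Page N of M" suffix on the title, description, and H1. A sketch, assuming a hypothetical helper function:

```python
def page_head_parts(topic, page, total_pages):
    """Build a per-page title, meta description, and H1 text.

    Illustrative only: appends 'Page N of M' so paginated pages
    don't all share an identical title.
    """
    suffix = "" if page == 1 else f" - Page {page} of {total_pages}"
    title = f"{topic}{suffix}"
    # Description varies by page number as well.
    description = f"Browse {topic}, page {page} of {total_pages}."
    h1 = title  # H1 mirrors the title here; it could differ.
    return title, description, h1

title, desc, h1 = page_head_parts("New York Blue Widgets", 2, 10)
print(title)  # New York Blue Widgets - Page 2 of 10
```

Page 1 keeps the clean, unsuffixed title, which also matches the poster's wish that Google favor the first page of the set.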
Even if the Google algo doesn't sort it all out well now, it may in the future, or it may do OK for just some searches. If you use noindex in the robots meta tag, then there's NO chance for Google to return the exact page.
My first idea was putting 'noindex,follow' on all pages of every set but the first, because I only want Google to link to page 1 of "New York Blue Widgets". So maybe it's better to do what you say, tedster, and let G index everything.