
Google SEO News and Discussion Forum

    
Harry Potter and the Curse of the duplicate Meta Tags
ColedogUK



 
Msg#: 4252819 posted 1:35 pm on Jan 13, 2011 (gmt 0)

I have a site with 200,000 dynamically generated pages. The title and description meta tags are also dynamically generated, and generally this works well.

BUT where I have multiple pages of search results, results pages 1, 2, 3, 4, 5 all have the same title and meta description.

i.e.
mysite.com/productsearch/widgets1
mysite.com/productsearch/widgets2
mysite.com/productsearch/widgets3

All have:
<title>widgets for sale in your area</title>
<meta name="description" content="Find widgets for sale online in your area and buy at the best price.">

The simple solution seems to be to dynamically add the page number (i.e. page 1, page 2) to the title and description. But is this unique enough?
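Something like this rough sketch is what I mean - Python purely for illustration, with made-up names (the real pages come out of our own templates):

    # Rough sketch only - illustrative, not the actual site code.
    def paginated_meta(base_title, base_description, page):
        """Append the page number to the title/description for pages 2 and up."""
        if page <= 1:
            return base_title, base_description
        return (f"{base_title} - Page {page}",
                f"{base_description} Page {page} of results.")

    title, desc = paginated_meta(
        "Widgets for sale in your area",
        "Find widgets for sale online in your area and buy at the best price.",
        page=3)
    # title -> "Widgets for sale in your area - Page 3"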

I know SEOs love percentages for other things such as content, keyword density etc., so is there one here?

Sorry about the Harry Potter reference, but you have to TRY to make SEO more interesting, don't you? :-)

 

Mark_A

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4252819 posted 4:38 pm on Jan 13, 2011 (gmt 0)

I would have thought that Google would not be creating and then spidering searched-for pages, so what is the issue?

ColedogUK



 
Msg#: 4252819 posted 5:25 pm on Jan 13, 2011 (gmt 0)

Hi Mark - you are right, Google can't generate a search itself, so we manually create a sitemap to show the search engines where the pages are so they can spider and index them.

goodroi

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month



 
Msg#: 4252819 posted 9:07 pm on Jan 13, 2011 (gmt 0)

welcome to webmasterworld ColedogUK!

having unique title tags & meta descriptions is helpful but imho not the most important seo consideration. having significant & unique text on the page is much more important. a strong link profile, with both external and internal backlinks, also matters more than meta tags.

i worked on a 500k+ page website with so little unique content that i ended up blocking the deep pages and concentrating my seo efforts on the top 10% of pages - traffic doubled. more pages is not always better.

assuming you do have massive amounts of unique content and more backlinks than you can count, i would look at what data fields you can add to the title tag & meta description. adding "page 1" doesn't expose you to any long tail keywords. adding a city & state data field, or taking some keywords from the first search result, would be better than adding "page 1".
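something like this rough sketch - python just for illustration, with made-up field names, adapt to whatever data fields you actually have:

    # rough sketch only - invented field names, not anyone's real schema
    def build_title(search, page, first_result=None):
        """Prefer real data fields (city/state, top-result keywords) over just "page N"."""
        parts = [f"{search['keyword'].title()} for sale"]
        if search.get("city") and search.get("state"):
            parts.append(f"in {search['city']}, {search['state']}")
        if first_result:  # borrow a few words from the top result
            parts.append("- " + first_result["name"])
        if page > 1:
            parts.append(f"(page {page})")
        return " ".join(parts)

    print(build_title({"keyword": "widgets", "city": "Austin", "state": "TX"},
                      page=2, first_result={"name": "Acme Widget 3000"}))
    # -> "Widgets for sale in Austin, TX - Acme Widget 3000 (page 2)"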

good luck

ColedogUK



 
Msg#: 4252819 posted 11:55 pm on Jan 13, 2011 (gmt 0)

I agree with most of what you say, goodroi, and don't worry - 80% of my time is spent researching and producing linkable (and useful) content.

But I'm focusing on cleaning up on-page elements today. If I can improve on-site elements by 5% across 200k indexed pages, I should see a significant increase in traffic as all these pages perform slightly better in the SERPs.

My problem is that the usable data fields are already in there and identical, because the paginated pages are all results of the same search.

My question is: is adding 'page 1'/'page 2', or 'results 1 - 20'/'results 21 - 40', enough to avoid being classed as duplicate meta by Google?

Maybe I'll just have to try it and see.

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 4252819 posted 12:28 am on Jan 14, 2011 (gmt 0)

I've done something like what you are considering, and the site did show an incremental increase in search traffic to "internal" pages. This was back in 2009, and things certainly may have changed since then.

To be more exact, we differentiated the title elements at the first character, and we also scripted some extra goodies onto the end:

2. Widgets for Her | [first on page] - [last on page]
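Roughly like this - a Python sketch with invented names, not the code we actually shipped:

    # Sketch only - invented function and variable names.
    def listing_title(category, page, items_on_page):
        """Lead with the page number so titles differ at the very first character,
        then append the first and last item names shown on this page."""
        first, last = items_on_page[0], items_on_page[-1]
        return f"{page}. {category} | {first} - {last}"

    print(listing_title("Widgets for Her", 2,
                        ["Amber Widget", "Bronze Widget", "Coral Widget"]))
    # -> "2. Widgets for Her | Amber Widget - Coral Widget"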

We also made sure that the navigational links channeled link equity to those deeper pages. The other step we took was not allowing other "faceted navigation" to be indexed - just one version of each list.

Someday I hope for a project where we can create internal links that are more than "1 | 2 | 3 | 4". I'm thinking a little bit of anchor text for those links might help, too. So far, budget restrictions have tabled that idea.

Mark_A

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4252819 posted 4:28 pm on Jan 14, 2011 (gmt 0)

"Hi Mark - you are right, Google can't generate a search itself, so we manually create a sitemap to show the search engines where the pages are so they can spider and index them."


So how important are these pages, if they can only be found through site search?

Why can users not navigate to them normally?

indyank

WebmasterWorld Senior Member



 
Msg#: 4252819 posted 4:49 pm on Jan 14, 2011 (gmt 0)

You are already generating a huge volume of site search result pages and indexing them. Why do you want to index the sub-pages of a search result as well? In my opinion "noindex, follow" would be better, as I don't see a reason to get them indexed too.
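For example (just a Python sketch with a hypothetical flag - adapt to your own templates), the template could emit the robots meta only on those sub-pages:

    # sketch only - hypothetical flag names, adapt to your own page objects
    def robots_meta(is_site_search_result, page):
        """Keep search-result sub-pages out of the index but let link equity flow."""
        if is_site_search_result and page > 1:
            return '<meta name="robots" content="noindex, follow">'
        return ""  # default: indexable

    print(robots_meta(True, 3))
    # -> '<meta name="robots" content="noindex, follow">'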

Tedster, what you say is good for multi-page content on a topic. But do you recommend it for site search results?

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 4252819 posted 5:03 pm on Jan 14, 2011 (gmt 0)

No, I don't recommend trying to index pure site search results - Google has clearly asked us not to do that because it can generate infinite crawling spaces. But when you're listing the items from a type of product by using a database call, that's not quite the same thing as a true site search that is generated from an unrestricted form input.

indyank

WebmasterWorld Senior Member



 
Msg#: 4252819 posted 5:06 pm on Jan 14, 2011 (gmt 0)

"But when you're listing the items from a type of product by using a database call"


provided those items don't have their own pages that get indexed, right?

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 4252819 posted 5:16 pm on Jan 14, 2011 (gmt 0)

Lots of caveats here, depending on the specific situation, the size of the inventory offered, etc. Yes, it is often best to have a "browse" style of navigation and just let that be crawled and indexed.

indyank

WebmasterWorld Senior Member



 
Msg#: 4252819 posted 5:20 pm on Jan 14, 2011 (gmt 0)

yes, I too have read that Google does not want us to index site search results, but I will add that they don't act on those who do...

some "big sites" do this too, and it is a big seo strategy for them, as they rank far too many pages with these kinds of auto-generated titles and descriptions (for all keyword variations) for product search result pages..

tedster

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 4252819 posted 5:26 pm on Jan 14, 2011 (gmt 0)

Yes, if a site is a "big" brand, then a lot of Google's advice doesn't seem to apply, or at least not in the same way. However, the smaller site is wise, in my experience, to at least consider Google's advice before putting any plan into action. There is often an underlying technical reason for the advice and it is essentially there to help.

You may decide that the advice doesn't apply to your site - and that's fine, it is your site after all.

indyank

WebmasterWorld Senior Member



 
Msg#: 4252819 posted 5:32 pm on Jan 14, 2011 (gmt 0)

you are dead right there :)

Sgt_Kickaxe

WebmasterWorld Senior Member, Top Contributor of All Time



 
Msg#: 4252819 posted 8:49 pm on Jan 14, 2011 (gmt 0)

Worry about making sure they are all accurate; don't worry about duplicates. If adding "page xx" makes the category page accurate, go with that regardless of what GWT says. It has to make sense to visitors.

aakk9999

WebmasterWorld Administrator 5+ Year Member



 
Msg#: 4252819 posted 10:44 pm on Jan 14, 2011 (gmt 0)

My take on this is the following - let's say you sell a broad product line of Widgets and have three categories of products:

Wadgets, Wodgets and Wudgets

Each of these categories has 50 products and you can list 10 products per page, i.e. you would have 5 pages per category, or 15 pages of listings if you list all Widgets together.

Let's say that:

1. Your listing pages show some kind of product summary for every product (a sentence or perhaps two) which, in the best case, is not a duplicate of a sentence from the product detail page itself and, in the worst case, is perhaps the first sentence or two (max) of the much more detailed description on the product detail page. E.g. the listing shows products like:

Wadget A - a beautiful top-of-the-range wadget that performs all necessary wadget functions such as aaa, bbb, ccc. Price x.xx

Wadget B - this is one of the smallest wadgets that can be found on the market. Whilst it does not do ccc, you get a great experience for aaa and bbb at a much lower price. Price x.xx

Wadget C - ..... etc.

2. Each of your category pages can be sorted by price, size, etc. in ascending/descending order, which shows the products with the same short info but in a different order

3. Each of the category pages can also be filtered by some characteristic (e.g. only show wadgets that perform aaa)

4. You can also have search results that span all three categories together (e.g. show all wadgets, wodgets and wudgets)

So I would aim to rank the category pages described under point 1. The page title (and description) for page 2 and subsequent pages would have "- Page n" added. Alternatively, Tedster's idea of including the names of the first and last product listed in the title is an excellent one (if product naming conventions allow this to look sensible), though I have not tried that.

Points 2, 3 and 4 would best be set to noindex, follow, as these are just permutations of the listing and there can potentially be many of them, depending on how many search criteria you offer. If you can stop the point 2, 3 and 4 URLs being exposed to search engines at all, even better (for example, by using a search form with postback, although this is not a guarantee either, since if the postback returns a URL with a query string, someone can cut/paste that query string and link to it, and it therefore gets exposed to Google).

There is also the possibility of:
- Stopping points 2, 3 and 4 through robots.txt, but this may create a very long list of URLs disallowed by robots.txt being reported in WMT
- Using a canonical on the listing pages created in point 2, pointing to the URL created in point 1. It is more difficult to employ the same for points 3 and 4, as those pages either span all three categories or are a subset of one category, so they are not really canonical versions of the category listing pages you want indexed (see the rough sketch below)
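To make the noindex/canonical decision above concrete, a rough sketch (Python, with made-up query parameter names - not a real implementation):

    # Rough sketch only - made-up query parameter names, adjust to your own URLs.
    from urllib.parse import urlparse, parse_qs

    def head_tags_for(url, category_url=None):
        """Point 1 pages: leave indexable. Points 2/3/4 (sort/filter/search
        permutations): noindex, follow. A sorted-only variant (point 2) could
        instead carry a canonical back to the plain category listing."""
        params = parse_qs(urlparse(url).query)
        sorted_only = "sort" in params and not ({"filter", "q"} & params.keys())
        if sorted_only and category_url:
            return f'<link rel="canonical" href="{category_url}">'
        if {"sort", "filter", "q"} & params.keys():
            return '<meta name="robots" content="noindex, follow">'
        return ""  # plain category listing (point 1): indexable

    print(head_tags_for("http://example.com/wadgets?page=2"))        # indexable
    print(head_tags_for("http://example.com/wadgets?sort=price",
                        "http://example.com/wadgets"))               # canonical
    print(head_tags_for("http://example.com/search?q=wadgets+aaa"))  # noindex, follow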

However - I do have a case where I let Google index both the listing page as in point 1 (all Wadget products) and a listing page as in point 3 (only Wadgets with the aaa characteristic, i.e. a filtered-down Wadget listing), and each ranks for different terms. We do, however, make sure that in both cases the products displayed on the first page of point 3 are a different product mix to the products shown on the first page under point 1.

But if the concern is not so much ranking subsequent and/or category pages, but just "fixing" Google Webmaster Tools, then adding "- Page x" does get rid of the duplicate title/description message in WMT.
