

Duplicate tags showing on Google Webmaster Tools


My_Media

11:04 pm on Jun 5, 2012 (gmt 0)



Webmaster Tools is showing that I have some 300 duplicate title tags. Most of these are my WordPress tag pages, which are auto-generated to list the posts. I have two questions:

1. Has anybody experienced this duplicate-tags issue in Google Webmaster Tools, and what exactly does it mean?
2. Should these pages be blocked in robots.txt?

Here is the snippet I'm using in my theme's header (I said robots.txt, but this is actually PHP that outputs a noindex meta tag). Any opinions?

if (is_category() || is_tag() || is_page()) {
    if ($paged > 1) {
        echo '<meta name="GOOGLEBOT" content="noindex,follow" />';
    }
}
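For reference, if I were to actually block these in robots.txt instead, I believe it would look something like this (assuming the default WordPress tag permalink structure under /tag/, which may not match my setup):

```
# Hypothetical robots.txt rule blocking WordPress tag archives
User-agent: *
Disallow: /tag/
```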

Sgt_Kickaxe

11:50 pm on Jun 5, 2012 (gmt 0)




Some believe you should use canonical tags to point every $paged page at the first page. I don't suggest you do this, since each archive page is technically not the same page.

Some believe you should noindex page two onwards. I don't suggest this either, since it's possible one of those pages actually provides the answer someone is looking for (unlikely, but still).

Some believe you should robots.txt-block everything where $paged > 1, and I definitely don't suggest that; the benefits of having googlebot crawl your archives likely outweigh the cons.

My suggestion: employ the rel="next" and rel="prev" tags on your archive navigation links to tell Google this is an archive. Then ignore the messages in GWT.
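A rough sketch of what that could look like in a theme's header.php (untested; assumes the standard WordPress paging variables and functions like get_query_var() and get_pagenum_link(), so adjust for your own setup):

```php
<?php
// Sketch: emit rel="prev"/rel="next" links on paged archives
// so Google can tie the archive series together.
if (is_category() || is_tag() || is_archive()) {
    $paged = max(1, (int) get_query_var('paged'));
    if ($paged > 1) {
        echo '<link rel="prev" href="' . esc_url(get_pagenum_link($paged - 1)) . '" />' . "\n";
    }
    if ($paged < $GLOBALS['wp_query']->max_num_pages) {
        echo '<link rel="next" href="' . esc_url(get_pagenum_link($paged + 1)) . '" />' . "\n";
    }
}
?>
```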

If the messages in GWT REALLY bother you, add this to your WordPress page titles to make them a little more unique:

if ($paged > 1) {
    echo ' Page ' . $paged;
}


Titles will then become "The Best Darned Widgets In The World Page 9" which *might* be enough to get them to stop being flagged as identical. The visitor will also know they are inside an archive instead of on its first page.
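If you'd rather not edit the title markup in the template directly, the same thing can be done with a filter in functions.php. A sketch (untested; the function name ww_paged_title is just an example, not a WordPress name):

```php
<?php
// Sketch: append the page number to titles on paged views
// via the wp_title filter in functions.php.
function ww_paged_title($title) {
    $paged = max(1, (int) get_query_var('paged'));
    if ($paged > 1) {
        $title .= ' Page ' . $paged;
    }
    return $title;
}
add_filter('wp_title', 'ww_paged_title');
?>
```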

Str82u

12:22 am on Jun 6, 2012 (gmt 0)



I did something like what Sgt_Kickaxe suggests. About 1000 of our pages were flagged like this because of the mobile version and internal search results pages. For the search results, adding the page number does the trick, but it takes a while for the list in Webmaster Tools to go down. The duplicates caused by the mobile version I'm ignoring for now; Matt Cutts mentioned in a video (http://www.youtube.com/watch?v=mY9h3G8Lv4k) that a mobile version should be recognized, but may have hinted that a separate domain or subdomain is better.
 
