
Using Canonical tag on search results refine pages

     

speedshopping

4:06 pm on Jan 29, 2011 (gmt 0)

10+ Year Member



Hi,

We recently had a big drop of traffic to our search pages (December 29) and have yet to recover. We are considering using the canonical tag to help solve what we believe is an internal duplication problem.

Here is the scenario:

1) We moved to a much bigger server in November, which resulted in Google indexing tens of thousands of "refine" style search pages.

So, for example, we have a primary search page which previously ranked well:

domain.co.uk/search1/keyword/

However, we also have links which basically refine the search by category, such as:

www.domain.co.uk/search1/keyword/category/

Although the two pages are not identical duplicates (they pull refined results for the same keyword but have different on-page text and meta data), would it be wise to use the canonical tag to point Googlebot to the main search page? Google has indexed tens of thousands of these similar pages, and we are wondering whether this has contributed to the drop in rankings.
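For illustration, each refine page would carry a canonical link element in its <head> pointing back at the main search page, roughly like this (the URLs just follow the pattern above):

<!-- on www.domain.co.uk/search1/keyword/category/ -->
<link rel="canonical" href="http://www.domain.co.uk/search1/keyword/" />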

2) Just to add more complications, we have a 2nd search link:

domain.co.uk/search2/keyword/

which is linked from the /search1/ page; this page's structure works in a similar way, with refine URLs branching off it.

On our previous server, we think the relatively low crawl bandwidth available to Googlebot meant that far fewer of these refine-type pages were being indexed, so we avoided the internal duplication.

Will the canonical tag help in these cases? And do you think this has been the major cause of the dropped rankings?

Any help is appreciated.

Cheers,
SS

tedster

6:58 pm on Jan 29, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



This seems like a type of faceted search result. The canonical link seems like a pretty big stretch to me in these situations, but I know sites are doing it.

I prefer not to allow faceted searches into the index at all - just one version of each result set. The crawling has got to be more effective that way, as I see it.

MonkeyFace

7:57 pm on Jan 30, 2011 (gmt 0)

5+ Year Member



Aren't search results supposed to be noindex?

tedster

8:37 pm on Jan 30, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Yes, if they are true site search results. But many sites are built on a PHP/MySQL framework or something similar, so any category page is, strictly speaking, a database search result. That is part of my thinking in recommending noindex for any faceted search or re-ranking your site can generate.
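For example, the template for any faceted or refine page could emit something like this in its <head> (just a sketch - the rule for when to emit it depends on how your pages are generated):

<meta name="robots" content="noindex, follow">

The "follow" part still lets Googlebot crawl through the links on those pages.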

I would, however, suggest not using the word "search" in the URL itself. Even more important, work to get some kind of direct navigation in place (drill-down browsing rather than search).

Robert Charlton

1:27 am on Jan 31, 2011 (gmt 0)

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



The choice of categories also needs to be limited and strategic. It's not just crawl bandwidth you need to be concerned about... it's also PageRank distribution and a clear navigation hierarchy.

If you have a hotel site, you probably want to categorize for Google by city. You probably don't want Google to index categorization, say, by smoking vs no smoking or by price.

The navigation structure needs to reflect those priorities so that the pages which you choose for Google to index are also structurally your most important pages.

Even with a robots "noindex,follow" meta tag in place, you should avoid building a structure that puts your noindexed pages up near the top of your hierarchy. And do not use rel="nofollow" to shape PageRank flow.
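To make that concrete (the paths here are hypothetical, in the spirit of the hotel example):

<!-- on a facet page you don't want indexed, e.g. a price filter -->
<meta name="robots" content="noindex, follow">

<!-- link to it normally from the navigation; no rel="nofollow" needed -->
<a href="/hotels/london/under-50/">Hotels under £50</a>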
 
