Forum Moderators: Robert Charlton & goodroi


e-commerce category filters - index or noindex?


Kikko

7:07 am on Mar 30, 2016 (gmt 0)

10+ Year Member



Hello,

What is the current trend in online shops with filtering? Should category filters be indexed or "noindex, follow"?

My current situation is that I have hundreds of thousands of URLs with filter parameters indexed. They bring about 4% of organic traffic.
Recently I added global canonicals pointing to the corresponding category URLs. This resulted in a drop to 1% of organic traffic from results with filter parameters.
Now, should I noindex them and prune this massive number of indexed URLs (they exceed the actual number of pages in my shop by 20x), or should I remove the canonicals to at least bring back the 4%?
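For anyone following along, the two options translate to different tags in the <head> of each filtered URL. A rough sketch (the URLs here are hypothetical, not from the poster's site):

```html
<!-- Option A: canonical pointing the filtered URL at its parent category -->
<!-- placed on e.g. /shoes?color=red -->
<link rel="canonical" href="https://example.com/shoes">

<!-- Option B: keep the filtered URL out of the index,
     but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

Option A consolidates signals onto the category page (a hint, not a directive); Option B removes the filtered URL from the index outright.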

Regards!

JS_Harris

8:51 am on Mar 30, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It sounds like you have a complex set of issues to tackle. My recommendation is that you come up with a comprehensive plan rather than testing individual changes for effect, like the canonicals.

Ask yourself whether the pages being returned for any given keyword/phrase are the ones you want ranked. If you have several different versions of the same content fighting against each other, you'll want to decide which are your best and make sure those are the ones returned. Without seeing the site I hesitate to say much more, but I do know that if you have hundreds of thousands of pages indexed and almost none receiving traffic, you have too many indexed pages.

Kikko

1:55 pm on Mar 30, 2016 (gmt 0)

10+ Year Member



Actually, only a few (say 10-15) of those >200k indexed filters are really useful; the rest are pretty much useless. But I can't keep only those few indexable and noindex the rest, because my CMS doesn't allow it.

timemachined

11:59 am on Apr 9, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



I block everything that isn't a content page and only allow media placed within page content to be indexed. Tags, categories, attachments, image galleries, images that just open up to a retailer: everything else G could possibly discover gets noindexed. G is like a naughty child climbing on a stool and prying open the lock on the cabinet containing detergents.

Do some comparisons, as results vary per site. I only have two sites, but G does exactly the same thing to both, and I've been fighting G for two years: removing, fetching. Noindex works best, but because G has two rules instead of one, you have to watch what you add to robots.txt. G will only honor one or the other, the directive in the page source or the rule in robots.txt. Stupid game.
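To spell out the "two rules" problem: a robots.txt Disallow stops the crawler from fetching a page at all, so a noindex tag inside that page is never seen, and the URL can stay indexed from external links. A sketch (the /filter/ path is hypothetical):

```
# robots.txt -- do NOT combine this with an on-page noindex:
# a disallowed URL is never fetched, so its meta robots tag is invisible to G
User-agent: *
Disallow: /filter/
```

If you want the noindex honored instead, leave the URLs crawlable (no Disallow) and put `<meta name="robots" content="noindex, follow">` in the page source.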

Compare a range of terms and see whether the category/tag page or home page shows above the post; if so, you'd still expect the relevant post to rank higher. Do a G removal on the tag and, after a day, a G fetch on the post. See if the post shows higher than the previous category/tag/home page position and stays there. If the post shows higher, block all the other pages.

In the main, I wouldn't expect a home page, a tag or category archive, or a single phrase mentioned in a post's sidebar to rank above the proper page. It's about forcing G to notice the right one, and I'm sure I'm not the only one fed up with G's inability to rank the correct page. But as I mentioned elsewhere, I'm sure as a company they know what they're doing, so as to get businesses to pay for ads.

Let's be honest: if a spider goes through a website and chooses a page with a single mention of a phrase over a page dedicated to that phrase (not over-optimised, just as you'd expect), it drives me crazy. Either they do it on purpose, or their engineers are stupid.