|Faceted Navigation - Best & Worst Practices|
| 8:31 pm on Feb 12, 2014 (gmt 0)|
|Faceted navigation, such as filtering by color or price range, can be helpful for your visitors, but it’s often not search-friendly since it creates many combinations of URLs with duplicative content. With duplicative URLs, search engines may not crawl new or updated unique content as quickly, and/or they may not index a page accurately because indexing signals are diluted between the duplicate versions. To reduce these issues and help faceted navigation sites become as search-friendly as possible, we’d like to: |
- Provide background and potential issues with faceted navigation
- Highlight worst practices
- Share best practices
  - New faceted navigation implementations or redesigns
  - Existing faceted navigation implementations
Google released this today. It's actually pretty useful; even if you don't have to use faceted navigation there are other insights to be gleaned here from what Google considers best and worst practices.
| 12:05 pm on Feb 13, 2014 (gmt 0)|
Very useful read indeed!
|Alternatively, consider placing user-generated values in a separate directory and then robots.txt disallow crawling of that directory. |
Reading this, I am wondering whether there is a difference in how Google handles a disallowed directory versus a disallowed URL pattern. The blog post recommends disallowing a directory, but I always thought disallowing the equivalent URL pattern would have the same benefit.
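To make the two approaches concrete, here is a minimal robots.txt sketch; the directory name and parameter are hypothetical placeholders, not anything from the blog post:

```
User-agent: *
# Directory approach (what the blog post recommends):
# block everything under a dedicated filter directory
Disallow: /filtered/
# Pattern approach: block any URL carrying a filter
# parameter (Googlebot supports the * wildcard)
Disallow: /*?*color=
```

Both rules keep Googlebot out of the matching URLs; the directory form is simply easier to reason about and less likely to match something unintended.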
| 8:46 pm on Feb 13, 2014 (gmt 0)|
For as long as I can remember, we have been operating according to what G now reveals here as "best practices" - including the recently discussed ad placement / content above fold guidelines (no ads!). So, where are our rewards? (Nowhere to be seen . . . .).
| 10:23 pm on Feb 13, 2014 (gmt 0)|
FWIW, this problem is nothing new, which is why I've been stuffing canonical tags and redirects into websites for years so that pages with all the options in the URL don't get indexed unless we explicitly want them to be. While you can't control what links people make to your website, you can certainly control what ends up in Google by refusing to allow those bogus links to be indexed, which I've done for many years.
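For anyone who hasn't used it, the canonical-tag approach described above boils down to one line in the head of each faceted variant; the URL here is just a placeholder:

```
<!-- On /widgets/?color=blue&sort=price and every other
     faceted variant, point back at the clean version -->
<link rel="canonical" href="http://www.example.com/widgets/">
```

Google then consolidates indexing signals from all the parameterized variants onto the one clean URL.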
I'm just shocked it took them over 10 years to publicly address the problem after the rest of us battled it out and resolved it a long time ago.
Oh well, give them enough time and they'll even figure out authorship.
|So, where are our rewards? |
Google is training you: Jump little monkey jump!
Google has somehow managed to single-handedly become synonymous with the Internet and webmasters blindly follow whatever they say like the trained minions they have become.
George Orwell would truly love the irony that we invited it and did it to ourselves, that no political or government organization forced it on us. Or would he?
| 12:33 am on Feb 14, 2014 (gmt 0)|
|"Google is training you: Jump little monkey jump!" |
|"The green reed which bends in the wind is stronger than the mighty oak which breaks in a storm." Confucius |
While our applying "common sense" to site layout and URLs predates Google's "best practice" "directives" by more than a decade, we find adapting to the current environment, whenever necessary, a "realpolitik" much preferable to heroic death.
|"Adapt or perish, now as ever, is nature's inexorable imperative." |
| 12:46 am on Feb 14, 2014 (gmt 0)|
|I'm just shocked it took them over 10 years to publicly address the problem after the rest of us battled it out and resolved it a long time ago. |
Maybe with the growth of the web these multiple URLs are becoming a bigger issue as they require additional crawl capacity, hence they are addressing it now.
| 1:02 am on Feb 14, 2014 (gmt 0)|
|Maybe with the growth of the web these multiple URLs are becoming a bigger issue |
I've been doing ecommerce since '96 and they've always been a problem.
Google didn't even exist yet, this is 18 years later.
My point was that they never tried to help us figure it out back in the day (trust me, we weren't shy about asking) and just left us twisting in the wind, as Google is wont to do, until canonical tags finally came along.
More transparency wouldn't kill them, a little late is better than never I guess.
| 6:46 am on Feb 14, 2014 (gmt 0)|
I find it a bit amazing that after all these years G is providing SOME insight into what they really like... and apparently it's straining their capacity just enough that they shared this info. :)
Can you imagine how different the web might have been if G had shared their good tips, with examples, early on? ... Oh... that might have revealed how to game their system.
| 5:18 pm on Feb 14, 2014 (gmt 0)|
I use a form POST triggered onClick for filtered results, and the target content is NOINDEXed. Every request is checked to ensure only the proper params are passed in the URL, in the correct case and order. The advanced site search form is disallowed in the robots file. Nothing gets indexed or crawled that I don't want to be.
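A minimal sketch of the parameter check described above, assuming a hypothetical whitelist of parameter names; it rejects any request whose query string contains unknown params, wrong case, or the wrong order:

```python
from urllib.parse import urlsplit, parse_qsl

# Hypothetical whitelist, in the one order we accept
ALLOWED_PARAMS = ["category", "color", "page"]

def is_clean_url(url):
    """Accept only URLs whose query params match the whitelist
    exactly: known names, exact case, whitelist order."""
    names = [name for name, _ in parse_qsl(urlsplit(url).query)]
    positions = []
    for name in names:
        if name not in ALLOWED_PARAMS:   # case-sensitive check
            return False
        positions.append(ALLOWED_PARAMS.index(name))
    # params must appear in non-decreasing whitelist order
    return positions == sorted(positions)
```

Requests that fail the check can be answered with a 404 or a noindexed page, so funky links dropped by competitors never produce an indexable URL.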
Otherwise it's target practice for competitors posting funky links to the site, followed by a dive in the SERPs. Been there, unfortunately.
Not user friendly, but then again, we live in a Gorg's world.
|I'm just shocked it took them over 10 years to publicly address the problem after the rest of us battled it out and resolved it a long time ago |
Several sites got hit in Florida, cleaning up and locking down URLs did the trick, some time later....