
Google SEO News and Discussion Forum

    
Faceted Navigation - Best & Worst Practices
netmeg
msg:4644557 - 8:31 pm on Feb 12, 2014 (gmt 0)

Faceted navigation, such as filtering by color or price range, can be helpful for your visitors, but it's often not search-friendly since it creates many combinations of URLs with duplicative content. With duplicative URLs, search engines may not crawl new or updated unique content as quickly, and/or they may not index a page accurately because indexing signals are diluted between the duplicate versions. To reduce these issues and help faceted navigation sites become as search-friendly as possible, we'd like to:

  • Provide background and potential issues with faceted navigation
  • Highlight worst practices
  • Share best practices for:
      • New faceted navigation implementations or redesigns
      • Existing faceted navigation implementations



Google released this today. It's actually pretty useful; even if you don't use faceted navigation yourself, there are other insights to be gleaned here from what Google considers best and worst practices.

[googlewebmastercentral.blogspot.com...]

 

aakk9999
msg:4644749 - 12:05 pm on Feb 13, 2014 (gmt 0)

Very useful read indeed!

Alternatively, consider placing user-generated values in a separate directory and then robots.txt disallow crawling of that directory.

example.com/filtering/articles?category=health&days-ago=7
....
with robots.txt:

User-agent: *
Disallow: /filtering/


Reading this, I am wondering whether there is a difference in how Google handles a disallowed directory versus a disallowed URL pattern. The blog post recommends disallowing a directory, but I always thought there would be the same benefit if a URL pattern is disallowed, e.g.:

example.com/filtered-articles.php?category=health&days-ago=7

robots.txt:

User-agent: *
Disallow: /filtered-articles.php
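One way to sanity-check this: robots.txt Disallow matching is a simple prefix match, so both the directory form and the URL-pattern form block crawling of their parameterized URLs equally. A minimal sketch using Python's stdlib robotparser, with the hypothetical example URLs from above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt combining both forms from the examples above.
rules = """\
User-agent: *
Disallow: /filtering/
Disallow: /filtered-articles.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallow matching is a prefix match, so both forms block their
# parameterized URLs; unrelated URLs stay crawlable.
print(rp.can_fetch("*", "http://example.com/filtering/articles?category=health&days-ago=7"))
print(rp.can_fetch("*", "http://example.com/filtered-articles.php?category=health&days-ago=7"))
print(rp.can_fetch("*", "http://example.com/articles/health"))
```

One caveat either way: Disallow only stops crawling; a disallowed URL can still end up indexed (URL-only) if enough external links point at it.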

heisje
msg:4644926 - 8:46 pm on Feb 13, 2014 (gmt 0)

For as long as I can remember, we have been operating according to what G now reveals here as "best practices" - including the recently discussed ad placement / content above fold guidelines (no ads!). So, where are our rewards? (Nowhere to be seen . . . .).


incrediBILL
msg:4644951 - 10:23 pm on Feb 13, 2014 (gmt 0)

The problem with all this is that faceted navigations were typically all in javascript, so we never worried about the SE implications because they couldn't read the javascript in the first place. Now that the SEs are all up in our javascript, all this has changed, and the next thing you know they'll be telling us how to properly write that javascript so it's easiest to crawl.

Just wait for Google's Best Javascript Programming Practices coming soon to a Google FAQ page near you and I'm betting they recommend you use Angular.

FWIW, this problem is nothing new, which is why I've been stuffing canonical tags and redirects into websites for years so that pages with all the options in the URL don't get indexed unless we explicitly want them to be indexed. While you can't control what links people make to your website, you can certainly control what ends up in Google by refusing to allow those bogus links to be indexed, which I've done for many years.
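For illustration, the canonical-tag side of that can be sketched like this; the helper name and URLs are hypothetical, not from any particular setup. The idea is to strip the option parameters so every parameterized variant declares one canonical URL:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link(url):
    """Strip the query string and fragment so e.g. /widgets?color=red&sort=price
    canonicalizes to /widgets, and emit the <link> tag for the page <head>."""
    parts = urlsplit(url)
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return '<link rel="canonical" href="%s">' % canonical

print(canonical_link("http://example.com/widgets?color=red&sort=price"))
# <link rel="canonical" href="http://example.com/widgets">
```

In practice you'd keep any parameters that genuinely change the content and strip only the sort/filter options, but the mechanism is the same.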

I'm just shocked it took them over 10 years to publicly address the problem after the rest of us battled it out and resolved it a long time ago.

Oh well, give them enough time and they'll even figure out authorship.

So, where are our rewards?


Google is training you: Jump little monkey jump!

Google has somehow managed to single-handedly become synonymous with the Internet and webmasters blindly follow whatever they say like the trained minions they have become.

George Orwell would truly love the irony that we invited it and did it to ourselves, that no political or government organization forced it on us. Or would he?

heisje
msg:4644976 - 12:33 am on Feb 14, 2014 (gmt 0)

"Google is training you: Jump little monkey jump!"

Well,
"The green reed which bends in the wind is stronger than the mighty oak which breaks in a storm." Confucius

While our applying "common sense" to site layout and URLs predates Google's "best practice" "directives" by more than a decade, we find adapting to the current environment, whenever necessary, a "realpolitik" much preferable to heroic death.

"Adapt or perish, now as ever, is nature's inexorable imperative."




aakk9999
msg:4644977 - 12:46 am on Feb 14, 2014 (gmt 0)

I'm just shocked it took them over 10 years to publicly address the problem after the rest of us battled it out and resolved it a long time ago.

Maybe with the growth of the web these multiple URLs are becoming a bigger issue, as they require additional crawl capacity; hence they are addressing it now.

incrediBILL
msg:4644979 - 1:02 am on Feb 14, 2014 (gmt 0)

Maybe with the growth of the web these multiple URLs are becoming a bigger issue


I've been doing ecommerce since '96 and they've always been a problem.

Google didn't even exist yet; this is 18 years later.

My point was that they never tried to help us figure it out back in the day (trust me, we weren't shy about asking) and just left us twisting in the wind, as Google is wont to do, until canonical tags finally came along.

More transparency wouldn't kill them, but a little late is better than never, I guess.

tangor
msg:4645024 - 6:46 am on Feb 14, 2014 (gmt 0)

I find it a bit amazing that after all these years G is providing SOME insight into what they really like... and apparently it strained their capability just enough to share this info. :)

Can you imagine how different the web might have been if G had shared their good tips, with examples, early on? ... Oh... that might have revealed how to game their system.

blend27
msg:4645301 - 5:18 pm on Feb 14, 2014 (gmt 0)

I use a form POST onClick for filtered results, and the target content is NOINDEXed. Every request is checked for the proper params being passed via the URL scope, including their case and order. The advanced site search form is disallowed in the robots file. Nothing gets indexed or crawled that I don't want to be.

Otherwise it's target practice for competitors posting funky links to the site, followed by a dive in the SERPs. Been there, unfortunately.

Not user friendly, but then again, we live in a Gorg's world.
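A rough sketch of that kind of lockdown, with hypothetical parameter names (the actual checks in my setup are more involved): accept only whitelisted params, in canonical case and order, and serve the filtered page with a noindex header so it never gets indexed even if fetched.

```python
# Canonical parameter names, in the one order we accept (hypothetical whitelist).
ALLOWED_PARAM_ORDER = ["category", "days-ago"]

def validate_filter_params(params):
    """Accept the request only if every parameter name is expected,
    correctly cased, and in the canonical order; anything else is a
    candidate 'funky link' and gets rejected."""
    names = [name for name, _ in params]
    return names == [p for p in ALLOWED_PARAM_ORDER if p in names]

def filter_response_headers():
    """Headers for the filtered-results response: never index it."""
    return {"X-Robots-Tag": "noindex"}

print(validate_filter_params([("category", "health"), ("days-ago", "7")]))  # True
print(validate_filter_params([("days-ago", "7"), ("category", "health")]))  # False: wrong order
print(validate_filter_params([("Category", "health")]))                     # False: wrong case
```

Rejected requests can 404 or redirect to the canonical page; the point is that the funky variants never become indexable URLs.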

I'm just shocked it took them over 10 years to publicly address the problem after the rest of us battled it out and resolved it a long time ago

Several sites got hit in Florida; cleaning up and locking down the URLs did the trick, some time later...

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved