Forum Moderators: Robert Charlton & goodroi
If the pages on a site (that you want in your navigation) fall into 10 main categories, what's better:
1 - A nav bar with 10 main categories, and use javascript mouseover to show links to subcategories (issue - spider and js)
2 - A nav bar with 10 main categories, and use css mouseover to show links to subcategories (issue - hidden divs - is this a problem with Google?)
3 - A nav bar with 10 main categories, and when you click on a category page, it has a link to all the subcategories (issue - all those subcategory pages are not on the home page)
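For reference, option 2 is usually built roughly like this. This is only a minimal sketch; the class names and URLs are made up for illustration, not taken from anyone's actual site:

```html
<!-- Sketch of option 2: plain HTML links, CSS-only reveal on mouseover.
     Class names and URLs here are placeholders. -->
<style>
  .nav li ul { display: none; }          /* subcategory list hidden by default */
  .nav li:hover ul { display: block; }   /* shown when the category is moused over */
</style>
<ul class="nav">
  <li><a href="/category-1/">Category 1</a>
    <ul>
      <li><a href="/category-1/sub-a/">Subcategory A</a></li>
      <li><a href="/category-1/sub-b/">Subcategory B</a></li>
    </ul>
  </li>
  <!-- ...repeat for the other main categories -->
</ul>
```

The point to notice is that the subcategory links exist as plain HTML in the source, so a spider can follow them even though the stylesheet hides them visually until hover.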
From the user's point of view, options 1 or 2 seem best. But does Google penalize option 2 because it sees the navigation as hidden divs?
But does Google penalize option 2 because it sees the navigation as hidden divs?
Not if they don't know about it (for sure). Would it hurt if they did? I wouldn't venture to guess, because it's not something I've ever really done a 'case study' on...
I've read about people here who have blocked external CSS files via robots.txt (JavaScript too, for that matter), and I've done it myself, because there's really no reason G needs to see either. Usually, if I do something like you are talking about with CSS (or JS), I make sure I link to the same pages within the page text too, so the 'hidden links' (if they are considered hidden) match real text links on the page.
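For anyone who wants to try the robots.txt approach mentioned above, it looks something like this. The directory names are just examples; you'd use wherever your own site keeps its CSS and JS files:

```
User-agent: *
Disallow: /css/
Disallow: /js/
```

That simply asks compliant crawlers not to fetch those files; the pages themselves stay crawlable.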
Personally, I would opt for 1 if you can also link to the pages within the text of the page, so SEs still find links to the pages. Then I would probably seriously consider 2 over 3, because of your 'better for visitors' statement, but I'm working on low sleep and my general thinking lately is people need to quit kissing G's a** and build sites for the real visitors.
However, the situation can throw Google and other engines a bit of a semantic curve ball. This happens because links and anchor text to every page in the menu system now exist on every page in the entire website. This obscures the site's Information Architecture and it becomes a good bit more likely that Google will rank "the wrong page" for a keyword.
There is also a tendency for sites to have too many links on a page when they use mouseover menus or hover menus. I've done damage control for sites with over 1,000 links on every page because of this! See The Mega Menu Problem [webmasterworld.com] for more discussion.
There are also usability problems with ANY hidden menu system. One of the biggies is that the visitor cannot see and compare all their choices in one view. All those hidden options tend to limit site exploration, rather than encourage it. See Mouseover Menus - or DHTML indigestion [webmasterworld.com] for more.
It takes a lot of work to create a logical Information Architecture - one that is based on the visitor's needs rather than the organization's internal structure. Employees and web teams often make the mistake of forcing the visitor to learn their Org Chart, rather than structuring their pile of information for the uninitiated visitor.
However, that tough work has a major payoff when properly done. I've done it several times and the conversion data has shown very strong improvement. Stickiness metrics, such as page views per unique, also improve with a visitor-oriented IA.
The concern about requiring extra clicks for drilling down into a conventional "inverted L" menu has proven to be a red herring. The three-click rule has been discredited [webmasterworld.com] in actual testing. As long as the visitor is getting what Jakob Nielsen calls a strong Information Scent [useit.com] from the menu labels and other on-page cues, they will keep clicking on the non-hover menus and drilling down.
To sum up, while there is very little risk of a hidden content penalty from any of the three options, there are several other strong reasons not to choose any of the three.
[edited by: tedster at 5:00 pm (utc) on Sep. 26, 2009]
Generally speaking, it's best if the links are in plain HTML rather than generated by the JavaScript - in other words, if the JavaScript only controls the display of the links. While Google is becoming more sophisticated at analyzing JavaScript to discover links, it remains a bit of an open question whether those links are treated the same way as plain HTML links in terms of things like PageRank and anchor text.
If the links in your navigation system are generated by JavaScript and you want to keep it that way, it seems to me a good idea to add a <noscript> section to the page that holds the same links in plain HTML, just to make sure those links are seen.
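A minimal sketch of that <noscript> fallback, assuming a script-generated menu (the file name and URLs are placeholders, not anyone's real setup):

```html
<!-- JavaScript builds the visible menu -->
<script src="menu.js"></script>

<!-- Plain HTML fallback so spiders (and visitors without JS)
     still find the same links. URLs are placeholders. -->
<noscript>
  <ul>
    <li><a href="/category-1/">Category 1</a></li>
    <li><a href="/category-2/">Category 2</a></li>
  </ul>
</noscript>
```

Keeping the <noscript> list identical to the scripted menu also avoids any appearance of showing engines different links than visitors see.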