
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 71-message thread spans 3 pages; this is page 2.
The "Mega Menu" Problem and Google Rankings

 8:47 am on Jul 1, 2008 (gmt 0)

Over the past year and a half, I worked with a number of -950 problems. In doing that, I noticed that many of these sites (though not all) had a characteristic that I started to call the Mega Menu. It comes in two forms, but both of them place a lot of anchor text on every page.

Type #1 just has a long laundry list of links down the left-hand side (and often across the footer). Type #2 uses a CSS hover menu so the page LOOKS neater, but the code still holds many, many links, especially if the menu goes into a second or third level.
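For illustration, a minimal Type #2 menu might be marked up something like this (labels and URLs are made up). The page looks tidy on screen, but the source still carries every nested link as plain anchor text for a spider:

```html
<!-- A CSS hover menu: submenus are hidden visually, but every
     anchor below is still in the HTML that a spider crawls. -->
<style>
  .menu li ul { display: none; }          /* hide submenus by default */
  .menu li:hover > ul { display: block; } /* reveal on hover */
</style>
<ul class="menu">
  <li><a href="/widgets/">Widgets</a>
    <ul>
      <li><a href="/widgets/red/">Red Widgets</a></li>
      <li><a href="/widgets/blue/">Blue Widgets</a></li>
      <!-- ...dozens more links at levels two and three... -->
    </ul>
  </li>
  <li><a href="/gadgets/">Gadgets</a></li>
</ul>
```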

Why could this create a problem? I have two ideas. First, we know that anchor text is a heavily weighted element in the algorithm. All that opportunity to go over the top with keywords in anchor text cannot be a good thing. And second, Google reps keep repeating the advice not to go beyond 100 links on a page. Even if more links are now spidered, just think of the pile of semantic confusion that this can throw at the relevance calculations.

So I started looking at the client sites that are really cooking - and guess what, they often have between 30 and 40 links on the home page, no more.

On one well-ranked site I was called in to help with, we improved things even further by dropping from about 60 links in the template to only 22.

Using their analytics program, we discovered that one of the home page links got over 60% of the clicks, and 10 of them got 99%! The others were there for "SEO" purposes, but the customers apparently couldn't care less. So we dropped a ton of that anchor text from the site template and saw rankings get even better on the best terms.

On another site, there was a 3-level hover menu that, even before any content area links, offered 130 anchor tags. We re-thought the Information Architecture and moved the site into a classic "inverted L" format, dropping down to about 45 links per page. Again, everything about the site started to perform better: rankings, stickiness, conversions, you name the metric.

Even without SEO concerns, hover menus can be a problem for visitors. For one thing, they cannot see all their options at any one time. For another, it's easy to end up with several links that have the same anchor text but point to different pages. Now that's confusing both for people and for algorithms!

It can be mind-bending work to generate a good Information Architecture -- and even harder work to choose optimal menu labels. It's much easier to grow a site by just slapping more links onto the menu. However, from what I've seen, slim and trim is the way to go with menus. Google seems to like it a lot, and a visitor's first impression of such a site is "I can deal with this", not "sheesh, where do I start?"

I'm not saying you can't succeed with a Mega Menu, only that it can be more problematic. Especially when you've been programmed with the SEO mantra of "links, keywords, anchor text, links, keywords, anchor text" for many years, it's easy to go wrong without even noticing.

Anyone else have experiences with different sizes of menu? Maybe you have results that run counter to mine, or maybe they support mine. I know that we each see our own particular slice of the total web, not the whole.

Whatever the case, I'm interested in how my Mega Menu idea looks across many markets.

[edited by: tedster at 3:18 am (utc) on July 2, 2008]



 1:33 pm on Jul 3, 2008 (gmt 0)

Amazon doesn't really do it. They may have a large menu, but it is different for each category and is really a nice way to get to things on the site. I really can't see any way to compare what tedster is talking about to Amazon.

Just a note: my left-side menu had about the same number of links as Amazon's, so I can see the issue.

I do have some concern about the PR flow, as brought up by lorel. The page I am now linking to, with all the links that were my menu, is not cached and is grayed out, but as ALbino says, if this will get me out from under this filter it will be well worth it. I have a strong site link-wise and all, so this may just be the trick that lifts the filter.

Thanks tedster, will post back if something changes, good or bad, as I do have some really good rankings in Yahoo and MSN and will be watching them for any changes.

[edited by: bwnbwn at 1:47 pm (utc) on July 3, 2008]


 1:43 pm on Jul 3, 2008 (gmt 0)

Hi tedster, how long after the changes did you see the improvement in rankings?


 3:49 pm on Jul 3, 2008 (gmt 0)


Have the improvements in rankings been long term?
How about a mega navigation link page inside an iframe off of the home page? Any experience with this?


 3:59 pm on Jul 3, 2008 (gmt 0)

I had the same issue with a customer of mine. I added nofollow to all the links in the menu except the 3 links I wanted to promote, and added an HTML sitemap for crawlability purposes for the rest of the site.
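Roughly, the setup looks like this (URLs made up for the example):

```html
<!-- Menu: nofollow everything except the three promoted links -->
<ul id="nav">
  <li><a href="/main-product/">Main Product</a></li> <!-- followed -->
  <li><a href="/services/">Services</a></li>         <!-- followed -->
  <li><a href="/contact/">Contact</a></li>           <!-- followed -->
  <li><a href="/press/" rel="nofollow">Press</a></li>
  <li><a href="/archive/" rel="nofollow">Archive</a></li>
</ul>

<!-- Plus a plain HTML sitemap page, linked from the footer,
     so spiders can still reach the rest of the site -->
<a href="/sitemap.html">Site Map</a>
```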
Don't you think that this can solve the problem?


 4:32 pm on Jul 3, 2008 (gmt 0)

I was going to say Yahoo!... but then noticed they've changed their main page (I don't go there at all, so it may have been a while), and it looks like there are fewer than 100 links on it. MSN.com, however, definitely has more than 100, no wonder it can't keep up :)


 4:41 pm on Jul 3, 2008 (gmt 0)

tedster, I just did a quick quality check on a menu and it made 20 http requests to the server. Oh, there were all sorts of neat little gadgets in that thing. How do you think this ties in with the Mega Menu mindset? Exactly how much leakage is occurring with all those third party whoopdedoos? I'm not talking PR leakage either. ;)


 5:05 pm on Jul 3, 2008 (gmt 0)

Hi All

Does anyone here use the treeview navigation system?


 5:47 pm on Jul 3, 2008 (gmt 0)

From an SEO point of view, wouldn't a mega menu be the same as using a sitemap?

In essence, yes - for smaller sites, it is like putting an entire user's site map on every page. We're in the Google forum here, but as a side note, Yahoo once explicitly told a client of mine that this was the reason for a penalty they received.


Their current approach of changing the side menu depending on the category is a decent one. When they first expanded beyond Books, they tried all kinds of approaches, including 19 categories on every page and a DHTML show/hide menu. By restricting the links to those within the category, they are getting a better semantic focus in various parts of their site.

20 http requests to the server. Oh, there were all sorts of neat little gadgets in that thing. How do you think this ties in with the Mega Menu mindset?

Excessive http requests may become the next big buzz, as Google ups the ante on measuring user experience - and the content of all those gizmos certainly can blur the metrics as well as slow the page load speed.

I used to think of search engines almost like dealing with a learning disabled child who also had a hearing problem. Now the learning challenge is still there, but Google's hearing became extremely sensitive. If you yell at it, or give it too much input at one time, you may stimulate a tantrum!

...treeview navigation system

Many times the programmer who creates these in a generic form has not given thought to any issues beyond just getting the thing to work! If the hidden menu links are spiderable, and the number gets too large, it can have the same kind of mega menu problems I mentioned above. Just because the extra links get displayed with a click rather than a hover doesn't change the core risk.

Since the end user is often familiar with this kind of interface from Windows Explorer, it can be more comfortable from a usability point of view. Still, as a general rule, website menus are not like applications that a user will be involved with every day - and so they deserve a different mindset when they are being decided.


I'm with steveb on this one. The alphabet can be a very useful hierarchy, easy to understand and you can get really granular. However, it sure doesn't give much extra semantic information.

Sometimes a top level of categorization can be combined with an alphabetical system WITHIN each category to give a much more informative and manageable approach.


 5:58 pm on Jul 3, 2008 (gmt 0)

A lot of mega menus are unordered lists with some sort of hierarchy expressed through nested lists, list order, etc. It shouldn't be hard for search engines to work out the level of importance with these types of menus. It can actually be argued that this makes it easier to work out the importance of those pages, and search engine algorithms should reflect that - and if they don't, they should.
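For example, the hierarchy is right there in the markup (labels and URLs hypothetical): the nesting depth itself tells a spider which pages are categories and which are leaves.

```html
<ul>                                   <!-- level 1: top categories -->
  <li><a href="/books/">Books</a>
    <ul>                               <!-- level 2: subcategories -->
      <li><a href="/books/fiction/">Fiction</a></li>
      <li><a href="/books/history/">History</a></li>
    </ul>
  </li>
  <li><a href="/music/">Music</a></li>
</ul>
```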


 6:30 pm on Jul 3, 2008 (gmt 0)

tedster, you've got this way of making people see things at times... :)

Okay, now that we've discussed some of the challenges that Mega Menus present, what can we do to change? Lots of good advice in this topic and I want to bring the Wiki into this again. I'm really fond of the Wiki and that whole platform. I think it is a work of art technically. The Wiki has two primary things that "I like" and those are...

Intuitive URIs

The Wiki has provided many access points to their content. I prefer using search most of the time, but I've come to learn their taxonomy and I know I can usually find primary entrance points by using a single-word URI appendage...


And I usually end up where I need to be. Once you've used the Wiki for a while, you tend to just switch to URI navigation, or what I call Intuitive URIs. I see more and more invalid requests like this on specific types of websites. In some instances it may be a rogue bot; in others it might be someone trying to see if they can use a word at the end of the URI and get to where they need to be. They see that you use a short, single-word taxonomy and can usually figure out the paths after a few browsing sessions. Those are Intuitive URIs.

I'm sure there are many still wondering how to take a db of 2,500,000+ products and avoid the Mega Menu Madness. It's difficult. And to undo a Mega Menu is a task in itself. You could expect many man/woman hours in the undo process, I know. And a lot of testing, testing, testing and more testing. And then some undoing after launch. It never fails; something gets missed when you have to traverse through years of programming layers. And I don't want to hear the "Ah, that shouldn't happen!" Well, it shouldn't, but in the real world it does. And usually, 9 times out of 10, something is overlooked, sometimes minor and sometimes major. :(

I know we're going to get the crowd that says "my competitors are doing it" or "such and such does it that way." Don't even get me started. I've found that your competitors and/or such and such can cause you more harm just by your following what they are doing. When the hammer comes down, you're right there with them, doh!

If someone were to come to me today and say, "Edward, I have a db of 2,500,000 products and it's a brand new website," I'd have a much different approach to the taxonomy. In fact, I think the menus would come last. I want my dynamic menus at that point. Okay, so where is the user? Then our menu should be this, this, and that, period. Remember, there are many other areas where you can provide navigation for users that don't require a menu system. Look at the Wiki! Inline Navigation, Multiple Access Points...


 7:15 pm on Jul 3, 2008 (gmt 0)

pageoneresults, just yesterday I was kind of looking at this whole thing, thanks to tedster, and had to do some work in IIS, and boy, do I ever wish I knew then what I know now.

There would have been a world of difference in setting up the site architecture.

When I set the site up, mega menus were the only way to go, as were many other things that have since changed. I am really glad I started a complete overhaul on the site about 7 months ago. This really is the last thing I retained. I guess it's like the book "Who Moved My Cheese"; it was one of the old things I was hanging onto as a way to draw more keywords to the site.


 7:37 pm on Jul 3, 2008 (gmt 0)

Why? LOL, there is nothing ugly about Sa-Se compared to S-T.

If you use the alphabet, you aren't exactly aspiring to Shakespeare. Breaking the alphabet down into 50 units is easily understood and users would have no problem with it. It's not as ideal as "red widgets" and "green bodgets" but it's logical and Google would have no problems with it.

I guess I should mention that on our site the index page is literally just A-Z in large letters. That's how you navigate. It's not like it's just some small text at the top, it's the entire page. To go from 26 giant letters to 50+ giant letter-combos would clutter everything up aesthetically IMHO. But while we have no real problems getting our pages crawled, I do agree that maybe it's worth trying if it'll get us out of this stupid penalty.


 8:30 pm on Jul 3, 2008 (gmt 0)

Say we have a good list of "latest articles" on each page, which works in terms of user experience but is probably just a lot of noise for search engines as it changes often. Would it be wise to make them non-spiderable in some way, just to leave the natural main menu? If so, what would be the best approach without using cloaking techniques? Nofollow? Iframes? Javascript? I think the nofollow method would work with Google but not so much with the other engines.


 8:42 pm on Jul 3, 2008 (gmt 0)

Say we have a good list of "latest articles" on each page, which works in terms of user experience but is probably just a lot of noise for search engines as it changes often. Would it be wise to make them non-spiderable in some way, just to leave the natural main menu?

Some will say that you could put them in an <iframe>, which would probably produce the result you want. Yes? You can then do whatever you wish with the page in the src of the <iframe>. Maybe noindex, follow or something like that?
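Roughly like this (paths and file names hypothetical). The parent page stays lean, and you control indexing on the framed page itself:

```html
<!-- On every page: the changing "latest articles" list
     lives in its own document -->
<iframe src="/latest-articles.html" width="300" height="400"></iframe>

<!-- In the <head> of /latest-articles.html: keep that page out
     of the index, but still let its links be followed -->
<meta name="robots" content="noindex, follow">
```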


 10:32 pm on Jul 3, 2008 (gmt 0)

P1R, again, you have addressed something that has been in the back of my mind: the idea that crosslinking throughout the site can compensate for a smaller menu. The flip side is that the revelation process is slower, as the visitor has to traverse multiple pages before getting a better understanding of how much is available throughout the site.
And on smaller sites where there aren't that many pages, with the possibility of some of those pages not being listed in any of the search indexes, what about the pages that are orphaned by the omission?
That is probably the main reason why I use a sitemap, XML for engines and HTML for humans, so that the one or two off pages have some way to get accessed...


 3:06 am on Jul 4, 2008 (gmt 0)

One quick observation on the purpose of menus on a site. (Other than the internal anchor bomb...)

Menus should be used to get people to a section of the site which has an effective summary (back to Brett's Pyramid post).

That is good in theory, but it does not work well on mega sites, and we end up with menu structures that are complicated and impossible to navigate at times. Some menus are almost like playing video games: click, hover for 5 seconds, move the mouse quickly. It tires people out eventually.

As observed by some of the people here, we looked at the click-through rates on the various menu items, and people were using 7 of the 40-odd CSS/DHTML menu items we had in place.

We looked hard at how users prefer to navigate (and looked at how they used Amazon). The conclusion was, an effective site search and page layout works FAR better than menu driven navigation on e-commerce sites.

If you're still a believer in the long tail sale, there is very little you can do to guide users to the product they're looking for. You have to let them FIND it.

If you're using the menus to get people to find products on an ecommerce site... forget it. Look at a good site search engine.

Think "Blink" from a customer's perspective. I am of the opinion they'd rather just type a search into your search box, find the product, and place it in their cart in a minute, rather than play some warped version of web-tetris navigating through menus to find a product 10 minutes later.


 1:32 pm on Jul 4, 2008 (gmt 0)

On my site, the indication from GA is that people love the navigation links.

And I have both Google search and custom search. I think people like to peruse a site when it has an easy, obvious navigation system.

Robert Charlton

 1:09 am on Jul 5, 2008 (gmt 0)

Menus should be used to get people to a section of the site which has an effective summary... ...good in theory, but it does not work well on mega sites.

...If you're using the menus to get people to find products on an ecommerce site.. forget it. Look at a good site search engine.

I think the point of "effective summary" is a powerful idea. Even with site search, you still need to get people to a good jumping-off place for search. Again, it's a question of giving the visitor an understanding of what's available on the site. This is perhaps more easily done on an ecommerce site, less easily done where a site is offering information that's conceptually new. I tend to use what I call "plateau pages," or mini-sitemaps, to provide points of summarization.

...change or add navigation by industry or application, rather than by product.

One of my least favorite words in the English language is "Solutions," but this is a category on many technology sites where I see industry or application targeting.


 8:15 pm on Jul 5, 2008 (gmt 0)

IMO in these days of "perceived intent", mega menu users should ensure minimal indexing of "spurious" pages to avoid being perceived as attempting to "inflate" the site through internal anchors.


 4:08 am on Jul 6, 2008 (gmt 0)

The index page (180 kb) on the site that has occupied the top position for my very competitive key phrase for most of 5 years has more than 200 internal links, many with keyword-rich anchor text. It is part of a large network of sites and, in the footer, has another 120 or so outbound links to the other sites in the network.

In the center section, a "bold headline" is a black colored internal link, followed by a very brief block of text, followed by a blue colored "more information" link. The "bold headlines" are 5 - 8 words long. Many of the words are repeats.

Anchor text may well be greater than content. It looks more like one giant menu cum links page than anything else.

The kicker: the domain name matches the key phrase.


 4:44 am on Jul 6, 2008 (gmt 0)

First I have to admit that I did not read every single post in detail, so forgive me if I am repeating something.

IMHO, I think the use of a mega-menu and any ranking results is proportional to the uniqueness of the site. I run a <niche widgets> directory with about 110 pages. There are about 30 main sections each averaging 3 sub-sections using a single level css/javascript menu. All the pages rank well and 99% of the time the site ranks in the top 8 SERPS. But I emphasize that this is a niche site/market.

If, however, a site is such that there are thousands of similar sites, I can understand where less is more and would not at all be surprised if it was true in SERPs and page ranking.

I think the bottom line is simply this: what makes sense to a human makes sense to a spider. In the mad rush to insert as many keywords as possible in the form of links to help ranking, I believe developers have gone overboard and, in effect, have defeated their own purpose. I would say it rates up there with people abusing the meta keyword tag, thus making it less important.


[edited by: Robert_Charlton at 6:32 am (utc) on July 6, 2008]
[edit reason] removed specifics [/edit]


 6:06 pm on Jul 6, 2008 (gmt 0)

If the website is laid out in a well-defined hierarchical format, like siloing, that usually takes care of excessive nav links pointing in, since most main topics do not have more than 20-30 subtopics that can be defined.

After that, it comes down to making sure you are not repeating the keyword in the nav links themselves.


 6:42 pm on Jul 6, 2008 (gmt 0)

If the website is laid out in a well-defined hierarchical format, like siloing.

Hmmm, I just read the dictionary definition of that, and I'm not too certain it would be the proper term here. I know, I know, someone in the industry created that, but I'm not going for it. If you read the definition of Information Siloing, I don't think it applies theoretically in this instance.

The silo effect is a phrase that is currently popular in the business and organizational communities to describe a lack of communication and common goals between departments in an organization.

Hehehe, I wonder if siloing from an SEO standpoint does the same thing. Duh Edward, that's the purpose of it! ;)

Also, the above process leaves way too many footprints from what I've seen. In some cases it uses "new protocols" to perform what should be done via other methods that are "natural" and not forced.

I do believe the Mega Menu is all relative. The bigger the site, the bigger the menus; it's a natural progression. But at some point you reach critical mass and navigation becomes a real monster. If it's a monster for you, it is most likely a monster for the indexers.

I'm still working my way toward the Wiki Mindset. I really dig the Wiki and the W3, believe it or not. Those two sites have an internal navigation mechanism that is truly a piece of work.


 6:24 pm on Jul 7, 2008 (gmt 0)

Hi page. :)

I think there may be variance between the base term and the way we use the term, without getting too far off topic from this thread. Since you like the Wiki:

"a management system incapable of reciprocal operation with other, related management systems. A bank's management system, for example, is considered a silo if it cannot exchange information with other related systems within its own organization, or with the management systems of its customers, vendors or business partners."

In many respects we do not follow this rule as webmasters, but it is an excellent guideline when creating a website and understanding themes and overall hierarchy.

The more varied the topics are, the more sense it makes, in my opinion, to follow a silo'ed approach, where the various institutions only perform 'limited talking' when necessary.

I know there are exceptions to the rule, such as Wiki, but imho in general those exceptions are found on massive, authority websites.


 6:52 pm on Jul 7, 2008 (gmt 0)

The basic rule of "siloing" as I understand it is to minimize cross-linking between different silos at the deeper levels of content, and do so only at the top or category levels.

This might help in some mega menu websites. The reasons I've seen mega menus happen fall into a couple categories:

1. Fear that every visitor won't have every option
2. Loading up anchor text for the search engines
3. Wanting the website to be a good utility for the company's staff
4. Unwillingness to put in the energy/time/resources to develop a solid information architecture

None of these are good, user-oriented motivations.


 11:50 pm on Jul 7, 2008 (gmt 0)

The point of navigation is to navigate. You want your visitors to find things they are looking for, while benefitting yourself. If done right, what follows from this is direct SEO benefit, as things more people want are linked to more.

Having a nav link on a "yarn" page to your "bazookas" page is not going to be anything more than clutter to a user on the yarn page. A link to a "weapons" page is far more useful, as presumably anyone looking for a bazooka knows it is a weapon. Likewise, people on the bazooka page will more likely want to see a "speargun" link than a "yarn" one, but wouldn't have a problem with a menu with a "crafts" link or something like that.

If you have a site about lots of things, you should do your visitors a favor and organize your navigation to send them easily to the 99% of related things they are likely interested in, while making the unrelated stuff that 1% might look for (the bazooka owners who also want yarn) findable, but with more difficulty. You end up with links from related stuff to related stuff, and with more links to your popular stuff and fewer links to your obscure stuff.


 12:23 am on Jul 8, 2008 (gmt 0)

Right - exactly!

Many people worry about the ancient "3-click rule" that said a visitor should be able to go from any page to any other within 3 clicks. Well, that widely discussed "rule" from the '90s has been debunked in actual testing. The original rule was a nice guess, but unsupported by real data.

What matters to the user most is what Jakob Nielsen calls the "information scent". As long as the information scent for what they want keeps growing stronger, the user keeps clicking.

While I'm checking out the bazookas here, I should see if they have the cotton yarn my wife wants for her needlepoint. Let's see, Crafts [click]. Oh, that's good, there's a Needlecrafts section [click]. Now I see Needlepoint [click]. And now there's a Yarn link [click]. OK, there's Cotton [click]. I'd better bookmark this page and ask her which colors she needs.

We all do it all the time - and so will most visitors if they have a purpose.


 8:16 pm on Jul 8, 2008 (gmt 0)

Some great posts in this thread that have got me thinking, especially SteveB and Tedster's comments on the 3-click rule.

Thinking about my own surfing habits, I'm in agreement with what he said.


 11:34 pm on Jul 10, 2008 (gmt 0)

A little while ago I tried to reduce my "SEO links" down to the bare minimum by adding nofollow to any navigational link that wasn't specific to the page. My non-supplemental pages dropped by about 15%. I changed everything back and the pages came back.

Maybe I was already below the optimal number of on-page links.


 10:54 pm on Jul 24, 2008 (gmt 0)

I'll have to watch out for this. I recently implemented an "accordion style menu" that has created a "mega menu" block of code with about 285 internal links on every page of my CMS-generated website, which has a total of about 30,000 pages including its database-driven inventory content. So far, after two weeks, the only impact seems to be that G is indexing more pages faster than ever before, with no observable penalty, and traffic is increasing. Discounting the mega menu code, the content on every page, including individual inventory items, is pretty substantial, especially when you include multiple photos / graphics for each inventory item. That may be helping.


 1:06 pm on Aug 1, 2008 (gmt 0)

Ted, ever wonder if this is the old pure-code issue of size? I know I used to believe that G wanted to see faster-loading pages and that the ratio of template code to visible text was part of the algo. The effect of that was deducible via on-site time or per-click visitor time from the toolbar data set.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved