
Google SEO News and Discussion Forum

The "Mega Menu" Problem and Google Rankings
tedster
msg:3687530
8:47 am on Jul 1, 2008 (gmt 0)

Over the past year and a half, I've worked on a number of -950 problems. In doing that, I noticed that many of these sites (though not all) had a characteristic that I started to call the Mega Menu. It comes in two forms, but both of them place a lot of anchor text on every page.

Type #1 just has a long laundry list of links down the left-hand side (and often across the footer), and Type #2 uses a CSS hover menu so the page LOOKS neater, but the code still holds many, many links - especially if the menu goes into a second or third level.

Why could this create a problem? I have two ideas. First, we know that anchor text is a heavily weighted element in the algorithm. All that opportunity to go over the top with keywords in anchor text cannot be a good thing. And second, Google reps keep repeating the advice not to go beyond 100 links on a page. Even if more links are now spidered, just think of the pile of semantic confusion that this can throw at the relevance calculations.
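To put a rough number on this, here's a minimal Python sketch (standard library only; "homepage.html" is a placeholder for a saved copy of the page) that counts the anchors and tallies the words used in anchor text - the kind of repetition that could feed that semantic confusion:

from collections import Counter
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Count <a href> tags and tally anchor-text words."""
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.link_count = 0
        self.words = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.in_anchor = True
            self.link_count += 1

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_anchor = False

    def handle_data(self, data):
        if self.in_anchor:
            self.words.update(w.lower() for w in data.split())

auditor = LinkAuditor()
auditor.feed(open("homepage.html", encoding="utf-8").read())
print(auditor.link_count, "links on the page")
print("most repeated anchor-text words:", auditor.words.most_common(10))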

So I started looking at the client sites that are really cooking - and guess what, they often have between 30 and 40 links on the home page, no more.

SITE REDESIGNS
On one well ranked site I was called to help with, we improved even further by dropping from about 60 links in the template to only 22.

Using their analytics program we discovered that one of the home page links got over 60% of the clicks, and 10 of them got 99%! The others were there for "SEO" purposes, but the customers apparently couldn't care less. So we dropped a ton of that anchor text from the site template, and saw rankings get even better on the best terms.
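For anyone who wants to run the same check, the arithmetic is simple. A sketch with made-up click counts (the real numbers would come from your analytics export):

clicks = {"/products": 6200, "/about": 1500, "/contact": 900,
          "/blog": 700, "/faq": 400, "/press": 30, "/partners": 10}

total = sum(clicks.values())
running = 0
# Walk the links from most-clicked to least and find the 99% cutoff.
for i, (url, n) in enumerate(sorted(clicks.items(), key=lambda kv: -kv[1]), 1):
    running += n
    print(f"{i:2d}. {url:12s} {n:5d} clicks  cumulative {running/total:6.1%}")
    if running / total >= 0.99:
        print(f"top {i} links cover 99% of clicks; the rest are candidates to cut")
        break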

On another site, there was a 3-level hover menu that, even before any content area links, offered 130 anchor tags. We re-thought the Information Architecture and moved the site into a classic "inverted L" format, dropping down to about 45 links per page. Again, everything about the site started to perform better: rankings, stickiness, conversions, you name the metric.

VISITORS LIKE IT TOO
Even without SEO concerns, hover menus can be a problem for visitors. For one thing, visitors cannot see all their options at any one time. For another, it's easy to end up with several links that have the same anchor text but point to different pages. Now that's confusing both for people and for algorithms!
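That second problem is easy to check for mechanically. A minimal sketch, assuming the (anchor text, href) pairs have already been extracted (say, with a parser like the one earlier in this thread) - the sample data is made up:

from collections import defaultdict

anchors = [("widgets", "/products/widgets"),
           ("widgets", "/blog/widgets"),
           ("contact", "/contact")]  # made-up (anchor text, href) pairs

# Group target URLs by anchor text and flag any text used for several pages.
targets = defaultdict(set)
for text, href in anchors:
    targets[text.lower()].add(href)

for text, urls in targets.items():
    if len(urls) > 1:
        print(f'"{text}" points to {len(urls)} different pages: {sorted(urls)}')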

It can be mind-bending work to generate a good Information Architecture -- and even harder work to choose optimal menu labels. It's much easier to grow a site by just slapping more links onto the menu. However, from what I've seen, slim and trim is the way to go with menus. Google seems to like it a lot, and a visitor's first impression of such a site is "I can deal with this", not "sheesh, where do I start?"

I'm not saying you can't succeed with a Mega Menu, only that it can be more problematic. Especially when you've been programmed with the SEO mantra of "links, keywords, anchor text, links, keywords, anchor text" for many years, it's easy to go wrong without even noticing.

Anyone else have experiences with different sizes of menu? Maybe you have results that run counter to mine, or maybe they support mine. I know that each of us sees only our own particular slice of the total web.

Whatever the case, I'm interested in how my Mega Menu idea looks across many markets.

[edited by: tedster at 3:18 am (utc) on July 2, 2008]

 

Brett_Tabke
msg:3712934
1:06 pm on Aug 1, 2008 (gmt 0)

Ted, ever wonder if this is the old pure code issue of size? I know I used to believe that G wanted to see faster loading pages on sites, and that a ratio of template code to visible text was part of the algo. The effect of that was deducible via on-site time or per-click visitor time from the toolbar data set.
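That ratio is easy to approximate, for whatever it's worth. A back-of-the-envelope Python sketch - a guess at the metric, not a claim about what Google actually computed, and "page.html" is a placeholder:

from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

html = open("page.html", encoding="utf-8").read()
extractor = TextExtractor()
extractor.feed(html)
visible = " ".join("".join(extractor.chunks).split())
print(f"visible text to total code: {len(visible) / len(html):.1%}")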

tedster
msg:3712955
1:26 pm on Aug 1, 2008 (gmt 0)

That certainly can play in, Brett - especially if some of the more bloated hover menu scripts are included inline. I no longer think that a pure text-to-code ratio is in play, but as you said, other loading speed and response time measures do the same job in more depth. Some people are now reporting slow response time messages for urls in their WMT accounts.

youfoundjake
msg:3713612
3:28 am on Aug 2, 2008 (gmt 0)

Just to follow up on this,
I removed the javascript menus, and the snippet being displayed is now from the actual meta description, as opposed to the menu itself. It actually makes the site: command a lot nicer looking, heh...

potentialgeek
msg:3792813
8:23 am on Nov 24, 2008 (gmt 0)

Just found this old thread from a recent link by Tedster in another thread.

I just checked my home page on a large site that had penalty problems most of which have been lifted. I did Find > href and found 116 links!

(This is due to years of gradual building.)

Then I went to Google Analytics > Content Overview > Site Overlay. I found many of the home page links receive very few clicks (0.1% or lower).

I think I'll look into Website Architecture and try to put together a new navigation structure on the home page. The home page gets many hits from Google so I might keep the text but remove 90% of the links. (Or convert the links into plain-text URLs just to keep Google happy. :/)

It's only my home page that has 100+ links. The other pages have no more than 40. (The home page is similar to a sitemap, but I don't have an official sitemap.)

p/g

nickreynolds
msg:3793242
8:08 pm on Nov 24, 2008 (gmt 0)

This is a great thread -- very informative. Thank you to all who have contributed.

potentialgeek
msg:3794367
4:31 am on Nov 26, 2008 (gmt 0)

To quote from the authority on this subject, since it hasn't been done yet in this thread:

Design and content guidelines

* Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

* Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

* Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

* Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

* Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.

* Make sure that your <title> elements and alt attributes are descriptive and accurate.

* Check for broken links and correct HTML.

* If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

* Keep the links on a given page to a reasonable number (fewer than 100).

Source: [google.com...]
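The sitemap bullet is the most mechanical of these. A minimal sketch of the splitting it suggests - the URL list and the 100-link page size are placeholders:

# 350 made-up URLs split into sitemap pages of at most 100 links each.
urls = [f"https://example.com/page-{i}" for i in range(1, 351)]

PAGE_SIZE = 100
sitemap_pages = [urls[i:i + PAGE_SIZE] for i in range(0, len(urls), PAGE_SIZE)]
for n, page in enumerate(sitemap_pages, 1):
    print(f"sitemap-{n}.html: {len(page)} links")  # 100 + 100 + 100 + 50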

p/g

tedster
msg:3794378
4:57 am on Nov 26, 2008 (gmt 0)

Keep the links on a given page to a reasonable number (fewer than 100).

This guideline is easier to violate than it may first appear. I just took a quick look at a site today that was having ranking troubles and discovered that the Home Page had 1901 links! It was rather well laid out and from a casual look at the page you would never have realized that it was so link-heavy.

[edited by: tedster at 5:48 am (utc) on Nov. 26, 2008]

potentialgeek
msg:3794390
5:16 am on Nov 26, 2008 (gmt 0)

Tedster,

How much can Link Dilution play a part in ranking? If a home page has good potential to help other pages rank better (it's often the best ranking page on a site), but you link out to 1901 pages or whatever, is the link juice passed to each page calculated as a fraction divided among those 1901 links?

I think this idea was once proposed for PR, but I wonder if the same principle applies to ranking, too--both internal and external links. If so, it could explain how reducing links would result in better ranking for major internal pages/categories.
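For reference, that is exactly how the originally published PageRank formula behaves: a page's passable score is divided evenly among its outlinks, damped by a factor d. Whether Google still calculates it that way is anyone's guess, but the dilution arithmetic looks like this (the home page score is a made-up number):

d = 0.85       # damping factor from the original PageRank paper
home_pr = 6.0  # hypothetical score for the home page

for outlinks in (50, 1901):
    share = d * home_pr / outlinks
    print(f"{outlinks:5d} links -> each linked page receives {share:.4f}")
# 50 links pass ~0.102 per page; 1901 links pass ~0.0027 - roughly 38x less.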

p/g

tedster
msg:3794425
5:59 am on Nov 26, 2008 (gmt 0)

p/g, I think that's a major issue. I'm pretty sure that PR affects how much weight other link related factors can have.

Too many Home Page links can also contribute to gray bar on the category page. People get so knee-jerk about the idea that a Home Page link will help a new page to rank that they go overboard and destroy the intelligence that was embedded in their original structure.

So trimming it back and restoring a sensible logic to the link structure and architecture helps every signal on the site to become clearer.

potentialgeek
msg:3794435
6:24 am on Nov 26, 2008 (gmt 0)

I was surprised to find one page on my site is ranking well without getting a link from the home page (and no IBLs to it). It's just a long page with a lot of content and 100+ internal links to it.

Based partly on that, I'm not so concerned about losing link juice from the home page to other pages, so I'm cutting back from over 100 HP links to just over 50.

I think you were saying a while back, Tedster, that site architecture is often best when it has a navigation structure of 5-7 user choices. So if you can categorize everything into that few directories, it's a good starting point.

I don't know if I can narrow it down to that few, but I know there are some directories on my site that I can make subdirectories.

I agree with the earlier comments that suggested making primary categories into main directories, and then putting the rest in subdirectories, where each subdirectory is pretty independent. Then you don't go overboard on the total number of internal links per page.

That seems most logical for users, and one could imagine it also makes it easiest for Google's algo to figure out or spider a site: a very "clean" navigation structure with lots of natural layers.

example.com/directory
example.com/directory/subdirectory
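A side benefit of a path hierarchy that clean is that breadcrumbs can be derived straight from the URL. A quick sketch - turning slugs into labels is a simplification; a real site would look up proper page titles:

from urllib.parse import urlsplit

def breadcrumbs(url):
    """Build a (label, url) breadcrumb trail from the URL path segments."""
    parts = urlsplit(url)
    root = f"{parts.scheme}://{parts.netloc}"
    trail = [("Home", root + "/")]
    path = ""
    for segment in filter(None, parts.path.split("/")):
        path += "/" + segment
        trail.append((segment.replace("-", " ").title(), root + path))
    return trail

print(breadcrumbs("https://example.com/directory/subdirectory"))
# [('Home', 'https://example.com/'),
#  ('Directory', 'https://example.com/directory'),
#  ('Subdirectory', 'https://example.com/directory/subdirectory')]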

p/g

CainIV
msg:3794440
6:51 am on Nov 26, 2008 (gmt 0)

By nature, *most* topics do not inherently have over 50 logically related, expandable sub-topics, so in my opinion, 90% of the websites out there have no need to add many more links, and would be much better off organizing less important related topics underneath major categorical topics, supporting them in a pillar format.

So many webmasters fly with the notion that the more links to deep pages there are on their homepage, the more pages will rank, but this simply is not the case. I often find that websites organized well from the onset, with logical topical steps (much like page headers) and proper breadcrumbing, outperform and are less prone to fluctuations than websites with less organized hierarchies.
