Type #1 just has a long laundry list of links down the left-hand side (and often across the footer), and Type #2 uses a CSS hover menu so the page LOOKS neater, but the code still holds many, many links - especially if the menu goes into a second or third level.
Why could this create a problem? I have two ideas. First, we know that anchor text is a heavily weighted element in the algorithm. All that opportunity to go over the top with keywords in anchor text cannot be a good thing. And second, Google reps keep repeating the advice not to go beyond 100 links on a page. Even if more links are now spidered, just think of the pile of semantic confusion that this can throw at the relevance calculations.
So I started looking at the client sites that are really cooking - and guess what, they often have between 30 and 40 links on the home page, no more.
On one well-ranked site I was called in to help with, we improved things even further by dropping from about 60 links in the template to only 22.
Using their analytics program we discovered that one of the home page links got over 60% of the clicks, and 10 of them got 99%! The others were there for "SEO" purposes, but the customers apparently couldn't care less. So we dropped a ton of that anchor text from the site template, and saw rankings get even better on the best terms.
On another site, there was a 3-level hover menu that, even before any content area links, offered 130 anchor tags. We re-thought the Information Architecture and moved the site into a classic "inverted L" format, dropping down to about 45 links per page. Again, everything about the site started to perform better: rankings, stickiness, conversions, you name the metric.
VISITORS LIKE IT TOO
Even without SEO concerns, hover menus can be a problem for visitors. For one thing, they cannot see all their options at any one time. For another, it's easy to end up with several links that have the same anchor text but point to different pages. Now that's confusing both for people and for algorithms!
It can be mind-bending work to generate a good Information Architecture -- and even harder work to choose optimal menu labels. It's much easier to grow a site by just slapping more links onto the menu. However, from what I've seen, slim and trim is the way to go with menus. Google seems to like it a lot, and a visitor's first impression of such a site is "I can deal with this", not "sheesh, where do I start?"
I'm not saying you can't succeed with a Mega Menu, only that it can be more problematic. Especially when you've been programmed with the SEO mantra of "links, keywords, anchor text, links, keywords, anchor text" for many years, it's easy to go wrong without even noticing.
Anyone else have experiences with different sizes of menu? Maybe you have results that run counter to mine, or maybe they support mine. I know that we each see only our own particular slices of the total web.
Whatever the case, I'm interested in how my Mega Menu idea looks across many markets.
[edited by: tedster at 3:18 am (utc) on July 2, 2008]
I just checked my home page on a large site that had penalty problems most of which have been lifted. I did Find > href and found 116 links!
(This is due to years of gradual building.)
Then I went to Google Analytics > Content Overview > Site Overlay. I found many of the home page links receive very few clicks (0.1% or lower).
I think I'll look into Website Architecture and try to put together a new navigation structure on the home page. The home page gets many hits from Google so I might keep the text but remove 90% of the links. (Or convert the links into URLs just to keep Google happy. :/)
It's only my home page that has 100+ links. The other pages have no more than 40. (The home page is similar to a sitemap, but I don't have an official sitemap.)
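The manual Find > href check described above can be automated. Here is a minimal sketch using only the Python standard library (the class and function names are my own, chosen for illustration) that counts anchor tags carrying an href attribute:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Example: three anchors, but one has no href, so it isn't counted
page = '<a href="/">home</a><a href="/faq">faq</a><a name="top">anchor</a>'
print(count_links(page))  # -> 2
```

Feed it the saved source of your home page and you get the same number the in-browser Find gives, without the manual counting.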
Design and content guidelines
* Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
* Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
* Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
* Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
* Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.
* Make sure that your <title> elements and alt attributes are descriptive and accurate.
* Check for broken links and correct HTML.
* If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
* Keep the links on a given page to a reasonable number (fewer than 100).
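The sitemap guideline above ("break the site map into separate pages" once it exceeds 100 or so links) can be applied mechanically. A sketch, assuming the site's URLs have already been collected into a flat list (the function name is my own):

```python
def paginate_sitemap(urls, per_page=100):
    """Split a flat list of sitemap links into pages of at most per_page links."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# A 250-link site map becomes three pages: 100 + 100 + 50 links
pages = paginate_sitemap([f"/page-{n}" for n in range(250)])
print(len(pages), [len(p) for p in pages])  # -> 3 [100, 100, 50]
```

Each resulting chunk can then be rendered as its own site map page, keeping every page comfortably under the 100-link guideline.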
Keep the links on a given page to a reasonable number (fewer than 100).
This guideline is easier to violate than it may first appear. I just took a quick look at a site today that was having ranking troubles and discovered that the Home Page had 1901 links! It was rather well laid out and from a casual look at the page you would never have realized that it was so link-heavy.
[edited by: tedster at 5:48 am (utc) on Nov. 26, 2008]
How much can Link Dilution play a part in ranking? If a home page has good potential to help other pages rank better (it's often the best ranking page on a site), but you link out to 1901 pages or whatever, is the link juice passed to each page calculated as just one part in 1901?
I think this idea was once proposed for PR, but I wonder if the same principle applies to ranking, too--both internal and external links. If so, it could explain how reducing links would result in better ranking for major internal pages/categories.
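The classic (textbook) PageRank formulation does divide a page's passable rank evenly among its outlinks, so the dilution idea can be illustrated numerically. This is a sketch of that published model only, not Google's actual ranking code, and the 0.85 damping factor is the conventional assumption:

```python
def per_link_share(page_rank, outlink_count, damping=0.85):
    """Under the classic PageRank model, each outlink receives an equal
    fraction of the page's transferable rank (rank * damping / outlinks)."""
    if outlink_count == 0:
        return 0.0
    return damping * page_rank / outlink_count

# Home page with rank 1.0: compare 1901 outlinks vs. a trimmed-down 50
print(per_link_share(1.0, 1901))  # a tiny share per link
print(per_link_share(1.0, 50))    # roughly 38x more per link
```

Under this model the total passed on is the same either way; trimming from 1901 links to 50 simply concentrates it, giving each remaining page about 38 times the share.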
Too many Home Page links can also contribute to gray bar on the category page. People get so knee-jerk about the idea that a Home Page link will help a new page to rank that they go overboard and destroy the intelligence that was embedded in their original structure.
So trimming it back and restoring a sensible logic to the link structure and architecture helps every signal on the site to become clearer.
Based on that in part I'm not so concerned about losing link juice from the home page to other pages, so I'm cutting back from over 100 HP links to just over 50.
I think you were saying a while back, Tedster, that site architecture is often best when it has a navigation structure of 5-7 user choices. So if you can categorize everything into that few directories, it's a good starting point.
I don't know if I can narrow it down to that many, but I know there are some directories on my site that I can make subdirectories.
I agree with the earlier comments that suggested making primary categories into main directories, and then the rest in subdirectories, where each subdirectory is pretty independent. Then you don't go overboard on the total number of internal links/page.
That seems to be most logical for users, and one could imagine it also makes it easiest for Google's algo to figure out or spider a site. A very "clean" navigation structure with lots of natural layers.
So many webmasters fly with the notion that the more links to deep pages there are on their home page, the more pages will rank. But this simply is not the case. Often I find that websites organized well from the outset, with logical topical steps (much like page headers) and proper breadcrumbing, outperform and are less prone to fluctuations than websites with less organized hierarchies.