
Google SEO News and Discussion Forum

    
Google's "100 links per page" guideline
Gemini23
5:52 pm on Dec 24, 2008 (gmt 0)

I believe Google says no more than 100 'outlinks' per page, but I'm not sure whether that means links to external sites, links to other pages within the domain, or whether it matters which. Or should it simply be no more than 100 links, period?

[edited by: tedster at 3:24 pm (utc) on Dec. 26, 2008]
[edit reason] moved from another location [/edit]

 

tedster
6:52 pm on Dec 24, 2008 (gmt 0)

Design and content guidelines...

  • Keep the links on a given page to a reasonable number (fewer than 100).

    Google Webmaster Guidelines [google.com]


    santapaws
    11:31 am on Dec 25, 2008 (gmt 0)

    But it only says to keep below 100 links for site maps:

    "Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages."

    tedster
    1:49 pm on Dec 25, 2008 (gmt 0)

    That's in a different spot, santapaws. The quote I gave is a content guideline, not just a sitemap suggestion.

    santapaws
    10:15 am on Dec 26, 2008 (gmt 0)

    Duh, yeah, I see it now. This is something I really have a hard time with. It's one thing to say "we (Google) have formed a view on how to crawl, and we may not crawl all 100 links on a page." It's a totally different thing to take a subjective view and make it a hard and fast part of a quality statement. Why is 100 links the cut-off point for a quality signal? Why is going above 100 UNreasonable? Would Wikipedia exist if it were run by SEOs who lived or died by the Google guidelines? I would like Google to expand on how they arrived at 100 as the point of no return. I can't help feeling it was derived more from crawl limitations and fine-tuning than from real-world fact.

    tedster
    1:18 pm on Dec 26, 2008 (gmt 0)

    I don't think there's anything "hard and fast" about this "Design and content guideline". I see it as a helpful suggestion for creating pages that do well in Google Search. One hundred links is not a point of no return, but rather a rule of thumb to help webmasters create a page that is 1) more usable for visitors 2) more likely to be measured as relevant for the right keywords.

    The "Quality guidelines" further down the page that I linked to are more like hard and fast rules to follow. As Google writes there, "These quality guidelines cover the most common forms of deceptive or manipulative behavior."

    santapaws
    3:16 pm on Dec 26, 2008 (gmt 0)

    Except that they do actually state that 100 is reasonable, and thus by association going over 100 is not reasonable. That I do not follow. I can only come back to one of the web's most used websites, Wikipedia. How does that 100-link logic fit with real-world cases? I'm sorry, but you can only take a less favourable view of pages with more than 100 links if you have made links the core basis of the algorithm. So, as I said, I take this guideline more as a guide to fitting in with Google's algorithm and crawling than as a real-world guideline on how to make a website users will enjoy using.
    I am not knocking Google; I love Google; without it I would need to get a life. On this particular issue, however, I believe they have got it wrong.

    Yes, I know you are saying they are just helping webmasters make their websites rank better in the searches, but this can only be the case if more than 100 links are having a negative effect.

    tedster
    3:43 pm on Dec 26, 2008 (gmt 0)

    Google is giving us a helpful guideline here. They're not telling us that they've decided to look down on pages with more than 100 links. Wikipedia is clearly one example of a site where the "limit" does not always cause problems. Other exceptions are easy to find as well.

    And yet, the evidence emerges in practice over many urls (and not just with Google) that search engines can do a better job when the number of links on a page is kept lower. It's a practical thing - an artifact of Information Retrieval technology today - rather than a value judgement of any kind.

    this can only be the case if more than 100 links are having a negative effect.

    And indeed that is what I often see. Semantic confusion and poor search relevance are the risks, especially for keywords that are not in any backlink anchor text for the url.

    Google is not intentionally taking "a less favourable view" of such pages - this guideline is not coming out of an intentional decision. As I described above, it's a kind of artifact, showing the limitation of current IR technology.

    We have a thread that discusses this: The Mega Menu Problem [webmasterworld.com], linked from our Hot Topics area [webmasterworld.com], which is always pinned to the top of this forum's index page.

    santapaws
    4:05 pm on Dec 26, 2008 (gmt 0)

    Yes, then I agree: it has more to do with current limitations and conflicts on the search engine side. But I'm not sure this is presented correctly in the guidelines. What I find now is that pages with more than 100 links are being called spammy by less-than-knowledgeable website owners and webmasters, all because Google has named 100 links as the tipping point. Mega menus whose usability has been proven over time are being broken up at owners' request just to stay under this 100 limit. I always used to come at the design angle from what I like to see on a page as a user; you can no longer do that. I'd just be a little happier if Google could make it a little clearer why 100 was picked as the recommendation for their guidelines.

    tedster
    5:19 pm on Dec 26, 2008 (gmt 0)

    In addition to the IR challenge that having many links creates, there's also another very practical side. If you take a page with lots of links and do some tracking on which links are drawing clicks from visitors, it usually becomes clear very quickly that many of those links are not something visitors use.

    There's a principle I learned before the web came around that seems to carry over to websites quite well: too many choices often generate no choice at all.

    In my view, a website is a way of packaging information. Making that package intuitive and usable is the crux of the matter. This all points to Information Architecture - a sometimes challenging area, yes, but one that is well worth the effort, in my experience. A good Information Architecture keeps all kinds of users happy, both carbon-based intelligence and silicon-based.

    Quadrille
    5:26 pm on Dec 26, 2008 (gmt 0)

    Why would you want more than 100 links on a page?

    As a visitor, I don't see the point; as a webmaster, my immediate instinct would be to subdivide the page for fear of confusing / losing the visitors.

    And even more than that, if the pages were a template design, I'd fear a duplicate content problem.

    Yes, I'm sure there are exceptions; but 100 is more than reasonable, I reckon!

    pageoneresults
    5:45 pm on Dec 26, 2008 (gmt 0)

    But rather a rule of thumb to help webmasters create a page that is 1) more usable for visitors 2) more likely to be measured as relevant for the right keywords.

    That's the key point I've taken away from these discussions. As one of those users myself, I can tell you that pages loaded with links are somewhat difficult to traverse.

    I'm sticking with the early 1990s approach of keeping things simple. It's back to the basics. ;)

    Gemini23
    5:54 pm on Dec 26, 2008 (gmt 0)

    Personally I think that it really does depend on the type of website that you are providing and what service it gives. I have recently reduced my number of links on the homepage BUT I am not convinced that was THE reason for a recent overall climb in rankings (still mulling that over).

    Out of interest, for one popular key phrase that I target, the outlink counts on the top ten websites are as follows:
    1. 122
    2. 121
    3. 165
    4. 160
    5. 201
    6. 134
    7. 145
    8. 340
    9. 132
    10. 161

    And how about this: no. 100 has 527 outlinks! (Maybe that is why they are no. 100!)

    Is there a pattern in the top ten? I don't think so. Could the structure and quality of the linking be more important than the number of links? Or how about the ratio between outlinks and inlinks?
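
    For anyone who wants to reproduce that kind of count, here is a minimal sketch using only the Python standard library. It is naive by design - it counts every <a href> it sees, internal or external - and the URLs are placeholders rather than the sites behind the list above.

        # Count <a href> anchors on each page (Python 3, standard library only).
        from html.parser import HTMLParser
        from urllib.request import urlopen

        class LinkCounter(HTMLParser):
            """Counts every anchor tag that carries an href attribute."""
            def __init__(self):
                super().__init__()
                self.count = 0

            def handle_starttag(self, tag, attrs):
                if tag == "a" and any(name == "href" for name, _ in attrs):
                    self.count += 1

        def count_links(url):
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
            parser = LinkCounter()
            parser.feed(html)
            return parser.count

        # Placeholder URLs - substitute the top-ranking pages for your own phrase.
        for url in ["https://www.example.com/", "https://www.example.org/"]:
            print(url, count_links(url))

    A raw count like this lumps navigation, footer and body links together, which is one reason totals such as those above are hard to compare across sites.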

    Quadrille
    6:12 pm on Dec 26, 2008 (gmt 0)

    And how many of those links actually get used?

    More to the point, how many NEVER get used, and therefore could be safely removed?

    I'm with P1R... keep it simple; why look for trouble?

    ken_b
    6:23 pm on Dec 26, 2008 (gmt 0)

    So when do you fix what doesn't seem to be broken?

    I've got a page that has grown to over 1,000 links over the years. These are almost all simple links with the organization name as anchor text, and no annotation other than city & state.

    Still maintains its TBPR. Still attracts link requests.

    Still ranks in the top 10 for its topic (2 and 3 word terms).
    Still one of my more popular pages.

    Easy enough to use via on page navigation.

    Could easily be broken into 52 separate pages.

    I know times change, but with this page I keep thinking back to the "if it ain't broke, don't fix it" concept.

    What to do.... choices....choices....choices!

    tedster
    7:17 pm on Dec 26, 2008 (gmt 0)

    I agree with ken_b: if it ain't broke, don't fix it, especially not because of some theoretical ideal. This kind of "list of things" link page can be very helpful to the end user - and you probably don't care whether this url itself gets search traffic anyway, which is the main point of the guideline.

    I work with a few sites that have internal link lists like this - they're often in alphabetic or geographic order. I just checked a few of them so my comments could be up to date. What I discovered is that, no, these urls do not rank well, even the ones that haven't yet gone graybar. But the pages are a convenience to the user, and the target pages for these links DO rank well, which is what I really care about.

    That said, I've recently come across a site that included a link list utility right in the template of every page. That utility list alone contained several THOUSAND links, and the site had next to no Google traffic. Making all those links nofollow had a dramatic effect on improving search traffic.

    So it's a guideline. It's good to keep in mind, and it's also good to know when and why you didn't follow it to the letter. Being unconscious of what you are doing can create trouble and mystery.

    santapaws
    7:33 pm on Dec 26, 2008 (gmt 0)

    This really is the point I was making: sites that have done very well with large menus or index/list-of-things pages are being cannibalised, not for the user but to follow this 100-links guideline. Perhaps if it were made clearer that this has less to do with spam and more to do with the way the engines crawl, some panicky owners would be less hasty to make less user-friendly changes. My point was also that it only makes sense for SEO because that's how it's factored into the algorithm. It really isn't something to slap someone down with just because they like to go over 100 links. And these pages don't have to look cluttered if the links are part of a mega menu, thanks to drop-down lists, hidden menus and the like.

    I'm sticking with the early 1990s approach of keeping things simple. It's back to the basics. ;)

    Didn't they also say a page should never be more than 2 clicks away?

    tedster
    8:17 pm on Dec 26, 2008 (gmt 0)

    Didn't they also say a page should never be more than 2 clicks away?

    That particular suggestion was for the end user, not for search engines. It was tested in 2003 (here's our thread about that [webmasterworld.com]) and found not to be true at all.

    That's a classic example of the value of testing - and most webmasters had never actually tested it, they just bought into it. Just so, this 100 link guideline should be tested to see how it applies in your case. If you are not seeing any ranking problems with the current number of links you have on a page, then why make a change? In essence, you already have a test case that is working!

    santapaws
    8:22 pm on Dec 26, 2008 (gmt 0)

    Hey Ted, this isn't an issue for me. I just have a view on it, having seen how others I work alongside have reacted to it. People tell me how it's bad to have more than 100 links, so I ask why, and they tell me it's because Google says so!

    Quadrille
    10:27 pm on Dec 26, 2008 (gmt 0)

    ... well, there is the visitor confusion / interest factor to consider first; if the vast number of links requires a huge page, that's a good reason to review it.

    While there is no guideline (and no real consensus) on 'perfect' page size, there's a fair bit of evidence that too small is not good (visitors get repetitive click frustration, SEs may get duplicate content issues) and too big is not good (visitors get repetitive scroll damage, and SEs simply get bored and cannot focus on what YOU think is important).

    None of this is about rules; much isn't even about guidelines. A lot of it is simply what 'looks right', what 'feels right', and what works. (translation: common sense!).

    And common sense says you don't have to mindlessly follow Google guidelines, but if you choose to diverge - as most of us do from time to time - you have to accept that there may be consequences. Or not ;)

    Gemini23
    11:03 pm on Dec 26, 2008 (gmt 0)

    My main landing page is still my homepage, and as such it gets found by certain search terms which then lead to a 'buy' page. So it is quite common for visitors not to use ANY of the 100+ links on the page, although they are there for those few customers who require the additional information down those links, with 'buy' buttons on those pages.

    The number of links does, I believe, also have another effect, in that it promotes all the other pages being linked to and raises their SERPs (given that the link structure of the website is sound). For example, one page linked from the homepage doesn't get many visitors directly from the homepage, BUT that page is ranked no. 4 in Google for a term that gets searched for. I believe that, as well as good content, it is the site links that help that page. BUT I still have my 'L' plates on as far as SEO goes (and haven't got the car out of the garage yet)!

    wwsn
    3:18 am on Dec 27, 2008 (gmt 0)

    Do nofollow links count towards this "100 links" guideline as well?

    pageoneresults
    3:23 am on Dec 27, 2008 (gmt 0)

    Welcome to WebmasterWorld wwsn!

    Do nofollow links count towards this "100 links" guideline as well?

    After a recent discussion on the rel="nofollow" attribute, I'm inclined to believe the answer is "no" - anything tagged with nofollow is removed from the equation.

    But the bot still has to parse the a href in order to process the nofollow attribute, doesn't it?

    I'm so confused... ;)
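
    To make the bookkeeping concrete, here is a small sketch that splits a page's anchors into followed and nofollow buckets. It only illustrates how such a count could be kept; whether Google actually excludes nofollow links from its 100-link figure is exactly the open question in this thread.

        # Split anchors into followed vs rel="nofollow" (Python 3, standard library only).
        from html.parser import HTMLParser

        class FollowSplitter(HTMLParser):
            """Tallies anchors with an href, split by presence of rel="nofollow"."""
            def __init__(self):
                super().__init__()
                self.followed = 0
                self.nofollowed = 0

            def handle_starttag(self, tag, attrs):
                if tag != "a":
                    return
                attrs = dict(attrs)
                if "href" not in attrs:
                    return
                rel_tokens = (attrs.get("rel") or "").lower().split()
                if "nofollow" in rel_tokens:
                    self.nofollowed += 1
                else:
                    self.followed += 1

        sample = '<a href="/a">A</a> <a href="/b" rel="nofollow">B</a> <a href="/c">C</a>'
        splitter = FollowSplitter()
        splitter.feed(sample)
        print("followed:", splitter.followed, "nofollow:", splitter.nofollowed)
        # followed: 2 nofollow: 1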

    nealrodriguez
    3:41 pm on Mar 18, 2009 (gmt 0)

    I think Cutts has settled this argument in an article I cited in this thread:

    [webmasterworld.com...]

    Robert Charlton
    5:06 am on Mar 19, 2009 (gmt 0)

    Here's the referenced Matt Cutts article:

    How many links per page?
    [mattcutts.com...]

    Matt goes into the history of this guideline, which is that the number 100 "seemed right" for the 100 Kb size limitation Google also recommended. Google continues the recommendation now, even though it indexes more than 100 Kb, because of user experience considerations and the "minuscule amount of PageRank" each link is likely to pass along.

    Matt also takes pains to point out that a link heavy page isn't automatically considered spam, but hidden links or "keyword-stuffed" links are considered spammy.

    I've had pages like ken_b's with a huge number of links that worked very well, both for search and for users. The site was a directory style site, geographically and alphabetically organized. We provided some white space on these pages and segmented them for easy user scanning.

    But, I should note, this was also a high-PR site with a well-organized hierarchy, and these link pages were at the bottom of the chain, pointing to very specific resources.

    [edited by: Robert_Charlton at 5:11 am (utc) on Mar. 19, 2009]

    norton j radstock
    5:48 am on Mar 19, 2009 (gmt 0)

    A couple of thoughts on this: if you stick to 100 links per page and combine it with the two-click rule to any page, then that potentially gives 100 x 100 x 100 pages. One million pages is an awful lot of content and should satisfy all but the most significant websites.
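
    A quick sketch of that arithmetic, under idealized assumptions (every link unique and internal, which no real site manages): 100 x 100 x 100 corresponds to the pages first reached at the third level of linking below the homepage.

        # Back-of-the-envelope only: assumes exactly 100 unique internal links per page
        # and no two pages linking to the same target.
        links_per_page = 100

        for clicks in range(1, 4):
            print(f"pages first reached at click {clicks}: {links_per_page ** clicks:,}")

        total = sum(links_per_page ** d for d in range(1, 4))
        print(f"reachable within 3 clicks of the homepage: {total:,}")  # 1,010,100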

    This discussion also made me check one of my sites, which I was surprised to find had over 30 internal links per page just for site navigation. I don't regard it as particularly heavily linked, but if internal links are counted in the recommendation, then this significantly reduces the number of external links you can have per page.

    tedster
    6:48 am on Mar 19, 2009 (gmt 0)

    combine it with the two-click rule to any page

    Your calculations are right on. Add in the fact that the 2-click rule (or the 3-click rule) has been pretty well discredited by solid research, and you've removed almost every excuse there is.

    Shaddows
    9:18 am on Mar 19, 2009 (gmt 0)

    then that potentially gives 100 x 100 x 100 pages. One million pages is an awful lot of content

    Your calculations are right on

    Not to disagree here, but if you're saying you can ONLY get to each individual page from its parent or daughter, that's pretty restrictive. All but the most 'academic' drill-down-for-more-data type sites would suffocate and die under that structure.

    [edited by: Shaddows at 9:18 am (utc) on Mar. 19, 2009]

    misterjinx
    2:10 pm on Mar 25, 2009 (gmt 0)

    In an interview from over a year ago, "Questions and answers with Google's spam guru", Matt Cutts said that you can have over 100 links per page, but that it's not useful for the user.

    He then explained that there was initially a crawling problem, because the spider ignored everything beyond roughly the first 100 KB of a web page, but that problem has since been solved.

    I'll add two personal notes.

    The first is about mega menus. I have seen an evolution since September 2008: they now attract a (usually) algorithmic -40 penalty.
    You can recover quickly, in about 3 days.

    The second note is a question: is there perhaps a relationship between PageRank and mega menus? Trusted, well-visited websites, like those of well-known internet or computer companies, have a lot of links but no penalization problems.

    Finally, Matt Cutts declared today that Google recognizes many patterns in web design, so I think we should take note of this.

    A recent Google patent, published just yesterday, is about identifying web spam pages and evaluating links.

    So I think a pattern exists in Google, and it probably sounds something like: "if a web page has many links arranged in an unordered list, it is better to consider it spam".
