
Link Development Forum

    
How many links are allowed per page
DogHealth
msg:3735024 - 4:06 pm on Aug 30, 2008 (gmt 0)

Hi Guys,

I saw Axxe asking the same question in his post on this page [webmasterworld.com...]

Does anybody know the answer to this question? As far as I know, it's best to limit a page to no more than 100 links, which I have strictly implemented on my blog. It's impossible to tell whether that is hurting or helping my blog, since I would consider it pretty new. Do you think it's wise to limit it to 100 per page, or can I put in as many as I want, as long as it's pleasing to my eyes (that's according to one of my friends)? Please advise. :)

[edited by: tedster at 4:25 pm (utc) on Aug. 30, 2008]

 

martinibuster
msg:3735053 - 4:44 pm on Aug 30, 2008 (gmt 0)

>>>As far as I know I think its best...

Do you know where this number (100) came from? A lot of what link builders do is done by following others without knowing why. For instance, the preference for PR 4+ pages. Where did that four come from? Why four?

It's good that you're asking. :) It's worse to follow what others are doing without knowing the reasons for what you do. The number 100 originated from statements Matt Cutts (a Google engineer) made about limiting outgoing links, which were later incorporated into Google's webmaster guidelines [google.com].

Now if you run along and do as Google says, that's the end of the story, isn't it? Personally, I feel that much of what Google publishes consists of suggestions (guidelines), not scripture. So there's room to wriggle if you feel that 101 links is best for your outgoing links or sitemap. :P

[edited by: martinibuster at 4:51 pm (utc) on Aug. 30, 2008]

Lord Majestic
msg:3735054 - 4:47 pm on Aug 30, 2008 (gmt 0)

It's probably not making any serious difference. On average, each page on the Web has about 50 outlinks; that is an average, so some pages will have far fewer and others far more. Spammy pages often have thousands of outlinks, so you definitely don't want that, but if you are thinking of increasing the number of links to 120-150, I doubt it will change anything fundamentally.

The fewer outlinks you have, however, the more PR juice each one of them will get. So by adding more outlinks you reduce the PR passed down each of them.
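Lord Majestic's dilution point can be sketched with the classic PageRank model, in which a page's PageRank (scaled by the damping factor, conventionally 0.85) is split evenly among all of its outlinks. This is an illustration of the textbook formula, not Google's actual (and unpublished) implementation:

```python
def pr_passed_per_link(page_pr, outlinks, damping=0.85):
    """Classic PageRank model: a page passes damping * PR,
    split evenly across all of its outlinks (internal and external)."""
    return page_pr * damping / outlinks

# Going from 100 to 150 outlinks shrinks each link's share by a third.
share_100 = pr_passed_per_link(1.0, 100)
share_150 = pr_passed_per_link(1.0, 150)
```

Under this model the per-link share falls in direct proportion to the link count, which is why a jump from 100 to 120-150 links changes little, while thousands of links leave each one with a tiny sliver.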

Western
msg:3735089 - 6:31 pm on Aug 30, 2008 (gmt 0)

50 outlinks

Where did that number come from, LM?

tangor
msg:3735093 - 6:39 pm on Aug 30, 2008 (gmt 0)

Confess my ignorance... outlinks = offsite links? (I don't have those.) I have several index pages with 100+ internal links and don't seem to have a problem.

Lord Majestic
msg:3735117 - 7:14 pm on Aug 30, 2008 (gmt 0)

Outlinks refers to all outgoing links, including internal ones. :)

Western: this figure is based on our stats from over 30 billion crawled web pages (roughly 1.5 trillion links between all pages in total, which makes 50 on average).
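The average he cites is just the ratio of those two stats:

```python
total_links = 1.5e12    # ~1.5 trillion links seen across the crawl
pages_crawled = 30e9    # ~30 billion crawled pages
avg_outlinks = total_links / pages_crawled  # 50 outlinks per page
```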

DogHealth
msg:3735459 - 3:16 pm on Aug 31, 2008 (gmt 0)

martinibuster - Thanks a lot for replying to my post. I think your answer is what I was looking for, because you were able to merge your personal views with Google's guidelines. I think you're right that Google's guidelines are really just "guiding principles"; that's why I wanted to know people's personal opinions on this subject. And by the way, is PR 4 really the best? Personally I don't want PR 4; if I can have PR 8, that's so much better. :)

Lord Majestic - I actually heard that outbound links, especially to pages with no PR, can harm your site's own PR. Someone even suggested that I remove the archives portion of my blog and reduce the categories. Although sometimes I think I am sacrificing the user-friendliness of my blog for my visitors: if they are visiting my site looking for specific information, it would be best if I could present it, sorted, through my categories and archives. I guess it's right to say that you can't have it all. Anyway, let's see what it does for my PR on the next update. :)

debram
msg:3735728 - 3:29 am on Sep 1, 2008 (gmt 0)

Do you know where this number (100) came from? There is a lot that link builders do that is following without knowing why.

This might help clear up the 100-link issue, from an interview Stephan Spencer did with Matt Cutts:

Matt Cutts: I would recommend that people run experiments, because, if you have 5,000 links on a page, the odds that we would flow PageRank through those is kind of low. We might say at some point: that is just way too many thousands of links. And at some point, your fan-out is so high that the individual PageRank going out on each one of those links is pretty tiny.

I will give you a little bit of background - and I encourage people to run experiments and find what works for them. The reason for the 100 links per page guideline is because we used to crawl only about the first 101 kilobytes of a page. If somebody had a lot more than a hundred links, then it was a little more likely that after we truncated the page at a 100 kilobytes, that page would get truncated and some of the links would not be followed or would not be counted. Nowadays, I forget exactly how much we crawl and index and save, but I think it is at least, we are willing to save half a megabyte from each page.

So, if you look at the guidelines, we have two sets of guidelines on one page. We have: quality guidelines which are essentially spam and how to avoid spam; and we have technical guidelines. The technical guidelines are more like best practices. So, the 100 links is more like a 'best practice' suggestion, because if you keep it under a 100, you are guaranteed you are never get truncated.

stephanspencer.com/search-engines/matt-cutts-interview
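Cutts' truncation explanation is easy to check against a page of your own. Here is a minimal sketch using Python's standard `html.parser`: it counts how many links would fall past a given byte cutoff (the 101 KB figure is the historical limit he describes; the test HTML and the function name are illustrative, not anything Google publishes):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a ... href=...> tags in whatever HTML it is fed."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def links_lost_to_truncation(html, limit_bytes=101 * 1024):
    """How many links sit past the first `limit_bytes` of the page,
    i.e. links a crawler with that cutoff would never see."""
    full, truncated = LinkCounter(), LinkCounter()
    full.feed(html)
    head = html.encode("utf-8")[:limit_bytes].decode("utf-8", "ignore")
    truncated.feed(head)
    return full.count - truncated.count
```

With a cutoff like that in place, keeping the link count modest (and the links early in the document) was what guaranteed none were cut off; per the interview, the modern limit is far larger.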

martinibuster
msg:3735743 - 4:07 am on Sep 1, 2008 (gmt 0)

Debra, you might want to clarify your post. Matt says Google "used to" crawl only the first 101 KB. You state, "because if you keep it under a 100, you are guaranteed you are never get truncated." Do you mean 100 links or 100 KB? In either case, a webmaster would have to post a lot of links to get a page to weigh half a megabyte.

As a side note, the interview is a good example of how SEO myths begin. There's an Indian SEO blog out there from January of this year stating that Google only crawls the first 101 KB.

This is what I meant by my statement, "There is a lot that link builders do that is following without knowing why." The one hundred number had been around for years before Stephan's interview (which is a good one and worth reading). It's something Matt has talked about for a long time, and it has been in Google's webmaster guidelines for years as well. We've been talking about it for over five years; I found threads about this going back to 2002. Here's a thread from January 2003 about the 100-link limit [webmasterworld.com].

debram
msg:3736063 - 5:33 pm on Sep 1, 2008 (gmt 0)

Roger, I didn't do a good job of denoting which portions were quoted and which were not. Let me try again, because almost everything is from Matt Cutts, not me!

Matt Cutts: I would recommend that people run experiments, because, if you have 5,000 links on a page, the odds that we would flow PageRank through those is kind of low. We might say at some point: that is just way too many thousands of links. And at some point, your fan-out is so high that the individual PageRank going out on each one of those links is pretty tiny.

I will give you a little bit of background - and I encourage people to run experiments and find what works for them. The reason for the 100 links per page guideline is because we used to crawl only about the first 101 kilobytes of a page. If somebody had a lot more than a hundred links, then it was a little more likely that after we truncated the page at a 100 kilobytes, that page would get truncated and some of the links would not be followed or would not be counted. Nowadays, I forget exactly how much we crawl and index and save, but I think it is at least, we are willing to save half a megabyte from each page.

So, if you look at the guidelines, we have two sets of guidelines on one page. We have: quality guidelines which are essentially spam and how to avoid spam; and we have technical guidelines. The technical guidelines are more like best practices. So, the 100 links is more like a 'best practice' suggestion, because if you keep it under a 100, you are guaranteed you are never get truncated.

stephanspencer.com/search-engines/matt-cutts-interview

leela
msg:3737149 - 8:33 am on Sep 3, 2008 (gmt 0)

It still says on [google.com...]

"Keep the links on a given page to a reasonable number (fewer than 100)."

and
"If the site map is larger than 100 or so links, you may want to break the site map into separate pages."

So it seems that no matter how much of a page Google currently crawls, they still advise not having more than 100 links on a page.

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved