Forum Moderators: open
:-(
If this rule were true, what would happen to sites like Yahoo and ODP?
I think the 100-per-page rule sounds a bit more realistic, although it's hard to prove. A page with 100 links would be very large, and usability might become an issue.
Mack.
Design and Content Guidelines:
These are the only instances where I see Google suggesting that links on a page be kept under 100.
As pointed out in the discussion referenced above there are a number of web sites that have more than 100 links on a page. Checking some of the links near to or at the bottom of those pages shows that the links are indexed as backlinks for the pages they link to.
Is this suggestion aimed at new web sites/pages? Googlebot seems limited in how deep it will crawl on new websites. Do smaller pages with fewer links increase the chances of a deeper crawl?
- Keep the links on a given page to a reasonable number (fewer than 100). "
It is not a myth. Whether Google is acting on these guidelines is another question.
added: coconutz you beat me to it.
1) A page with more than 100 links isn't likely to be a good page for users, and...
2) Google may not crawl more than 100 links on a page. (They aren't saying, but why take a chance?)
1) A page with more than 100 links isn't likely to be a good page for users, and...
2) Google may not crawl more than 100 links on a page. (They aren't saying, but why take a chance?) "
===
Spot on.
Although...
One of my sites is listed in a reputable catalogue, in a single page containing about 300 links.
1) The links have anchors and so are easy to navigate
2) Google definitely crawls at least 180 links, coz that's where I am.
Nonetheless, Google's advice is sound - a monolithic dollop of link stuff is, in most cases, an unstructured monolithic dollop.
But this seems to be a style-guide hint, not a ranking hint.
Regards
DerekH
PS - europeforvisitors - let's keep europeforus - it's better that way <grin>
I would still keep the number of links on a page to not more than 100. (See msg 11 on that thread.)
I agree, as a general rule, but it's clear that links beyond #100 are still spidered, and still count.
While the PR passed is undoubtedly diminished by being divided among so many links, it seems enough PR is still passed for such long link lists to count as PR4+ backlinks. OTOH, the PR4 backlink rule could be applied to the linking page itself, regardless of the level (if any) of PR being passed to the linked page, but I tend to believe that Google wouldn't show backlinks that had no value.
I agree, as a general rule, but it's clear that links beyond #100 are still spidered, and still count.... but I tend to believe that Google wouldn't show backlinks that had no value.
Just because a link appears as a backlink, this doesn't mean that PR is passed. I think GoogleGuy confirmed this in the context of guestbook links. (Presumably BestBBs is another example for such a behaviour.)
While the PR passed is undoubtedly diminished by being divided among so many links ...
Of course, this effect was taken into account. (and it was already discussed in that thread.)
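The dilution effect being discussed follows directly from the classic PageRank formula, in which a page's PR is split evenly among its outbound links before the damping factor is applied. A minimal sketch (the function name and the example PR values are illustrative, not anything Google publishes):

```python
# Sketch of per-link PageRank share under the original PageRank model:
# each outbound link passes d * PR(page) / n_links, so adding links to
# a page shrinks the share that every individual link receives.

DAMPING = 0.85  # damping factor "d" from the original PageRank paper


def pr_passed_per_link(page_pr: float, n_links: int) -> float:
    """PageRank contribution each outbound link on the page receives."""
    if n_links == 0:
        return 0.0
    return DAMPING * page_pr / n_links


# The same page with 300 links passes one third as much per link
# as it would with 100 links:
print(pr_passed_per_link(1.0, 100))  # roughly 0.0085
print(pr_passed_per_link(1.0, 300))  # roughly 0.00283
```

This is why a 300-link directory page can still pass *some* value per link, just a third of what a 100-link page with the same PR would pass.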
I'm pretty sure Google's guideline about 100+ links is for cosmetic reasons and that, at the very worst, Googlebot screeches to a stop and reverses home at link #100.
Does anyone have examples of a directory with over 100 links on one page that have been crawled?
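For anyone wanting to check their own pages against the guideline, counting the anchor tags is straightforward with Python's standard-library HTML parser. A self-contained sketch (the sample HTML is synthetic; feed in real page source however you fetch it):

```python
# Count <a href="..."> tags on a page to check it against the
# "fewer than 100 links" guideline.
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Counts anchor tags that actually carry an href attribute."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1


# Synthetic example page with 120 links (stand-in for fetched HTML):
html = "<ul>" + "".join(
    f'<li><a href="/page{i}">link {i}</a></li>' for i in range(120)
) + "</ul>"

counter = LinkCounter()
counter.feed(html)
print(counter.count)        # 120
print(counter.count > 100)  # True -- over the guideline
```

This only counts links; whether Googlebot actually stops at #100 is exactly the open question in this thread.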