No one outside of Google will know an "exact number", and there probably isn't an exact number anyway. A page that already ranks strongly will get brownie points for just about anything as far as Googlebot is concerned.
Ignore Google, build pages for your users. Whatever you would consider to be the maximum number of links your users would find useful is probably not far off Google's opinion either.
|too much information|
I don't think it's so much a maximum number of links that Googlebot will crawl as a reasonable number of links per page that Google is looking for.
If your site has 700 or more pages, does it really make sense to put all 700 links on a single site-map page? Or would it be easier to navigate with a directory structure?
I have two pages (one per language, the main entry points for a large part of the content) with about 160 internal-only links, and they are doing very fine: all on the same subject, with similar on-topic keywords as anchors.
Yeah, build the content; content is king.
There was a recent discussion of this (an education for me).
Google will crawl 101 KB of every page, and it will find every link in the first 101 KB of a much larger page. If you can cram hundreds of links into a really cleanly coded 101 KB HTML page, then Google will find them all. The limit is determined by the page size rather than by the number of links.
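If you want to sanity-check that idea against your own pages, here's a rough sketch (my own quick Python hack, assuming the 101 KB figure discussed here): truncate the HTML to the first 101 KB and count the links that survive the cut.

```python
from html.parser import HTMLParser

CRAWL_LIMIT = 101 * 1024  # the 101 KB figure discussed in this thread


class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag fed to it."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def links_within_limit(html: str, limit: int = CRAWL_LIMIT) -> list:
    """Return the links that fall inside the first `limit` bytes of the page."""
    truncated = html.encode("utf-8")[:limit].decode("utf-8", errors="ignore")
    parser = LinkCollector()
    parser.feed(truncated)
    return parser.links


# Example: a small page whose links all sit well inside 101 KB
page = "<html><body>" + "".join(
    f'<a href="/page{i}.html">page {i}</a>' for i in range(125)
) + "</body></html>"
print(len(links_within_limit(page)))  # all 125 links are found
```

Run it on a saved copy of your own big page and you can see exactly which links sit past the 101 KB mark.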
Stefan's logic makes sense.
I just had a site map with 125 links on it, and the links were crawled. The page was about 30 KB.
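Quick back-of-envelope (my own arithmetic, nothing official): at that density, how many links would fit under the 101 KB figure being discussed?

```python
# 125 links in roughly a 30 KB page -> bytes per link, markup included
page_bytes = 30 * 1024
links = 125
bytes_per_link = page_bytes / links        # ~246 bytes per link
fit = int((101 * 1024) // bytes_per_link)  # how many fit in 101 KB at this density
print(fit)  # -> 420
```

So even a fairly chunky site map could hold around 400 links before bumping into that limit.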
Just because something makes sense does not mean it is correct. I have never seen anyone PROVE that Google stops indexing links at 101 KB.
My experience is that Google will go well past that limit.
I could have worded things more clearly. Maybe a better way to say it is that Google will find and use all the links in the first 101 KB of any given page. If you can squeeze hundreds of them in that first 101 KB, they're all found.
ADDED: Polarisman, I believe that the 101 KB limit is definite. There are posts from GG kicking around that indicate it to be true, and it's stated somewhere on the Google site as well. I personally have pages larger than 101 KB that are clipped right at that point.
ADDED AGAIN: Ok, I found a GG post on that:
I thought the bot also had a redundancy factor that increased after 50 or so links, making it more likely to stop following the later ones? The 101 KB limit is a definite, though.
|If your site has 700 or more pages, does it really make sense to put all 700 links on a single site-map page? Or would it be easier to navigate with a directory structure?|
That's a good question. IMHO a site map is a single resource that you use when you have problems navigating the site. Once you start splitting the site map up into sub-maps, you re-introduce navigation - and the purpose of the site map is lost.
Has anyone got an example of a page with 100+ links, PR4 minimum, where I can have a look at whether Google stops following after 101 KB, or after a certain number of links, or at least follows but passes reduced PR?
Feel free to sticky me.
Forget it, I have found one: 330 links, 48 KB, PR passed on to both external and internal links.
I think 100 links per page is not law; it's just a guide for keeping your pages smaller.
|More Traffic Please|
I recently had a similar question. I found Brett's answer interesting.
Yeah, I should have remembered that one...
So is the 101 KB limit a cache limit only? How large a page will Google crawl? I have a 450 KB page that was a list of cave names in alphabetical order, and it was clipped right at 101 KB with respect to the SERPs and the cache. I later chopped it into six smaller pages so that the entire list would appear in the SERPs (which worked fine).
There were only links at the top of that 450 KB page, so I couldn't tell if Google had crawled right through looking for links.
ADDED: I just remembered... I saw the original 450 KB file (left it up along with the chopped-up ones) get crawled in the last few weeks. I'll have to dig into the logs and see how much the bot took.
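For anyone wanting to do the same chop, here's roughly how I'd script it (a sketch, not what I actually ran; it assumes the list renders as simple `<li>` items, and the byte budget is just a parameter):

```python
def split_into_pages(items, max_bytes=100 * 1024):
    """Greedily pack list items into pages whose rendered <li> markup
    stays under max_bytes (a budget inspired by the 101 KB talk above)."""
    pages, current, size = [], [], 0
    for item in items:
        line = f"<li>{item}</li>"
        if current and size + len(line) > max_bytes:
            pages.append(current)   # this page is full; start a new one
            current, size = [], 0
        current.append(item)
        size += len(line)
    if current:
        pages.append(current)       # don't drop the final partial page
    return pages


# 1000 made-up cave names with a deliberately tiny 2 KB budget to show the split
names = [f"Cave {i}" for i in range(1000)]
pages = split_into_pages(names, max_bytes=2 * 1024)
print(len(pages), sum(len(p) for p in pages))
```

Each output page stays under the budget and the alphabetical order is preserved, so every name still lands in some page that fits comfortably inside the limit.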
See also: [webmasterworld.com...]