How Many Links Will Googlebot Crawl on a Page?

   
4:23 am on Jan 12, 2004 (gmt 0)

10+ Year Member



Can anyone suggest how many links on a page will actually be crawled by Googlebot and indexed by Google?
Someone said 100 is the most, but I don't know if there is an accurate number.
Thanks
8:01 pm on Jan 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No one outside of Google will know an "exact number", and there probably isn't an exact number anyway. A high-ranking page will always get some extra brownie points as far as Googlebot is concerned.

Ignore Google, build pages for your users. Whatever you would consider to be the maximum number of links your users would find useful is probably not far off Google's opinion either.

8:12 pm on Jan 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think it's so much a maximum number of links that Googlebot will crawl as a reasonable number of links per page that Google is looking for.

If your site has 700 or more pages, does it really make sense to put all 700 links on a one-page site map? Or would it be easier to navigate with a directory structure?

10:39 pm on Jan 12, 2004 (gmt 0)

10+ Year Member



I have two pages (one per language, the main entry points for a large part of the content) with about 160 internal-only links each. They are doing very fine, all on the same subject, with similar on-topic keywords as anchors.
2:02 am on Jan 13, 2004 (gmt 0)

10+ Year Member



Yeah, building the content is king.
Thanks all.
2:24 am on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There was a recent discussion of this, (an education for me).

Google will crawl 101 KB of every page. Google will find every link in the first 101 KB of a much larger page. If you can cram hundreds of links into a really cleanly coded 101 KB HTML page, then Google will find them all. The limit is determined by the page size rather than the number of links.
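If you want to see where that cutoff would fall on one of your own pages, here is a rough sketch in Python that counts how many <a href> links appear in the first 101 KB of the HTML. The 101 KB figure is just the one discussed in this thread, not a documented limit, and the URL is made up for illustration.

```python
# Rough sketch: how many <a href> links fall within the first 101 KB of a page.
# The 101 KB cutoff and the example URL are assumptions taken from this thread.
from html.parser import HTMLParser
from urllib.request import urlopen

FETCH_LIMIT = 101 * 1024  # the 101 KB cutoff discussed above

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.links += 1

def links_in_first_101kb(url):
    html = urlopen(url).read()                    # full page as bytes
    head = html[:FETCH_LIMIT].decode("utf-8", errors="ignore")
    counter = LinkCounter()
    counter.feed(head)
    return len(html), counter.links

size, links = links_in_first_101kb("http://www.example.com/sitemap.html")
print(f"page is {size} bytes; {links} links appear in the first 101 KB")
```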

3:17 am on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Stefan's logic makes sense.

I just had a site map with 125 links on it and the links were crawled. The page was about 30k.

4:18 am on Jan 13, 2004 (gmt 0)

10+ Year Member



Just because something makes sense does not mean it is correct. I have never seen anywhere that anyone has PROVEN that Google stops indexing links at 101 KB.

My experience is that Google will go well past that limit.

4:20 am on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Respect, MrSpeed.

I could have worded things more clearly. Maybe a better way to say it is that Google will find and use all the links in the first 101 KB of any given page. If you can squeeze hundreds of them in that first 101 KB, they're all found.

ADDED: Polarisman, I believe that the 101 KB limit is definite. There are posts from GG kicking around that indicate it to be true, and it's stated somewhere on the Google site as well. I personally have pages larger than 101 KB that are clipped at that point.

ADDED AGAIN: Ok, I found a GG post on that:

[webmasterworld.com...]

MSG #11

5:23 am on Jan 13, 2004 (gmt 0)

10+ Year Member



I thought the bot also had a redundancy factor that increased after 50 or so links, making it more likely to stop following the later ones? The 101 KB limit is definite, though.
9:53 am on Jan 13, 2004 (gmt 0)

10+ Year Member



If your site has 700 or more pages, does it really make sense to put all 700 links on a one-page site map? Or would it be easier to navigate with a directory structure?

That's a good question. IMHO a site map is a single resource that you use when you have problems navigating the site. Once you start splitting the site map up into sub-maps, you re-introduce navigation - and the purpose of the site map is lost.
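For what it's worth, here is a quick back-of-the-envelope check in Python of whether a single-page site map for a given number of links would stay under the 101 KB figure mentioned earlier. The per-link and boilerplate byte counts are assumptions for illustration, not measurements from any real site.

```python
# Back-of-the-envelope check: will a one-page site map with N links stay under 101 KB?
# Byte figures below are assumptions for illustration only.

LIMIT = 101 * 1024          # the 101 KB cutoff discussed earlier in the thread
BOILERPLATE = 4 * 1024      # assumed fixed HTML (head, header, footer) in bytes
BYTES_PER_LINK = 120        # assumed size of one <li><a href="...">anchor</a></li> entry

def sitemap_size(num_links):
    """Estimated HTML size, in bytes, of a single-page site map."""
    return BOILERPLATE + num_links * BYTES_PER_LINK

for n in (125, 330, 700):
    size = sitemap_size(n)
    status = "fits within" if size <= LIMIT else "exceeds"
    print(f"{n} links ~ {size / 1024:.0f} KB, {status} the 101 KB cutoff")
```

Under those assumptions, even 700 links come out at roughly 86 KB, so a lean single-page site map could still fall entirely inside the first 101 KB.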

12:45 pm on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Has anyone got an example of a page with 100+ links, PR4 minimum, where I can have a look at whether Google stops following after 101 KB, or after a certain number of links, or at least follows but passes reduced PR?

Feel free to sticky me.

9:31 pm on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Forget it, I have found one: 330 links, 48 KB, PR passed on to both external and internal links.

I think 100 links per page is not a law; it's just a guideline for keeping your pages smaller.

9:57 pm on Jan 13, 2004 (gmt 0)

10+ Year Member



I recently had a similar question. I found Brett's answer interesting.

[webmasterworld.com...]

10:07 pm on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, I should have remembered that one...

So is the 101 KB limit a cache limit only? How large a page will Google crawl? I know that I have a 450 KB page that was a list of cave names, in alphabetical order, that was clipped right at 101 KB with regard to the SERPs and cache. I later chopped it into six smaller pages so that the entire list would appear in the SERPs (which worked fine).

There were only links at the top of that 450 KB page, so I couldn't tell if Google had crawled right through looking for links.

ADDED: I just remembered... I saw the original 450 KB file (left it up along with the chopped-up ones) get crawled in the last few weeks. I'll have to dig into the logs and see how much the bot took.
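If anyone wants to do the same check, here is a rough Python sketch of that log dig. It assumes an Apache-style combined log format, and the log file name and URL path are made up for illustration; the bytes field only shows what the server sent, which is still a decent clue as to whether the bot pulled the whole 450 KB.

```python
# Sketch of the log check mentioned above: how many bytes went to Googlebot for one URL?
# Assumes Apache/NCSA combined log format; "access.log" and "/cave-names.html" are
# placeholder names, not real paths from this thread.
import re

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_fetches(logfile, path):
    """Yield (status, bytes_sent) for every Googlebot request to the given path."""
    for line in open(logfile, encoding="utf-8", errors="ignore"):
        m = LOG_LINE.match(line)
        if not m:
            continue
        if "Googlebot" in m.group("agent") and m.group("path") == path:
            size = m.group("bytes")
            yield m.group("status"), 0 if size == "-" else int(size)

for status, size in googlebot_fetches("access.log", "/cave-names.html"):
    print(f"status {status}, {size} bytes sent to Googlebot")
```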

10:57 pm on Jan 13, 2004 (gmt 0)

WebmasterWorld Senior Member g1smd, WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors Of The Month



See also: [webmasterworld.com...]
 
