
Google News Archive Forum

    
How Many Links Will Googlebot Crawl on a Page?
mikeshen
4:23 am on Jan 12, 2004 (gmt 0)

Can anyone suggest whether there is a limit on how many links on a page will be crawled by Googlebot and indexed by Google?
Someone said 100 is the most, but I don't know if there is an accurate number.
Thanks

 

dmorison
8:01 pm on Jan 12, 2004 (gmt 0)

No one outside of Google knows an "exact number", and there probably isn't an exact number anyway. A page that already ranks very well will always get brownie points from Googlebot.

Ignore Google, build pages for your users. Whatever you would consider to be the maximum number of links your users would find useful is probably not far off Google's opinion either.

too much information
8:12 pm on Jan 12, 2004 (gmt 0)

I don't think it's so much a maximum number of links that Googlebot will crawl as a reasonable number of links per page that Google is looking for.

If your site has 700 or more pages, does it really make sense to put all 700 links on a one-page site map? Or would it be easier to navigate with a directory structure?
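If you do go the directory-structure route, splitting a flat URL list into sub-maps is easy to script. Here is a rough sketch in Python (the urls.txt input, the sitemap_NN.html output names, and the 100-links-per-page figure are just placeholders based on the rule of thumb discussed in this thread):

# Minimal sketch: split a flat list of URLs into sub-map pages
# of 100 links each. "urls.txt" and the sitemap_NN.html names
# are hypothetical placeholders.

CHUNK_SIZE = 100  # links per sub-map page

def write_sub_maps(urls, chunk_size=CHUNK_SIZE):
    for page_num, start in enumerate(range(0, len(urls), chunk_size), 1):
        chunk = urls[start:start + chunk_size]
        items = "\n".join('<li><a href="%s">%s</a></li>' % (u, u) for u in chunk)
        html = "<html><body><ul>\n%s\n</ul></body></html>" % items
        with open("sitemap_%02d.html" % page_num, "w") as f:
            f.write(html)

if __name__ == "__main__":
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    write_sub_maps(urls)

Point your main site map at sitemap_01.html, sitemap_02.html and so on, and each page stays well below any plausible size or link-count threshold.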

bull
10:39 pm on Jan 12, 2004 (gmt 0)

I have two pages (one per language, the main entry points for a large part of the content) with about 160 internal-only links each. They are doing very fine, all on the same subject with similar on-topic keywords as anchors.

mikeshen
2:02 am on Jan 13, 2004 (gmt 0)

Yeah, content is king.
Thanks all.

Stefan
2:24 am on Jan 13, 2004 (gmt 0)

There was a recent discussion of this (an education for me).

Google will crawl 101 KB of every page, and it will find every link in the first 101 KB of a much larger page. If you can cram hundreds of links into a really cleanly coded 101 KB HTML page, then Google will find them all. The limit is determined by the page size rather than by the number of links.
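If you want to check where your own links sit relative to that mark, here is a rough sketch (standard-library Python; the URL is a placeholder and 101 KB is just the figure discussed in this thread) that fetches a page and counts how many <a href> tags begin within the first 101 KB of the raw HTML:

# Minimal sketch: count how many links start within the first 101 KB
# of a page's raw HTML. The URL below is a placeholder.
import re
import urllib.request

LIMIT = 101 * 1024  # the 101 KB figure discussed in this thread
LINK_RE = re.compile(rb'<a\s[^>]*href=', re.IGNORECASE)

def links_within_limit(url, limit=LIMIT):
    with urllib.request.urlopen(url) as resp:
        html = resp.read()
    total = len(LINK_RE.findall(html))
    within = len(LINK_RE.findall(html[:limit]))
    return len(html), total, within

if __name__ == "__main__":
    size, total, within = links_within_limit("http://www.example.com/sitemap.html")
    print("Page size: %d bytes" % size)
    print("Links: %d total, %d within the first %d bytes" % (total, within, LIMIT))

A link whose tag straddles the 101 KB boundary is counted as outside it, which is close enough for a rough check.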

MrSpeed
3:17 am on Jan 13, 2004 (gmt 0)

Stefan's logic makes sense.

I just had a site map with 125 links on it, and the links were crawled. The page was about 30 KB.

Polarisman
4:18 am on Jan 13, 2004 (gmt 0)

Just because something makes sense does not mean it is correct. I have never seen anyone prove that Google stops indexing links at 101 KB.

My experience is that Google will go well past that limit.

Stefan
4:20 am on Jan 13, 2004 (gmt 0)

Respect, MrSpeed.

I could have worded things more clearly. Maybe a better way to say it is that Google will find and use all the links in the first 101 KB of any given page. If you can squeeze hundreds of them into that first 101 KB, they're all found.

ADDED: Polarisman, I believe that the 101 KB limit is definite. There are posts from GG kicking around that indicate it is true, and it's stated somewhere on the Google site as well. I personally have pages larger than 101 KB that are clipped at exactly that point.

ADDED AGAIN: Ok, I found a GG post on that:

[webmasterworld.com...]

MSG #11

trimmer80
5:23 am on Jan 13, 2004 (gmt 0)

I thought the bot also had a redundancy factor that increases after 50 or so links, making it more likely to stop following the later ones? The 101 KB limit is definite, though.

Hagstrom
9:53 am on Jan 13, 2004 (gmt 0)

If your site has 700 or more pages, does it really make sense to put all 700 links on a one-page site map? Or would it be easier to navigate with a directory structure?

That's a good question. IMHO a site map is a single resource that you use when you have problems navigating the site. Once you start splitting the site map up into sub-maps, you re-introduce navigation, and the purpose of the site map is lost.

nippi
12:45 pm on Jan 13, 2004 (gmt 0)

Has anyone got an example of a page with 100+ links, PR4 minimum, where I can have a look at whether Google stops following after 101 KB, or after a certain number of links, or at least follows but with reduced PR?

Feel free to sticky me.

nippi
9:31 pm on Jan 13, 2004 (gmt 0)

Forget it, I have found one: 330 links, 48 KB, PR passed on to both external and internal links.

I think 100 links per page is not a law, it's just a guideline for keeping your pages smaller.

More Traffic Please
9:57 pm on Jan 13, 2004 (gmt 0)

I recently had a similar question. I found Brett's answer interesting.

[webmasterworld.com...]

Stefan
10:07 pm on Jan 13, 2004 (gmt 0)

Yeah, I should have remembered that one...

So is the 101 KB limit a cache limit only? How large a page will Google crawl? I know I have a 450 KB page that was a list of cave names in alphabetical order, and it was clipped right at 101 KB in the SERPs and cache. I later chopped it into 6 smaller pages so that the entire list would appear in the SERPs (which worked fine).

There were only links at the top of that 450 KB page, so I couldn't tell if Google had crawled right through looking for links.

ADDED: I just remembered... I saw the original 450 KB file (left it up along with the chopped-up ones) get crawled in the last few weeks. I'll have to dig into the logs and see how much the bot took.
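For anyone who wants to do the same digging, here is a rough sketch (Python, assuming a combined-format Apache access log; the log path and page path are made-up placeholders) that pulls out Googlebot's requests for one URL and shows how many bytes the server sent each time:

# Minimal sketch: list Googlebot's fetches of one URL from a
# combined-format access log. Log path and page path are placeholders.
import re

LOG_FILE = "/var/log/apache/access.log"   # hypothetical path
TARGET_PATH = "/cave-names.html"          # hypothetical page

# combined log format: host ident user [time] "request" status bytes "referer" "agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_fetches(log_file=LOG_FILE, target=TARGET_PATH):
    hits = []
    with open(log_file) as f:
        for line in f:
            m = LINE_RE.match(line)
            if not m:
                continue
            if m.group("path") == target and "Googlebot" in m.group("agent"):
                size = 0 if m.group("bytes") == "-" else int(m.group("bytes"))
                hits.append((m.group("time"), m.group("status"), size))
    return hits

if __name__ == "__main__":
    for time, status, size in googlebot_fetches():
        print("%s  status %s  %d bytes" % (time, status, size))

The bytes column only tells you what the server sent, not how much of it Google keeps, but it's a start.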

g1smd
10:57 pm on Jan 13, 2004 (gmt 0)

See also: [webmasterworld.com...]
