That said, I am also working with that client to put their code on a diet and reduce the file sizes by 40% or more.
The 160th link, at around 130k into the file, was spidered.
So Google got the link, even though it sat past the point where the cache cuts off. Googlebot also followed it even though it was well past the 100th link on the page.
Of the two possibilities, I would say an excessively large number of links is more likely to be the problem than file size, unless Google has drastically redesigned how it feeds links to the spider.
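If you want to see where a given link falls against those two thresholds on your own pages, here's a rough Python sketch (standard library only; the URL is a placeholder, and the offsets are decoded-character positions rather than exact bytes, so treat the numbers as approximate):

import urllib.request
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Records the ordinal number and approximate file offset of every <a href>."""
    def __init__(self, line_offsets):
        super().__init__()
        self.line_offsets = line_offsets
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                line, col = self.getpos()  # 1-based line, 0-based column
                offset = self.line_offsets[line - 1] + col
                self.links.append((len(self.links) + 1, offset, href))

url = "http://www.example.com/"  # placeholder -- substitute the page you're testing
raw = urllib.request.urlopen(url).read()
html = raw.decode("utf-8", "replace")

# Cumulative start offset of each line, so getpos() can be turned into
# an absolute position within the file.
line_offsets, total = [], 0
for line in html.splitlines(keepends=True):
    line_offsets.append(total)
    total += len(line)

audit = LinkAudit(line_offsets)
audit.feed(html)

print(f"Page size: {len(raw)} bytes")
for ordinal, offset, href in audit.links:
    flags = []
    if ordinal > 100:
        flags.append("past link #100")
    if offset > 100 * 1024:
        flags.append("past 100k")
    note = "  <-- " + ", ".join(flags) if flags else ""
    print(f"#{ordinal:4d} at ~{offset:7d}: {href[:60]}{note}")

Run it against the page in question and see which links get flagged past #100 or past the 100k mark.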
Another possibility is malformed links that browsers manage to figure out correctly but that trip up Googlebot.
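A quick way to hunt for those is to pull every href and flag anything a browser would quietly repair. This is only a heuristic sketch; page.html is a placeholder, and the list of suspect characters is my guess at what a strict crawler might stumble over:

import re
from html.parser import HTMLParser

# Characters browsers quietly tolerate in a URL but a strict crawler may not:
# whitespace, backslashes, raw quotes, angle brackets, control characters.
# (This list is an assumption -- adjust it to what your pages actually contain.)
SUSPECT = re.compile(r'[\s\\"<>]|[\x00-\x1f]')

class HrefLinter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if href and SUSPECT.search(href):
            self.problems.append((self.getpos()[0], href))

html = open("page.html", encoding="utf-8", errors="replace").read()  # placeholder file
linter = HrefLinter()
linter.feed(html)
for lineno, href in linter.problems:
    print(f"line {lineno}: suspicious href {href!r}")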