I have read in several places on this forum that Google only crawls about 100k worth of content per page. I have several large pages that are about 200k apiece, but they would be much smaller without the images. I was wondering: if I restricted Googlebot from indexing the images through robots.txt, would it then crawl more of the actual content? Also, some of the images are hosted offsite, so would Google count those images "against me" as well?
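For what it's worth, blocking Google's image crawler from a local images directory would look something like this (the `/images/` path is just an example; adjust it to wherever your site actually stores them):

```
# robots.txt — served from the root of your own domain
# Block only Google's image crawler; regular Googlebot still crawls pages.
# Note: robots.txt only applies to the host it is served from, so it
# cannot control how offsite (third-party) images are crawled.
User-agent: Googlebot-Image
Disallow: /images/
```

Keep in mind this only stops the images from being fetched/indexed from your own host; it doesn't change the size of the HTML itself.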