Forum Moderators: open
This is from my stats this month:
Alexa spidered 305 pages, 16 robots.txt, used 5.00 MB
Google spidered 121 pages, 16 robots.txt, used 2.14 MB
Slurp spidered 102 pages, 62 robots.txt, used 471.89 KB
Jeeves spidered 73 pages, 17 robots.txt, used 908.42 KB
Why did Slurp use only 471.89 KB of bandwidth?
Even Jeeves used more bandwidth with fewer pages spidered.
That would make sense if Yahoo only spidered the smallest pages,
but it seems to spider all kinds of pages...
These are the results for last month:
Googlebot (Google) 542+53 10.01 MB
Inktomi Slurp 188+66 848.76 KB
Jeeves 162+30 1.59 MB
Alexa (IA Archiver) 57+34 1.07 MB
Same here.............
Slurp just uses less bandwidth....
Wait a moment, could it be that all the spiders read the attached .css files but Slurp doesn't?
Google indexes images, Slurp doesn't
Google also indexes pages and slurp doesn't ;)
It could be that Slurp is fetching 404s. Slurp has a tendency to sometimes make up directories and files, which naturally results in a 404 and only a few KB transferred.
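You can check the 404 theory against your own access logs by summing the bytes and counting the 404s per spider. Here's a rough sketch that assumes the Apache "combined" log format and matches spiders by substrings of the User-Agent header (the `SPIDERS` markers are assumptions; adjust them to whatever user-agent strings actually appear in your logs):

```python
import re
from collections import defaultdict

# Regex for the Apache "combined" log format (an assumption; adapt to your server).
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "[^"]*" "(?P<agent>[^"]*)"'
)

# Substrings identifying each spider's User-Agent (assumed identifiers).
SPIDERS = {
    "Googlebot": "Googlebot",
    "Slurp": "Slurp",
    "Jeeves": "Jeeves",
    "Alexa": "ia_archiver",
}

def spider_stats(lines):
    """Sum transferred bytes and count 404s per spider from access-log lines."""
    stats = defaultdict(lambda: {"hits": 0, "bytes": 0, "404s": 0})
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip lines that don't match the expected format
        agent = m.group("agent")
        for name, marker in SPIDERS.items():
            if marker in agent:
                s = stats[name]
                s["hits"] += 1
                # "-" means no body was sent (e.g. some 304/404 responses)
                s["bytes"] += 0 if m.group("bytes") == "-" else int(m.group("bytes"))
                s["404s"] += m.group("status") == "404"
    return dict(stats)
```

If Slurp's 404 count is high relative to its hits, that would explain the low bandwidth: each made-up URL costs only a tiny error page instead of a full HTML document.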