something wrong with slurp?

helenp

9:18 pm on Aug 12, 2004 (gmt 0)

I don't understand what this means...
Doesn't Slurp read the whole page?

This is from my stats this month:
Alexa: 305 pages, 16 robots.txt, 5.00 MB
Google: 121 pages, 16 robots.txt, 2.14 MB
Slurp: 102 pages, 62 robots.txt, 471.89 KB
Jeeves: 73 pages, 17 robots.txt, 908.42 KB

Why did Slurp use only 471.89 KB of bandwidth?
Even Jeeves used more bandwidth with fewer pages spidered.

diamondgrl

11:11 pm on Aug 12, 2004 (gmt 0)

Are your pages all the same size? Surely some difference in page size could cause this; for example, a PDF can easily skew the results.

helenp

11:40 pm on Aug 12, 2004 (gmt 0)

They're all HTML files... some are bigger than others, but the site is only about 280 pages, and the difference is huge...

That would only make sense if Yahoo spidered just the smallest pages,
and it seems to spider all kinds of pages...

This is the result for last month (pages + robots.txt hits, then bandwidth):

Googlebot (Google): 542+53, 10.01 MB
Inktomi Slurp: 188+66, 848.76 KB
Jeeves: 162+30, 1.59 MB
Alexa (IA Archiver): 57+34, 1.07 MB

Same pattern here...
Slurp just uses far less bandwidth...

Wait, could it be that all the other spiders read the attached .css files but Slurp doesn't?
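
One way to check that against raw server logs: a minimal sketch in Python, assuming an Apache "combined" format access log at a hypothetical path (access.log) and matching the bots by user-agent substrings; adjust both for your own setup. It tallies which file extensions each spider actually requests, so you can see whether Slurp skips the .css files the others fetch.

import re
from collections import Counter, defaultdict

LOGFILE = "access.log"  # hypothetical path; point this at your own log

# user-agent substrings for the four bots in the stats above (assumed)
BOTS = ("Googlebot", "Slurp", "Jeeves", "ia_archiver")

# Apache "combined" format: ... "GET /path HTTP/1.0" status bytes "ref" "agent"
LINE = re.compile(r'"[A-Z]+ (\S+)[^"]*" \d{3} \S+ "[^"]*" "([^"]*)"')

ext_counts = defaultdict(Counter)
for line in open(LOGFILE):
    m = LINE.search(line)
    if not m:
        continue
    path, agent = m.groups()
    for bot in BOTS:
        if bot in agent:
            # crude extension guess: text after the last dot in the path
            ext = path.rsplit(".", 1)[1].lower() if "." in path else "(none)"
            ext_counts[bot][ext] += 1

for bot in BOTS:
    print(bot, dict(ext_counts[bot]))  # e.g. Slurp {'html': 98} with no 'css'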

sabai

1:45 pm on Aug 17, 2004 (gmt 0)

Google indexes images, Slurp doesn't.

helenp

3:37 pm on Aug 17, 2004 (gmt 0)

Yeah, I know,
but it's low anyway.
Alexa and Looksmart don't spider images either.

Alexa (IA Archiver): 309+19, 5.06 MB (17 Aug 2004 - 03:38)
Googlebot (Google): 248+23, 4.47 MB (17 Aug 2004 - 10:12)
Inktomi Slurp: 134+82, 639.30 KB (17 Aug 2004 - 09:02)
WISENutbot (Looksmart): 210+3, 2.58 MB

Dayo_UK

3:40 pm on Aug 17, 2004 (gmt 0)



Google indexes images, Slurp doesn't.

Google also indexes pages, and Slurp doesn't. ;)

It could be that Slurp is fetching 404s. Slurp has a tendency to make up directory and file names, which naturally results in 404s and only a few KB transferred.
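
To test that theory against your own logs: a minimal sketch, again assuming an Apache "combined" format access log at a hypothetical path, that counts Slurp's response codes and the bytes spent on its 404s. If a big share of its hits are 404s, that would explain the low byte totals.

import re
from collections import Counter

LOGFILE = "access.log"  # hypothetical path; use your own server's log

# Apache "combined" format: ... "GET /path HTTP/1.0" status bytes "ref" "agent"
LINE = re.compile(r'" (\d{3}) (\S+) "[^"]*" "([^"]*)"')

statuses = Counter()
bytes_404 = 0
for line in open(LOGFILE):
    m = LINE.search(line)
    if not m or "Slurp" not in m.group(3):
        continue
    status, size, agent = m.groups()
    statuses[status] += 1
    if status == "404" and size != "-":
        bytes_404 += int(size)

print(statuses)                # e.g. Counter({'200': 102, '404': 38})
print(bytes_404, "bytes sent on Slurp's 404 responses")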

sabai

12:44 pm on Aug 18, 2004 (gmt 0)

Slurp says that it accepts compressed encodings in its request header:

Accept-Encoding: gzip, x-gzip

If you've got mod_gzip or similar installed, then maybe your web server is compressing the files as they are transferred.
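
A quick way to test this, sketched in Python (the URL is a placeholder, and this assumes the server only compresses when the client asks for it): fetch the same page with and without Slurp's Accept-Encoding header and compare the bytes actually transferred.

import urllib.request

URL = "http://www.example.com/"  # placeholder; use a big page from your site

def transferred_bytes(send_gzip_header):
    req = urllib.request.Request(URL)
    if send_gzip_header:
        # the same header Slurp sends
        req.add_header("Accept-Encoding", "gzip, x-gzip")
    with urllib.request.urlopen(req) as resp:
        body = resp.read()  # raw bytes on the wire; urllib does not decompress
        return len(body), resp.headers.get("Content-Encoding", "none")

plain, _ = transferred_bytes(False)
gzipped, encoding = transferred_bytes(True)
print("without header:", plain, "bytes")
print("with header:   ", gzipped, "bytes, Content-Encoding:", encoding)
# If the second number is much smaller and Content-Encoding says gzip,
# the server compresses for bots that ask, which would explain Slurp's
# low byte totals in the stats above.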