eh.... I thought this was just a tool one could use to check page size and stuff?
Just caught the little devil picking up robots.txt and /
What gives?
Nick
onlineleben
8:56 am on Sep 24, 2002 (gmt 0)
Don't worry. Someone checked out your homepage, and the NetMechanic spider requests robots.txt first to determine whether spidering is allowed. If you have a 'Disallow' on your image directory, NetMechanic will not spider those files. The disadvantage is that you won't get accurate download times back from NetMechanic, since the excluded images aren't included in the calculation.
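For anyone wondering what such a rule looks like: a minimal robots.txt sketch (the /images/ path is just an example directory name) that would keep a well-behaved spider like NetMechanic's out of your image files:

```
User-agent: *
Disallow: /images/
```

With that in place the spider skips everything under /images/, which is why the download-time figure it reports leaves out the weight of your images.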