DumpedbyG - 11:09 am on Mar 18, 2005 (gmt 0) I quote from their website: Site Information: Traffic rankings, pictures of sites, links pointing to sites and more
incrediBILL, you are wrong again!
"Alexa is continually crawling all publicly available web sites to create a series of snapshots of the Web. We use the data we collect to create features and services:
Related Links: Sites that are similar to the one you are currently viewing
Alexa has been crawling the Web since early 1996, and we have continually increased the amount of information that we gather. We are currently gathering approximately 1.6 Terabytes (1600 gigabytes) of Web content per day. After each snapshot of the Web, which takes approximately two months to complete, Alexa has gathered 4.5 Billion pages from over 16 million sites."