| 11:27 pm on Feb 9, 2004 (gmt 0)|
It would be interesting to see exactly how they came to the conclusion that 84% of internet users like the "service" and find it useful...
| 11:41 pm on Feb 9, 2004 (gmt 0)|
|It would be interesting to see exactly how they came to the conclusion that 84% of internet users like the "service" and find it useful... |
I had a lot of users find me through sitefinder after misspelling my domain name. I'd assume most people got to sitefinder after misspelling a domain name they typed in.
| 11:49 pm on Feb 9, 2004 (gmt 0)|
They use the service to sell ads don't they?
| 4:24 am on Feb 10, 2004 (gmt 0)|
'twas overture last time.
| 9:40 am on Feb 10, 2004 (gmt 0)|
Now I'll have to be redirected to that advert-ified page again whenever I make a typo while entering a URL. It's a real pain in my case.
I'm not freak'n allowing any company to customize my surfing experience!
| 9:47 am on Feb 10, 2004 (gmt 0)|
It's not just a problem for surfers being exposed to ads and broken spam blockers; search engines are affected too. Matt Wells of Gigablast explains:
When the Gigablast spider tries to download a page from a domain, it first fetches the associated robots.txt file for that domain. When the domain does not exist, it ends up downloading a robots.txt file from VeriSign. There are two major problems with this. The first is that VeriSign's servers may be slow, which will slow down Gigablast's indexing. Secondly, and this has been happening for a while now, Gigablast will still index any incoming link text for that domain, thinking that the domain still exists but that spider permission was denied by the robots.txt file.
From: [gigablast.com...] 30th Sept 2003
I don't know whether this causes problems for the other SEs. It would be possible to code around, but why oh why should they have to?
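For what it's worth, the usual workaround could be sketched roughly like this (Python; the function names and the SiteFinder address `64.94.110.11` are my own illustrative assumptions, not anything from Gigablast): resolve a couple of random gibberish names under the TLD, and whatever address they all come back with is the registry's wildcard address. Any "real" domain whose only addresses match that set should then be treated by the crawler as non-existent.

```python
import socket
import uuid

def probe_wildcard(tld="com", probes=2):
    """Resolve a few random, almost-certainly-unregistered names under
    the TLD.  Addresses shared by all of them are wildcard addresses:
    the registry is answering for domains that don't exist."""
    seen = []
    for _ in range(probes):
        name = "%s.%s" % (uuid.uuid4().hex, tld)
        try:
            infos = socket.getaddrinfo(name, 80)
        except socket.gaierror:
            return set()  # got a normal NXDOMAIN: no wildcard in effect
        seen.append({info[4][0] for info in infos})
    return set.intersection(*seen)

def looks_wildcarded(domain_ips, wildcard_ips):
    """True if every address for a domain is a known wildcard address,
    i.e. the crawler should treat the domain as non-existent and skip
    indexing its incoming link text."""
    return bool(wildcard_ips) and set(domain_ips) <= set(wildcard_ips)
```

A crawler would run `probe_wildcard()` once per TLD and then filter every resolved domain through `looks_wildcarded()` before fetching robots.txt, which also avoids hammering the registry's wildcard server.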
| 10:10 am on Feb 10, 2004 (gmt 0)|
Bring it on. Had lots of additional free traffic from them last time...
| 5:11 pm on Feb 10, 2004 (gmt 0)|
No sympathy, the method of blocking spam that you're referring to hasn't worked in years. :)
|but why oh why should they have to |
(said with respect for the gigablast folks)
There's already a list of TLDs that do DNS wildcarding for non-existent domains. Are they not coding around this already?
I say bring it on as well. We saw a bit of traffic from it in the short period it was up.
| 1:31 pm on Feb 11, 2004 (gmt 0)|
I created an online petition (can't seem to post URL -- search Google for it!) the last time they pulled this stunt, and if you read the 18,000+ comments, it's clear VeriSign is not credible when they say 84% support SiteFinder.