Stratton Sclavos, chief executive of VeriSign Inc., told investors in a conference call last month that the company might relaunch its "Site Finder" service as early as April.
"Site Finder was not controversial with users, 84 percent of whom said they liked it as a helpful navigation service," said Tom Galvin, VeriSign's vice president of government relations. "We continue to look at ways we can offer the service while addressing the concerns that were raised by a segment of the technical community."
It would be interesting to see exactly how they came to the conclusion that 84% of internet users like the "service" and find it useful...
I had a lot of users find me through Site Finder after misspelling my domain name. I'd assume that's how most people ended up there: mistyping a domain they were trying to reach.
When the Gigablast spider tries to download a page from a domain, it first gets the associated robots.txt file for that domain. When the domain does not exist, it ends up downloading a robots.txt file from VeriSign instead. There are two major problems with this. The first is that VeriSign's servers may be slow, which slows down Gigablast's indexing. The second, and this has been happening for a while now, is that Gigablast will still index any incoming link text for that domain, thinking the domain exists and that spider permission was merely denied by the robots.txt file.
From: [gigablast.com...] 30th Sept 2003
I don't know if this causes problems for the other SEs. It would be possible to code around, but why oh why should they have to?
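For what it's worth, a minimal sketch of that kind of workaround, assuming a crawler written in Python: before trusting a robots.txt fetch, resolve the host and check whether it points at the address Site Finder was reported to use (64.94.110.11). The function name and the hardcoded address are illustrative, not anything Gigablast actually ran.

import socket

# Address the Site Finder wildcard A record was reported to point at
# while the service was live; hardcoding it here is an assumption.
SITE_FINDER_IP = "64.94.110.11"

def domain_really_exists(domain):
    """Return False when a lookup 'succeeds' only because it hit
    VeriSign's wildcard rather than a real host."""
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        return False  # genuine NXDOMAIN (or resolver failure)
    return ip != SITE_FINDER_IP

# A spider could gate both the robots.txt fetch and the link-text
# indexing on this check, e.g.:
# if not domain_really_exists(domain):
#     drop_from_index(domain)  # hypothetical helper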
No sympathy, the method of blocking spam that you're referring to hasn't worked in years. :)
but why oh why should they have to
(said with respect for the gigablast folks)
There's already a list of TLDs that are doing DNS wildcarding for non-existent domains. Are they not coding around this already?
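They may well be. One generic way to do it, sketched here under the assumption that you probe each TLD rather than hardcode addresses: resolve a random label that is almost certainly unregistered, and if it resolves, treat whatever address comes back as that TLD's wildcard target. Function names are made up for illustration.

import random
import socket
import string

def tld_wildcard_ip(tld):
    """Resolve a random 20-character label under the TLD; if it
    resolves, the TLD wildcards and this is the wildcard target."""
    label = "".join(random.choices(string.ascii_lowercase, k=20))
    try:
        return socket.gethostbyname(label + "." + tld)
    except socket.gaierror:
        return None  # nonsense names get NXDOMAIN: no wildcard

def is_wildcarded(domain):
    """True if the domain resolves to the same place as a nonsense
    name under its TLD, i.e. it only 'exists' via the wildcard."""
    wildcard = tld_wildcard_ip(domain.rsplit(".", 1)[-1])
    if wildcard is None:
        return False
    try:
        return socket.gethostbyname(domain) == wildcard
    except socket.gaierror:
        return False

A spider would presumably cache the per-TLD probe result rather than re-resolving a random name on every fetch.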
I say bring it on as well. We saw a bit of traffic from it in the short period it was up.