Stratton Sclavos, chief executive of VeriSign Inc., told investors in a conference call last month that the company might relaunch its "Site Finder" service as early as April.
"Site Finder was not controversial with users, 84 percent of whom said they liked it as a helpful navigation service," said Tom Galvin, VeriSign's vice president of government relations. "We continue to look at ways we can offer the service while addressing the concerns that were raised by a segment of the technical community."
The problem isn't limited to surfers being exposed to ads and broken spam blockers; search engines are affected too. Matt Wells of Gigablast explains:
When the Gigablast spider tries to download a page from a domain, it first fetches the associated robots.txt file for that domain. When the domain does not exist, it ends up downloading a robots.txt file from VeriSign instead. There are two major problems with this. First, VeriSign's servers may be slow, which slows down Gigablast's indexing. Second, and this has been happening for a while now, Gigablast will still index any incoming link text for that domain, thinking the domain exists but that spider permission was denied by its robots.txt file.
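One way a crawler can defend itself against this behavior is to probe the TLD with a name that cannot possibly be registered: if that name resolves anyway, the registry is wildcarding NXDOMAIN responses (as Site Finder did for .com and .net), and any robots.txt fetched from a "nonexistent" domain should be distrusted. Here is a minimal sketch of that check; the function name and the injectable `resolve` parameter are illustrative, not part of any actual crawler's code:

```python
import socket
import uuid

def has_wildcard_dns(tld, resolve=socket.gethostbyname):
    """Return True if `tld` appears to wildcard nonexistent names.

    Generates a random label that is effectively guaranteed not to be
    registered. A strict resolver raises socket.gaierror (NXDOMAIN);
    a wildcarding registry resolves it to a catch-all address instead.
    """
    probe = "nx-%s.%s" % (uuid.uuid4().hex, tld)
    try:
        resolve(probe)   # a truly nonexistent name should fail to resolve
        return True      # it resolved anyway: the TLD is wildcarding
    except socket.gaierror:
        return False     # normal NXDOMAIN behavior
```

A crawler could run this probe once per TLD and, when it returns True, treat resolution of an otherwise-unknown domain (and any robots.txt served from it) as a non-answer rather than evidence the domain exists.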
I created an online petition (can't seem to post the URL -- search Google for it!) the last time they pulled this stunt, and if you read the 18,000+ comments, it's clear VeriSign is not credible when it claims 84 percent of users support Site Finder.