They would probably have to double up on security and encryption, which most likely means bandwidth, processing time and everything else would take a little longer - just to make sure those running the algo aren't reverse-engineering it in the process ;)
Besides, the Canadian government just launched a distributed computing project, and they did NOT use private computers as part of the grid for this exact reason.
It's not simple to check the results. If you were to do something like that, you would have to run the same batch twice (at least) on different, randomly selected computers.
And if the results are not the same, you have to run it a third time to see which of the original two is correct.
That's at least double, and up to triple, the processing for one unit.
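A rough sketch of that double-run-and-tie-break scheme, just to make the idea concrete (the node list and the run_on worker function are hypothetical stand-ins, not anything from a real project):

import random

def process_with_redundancy(batch, nodes, run_on):
    # Run the same batch on two randomly selected nodes.
    first, second = random.sample(nodes, 2)
    a = run_on(first, batch)
    b = run_on(second, batch)
    if a == b:
        return a  # the two nodes agree; accept the result
    # Results differ: tie-break on a third node that wasn't involved.
    referee = random.choice([n for n in nodes if n not in (first, second)])
    c = run_on(referee, batch)
    if c == a:
        return a  # the referee confirms the first result
    if c == b:
        return b  # the referee confirms the second result
    raise RuntimeError("no two nodes agree; batch must be rescheduled")

So the common case costs two runs, and only a disagreement costs three.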
Basically, we are thinking of a well-regulated grid/constellation of dedicated computers run by independent companies, ISPs, registrars, or the like - no SETI-style volunteers.
The revenue, if any, may come from advertising, but the truth is that freeing our customers from Google, plus the self-promotion, would be more than enough for many of us. Not to mention no longer paying those costly licences some pay to Fast/Google to enable portal user searches.
For an ISP like ourselves, hosting a few boxes would not be a significant cost at all, and there are thousands of companies like ours. ISPs usually host hundreds of boxes, some of them thousands, so the general infrastructure and bandwidth cost of such a thing is not really an issue. Many ISPs already commit resources to mirroring Linux distros, MySQL downloads, and Tucows-like networks. This would be no different.
Does anybody want to play?
That should not be an issue, unless you let anonymous volunteers and webmasters run the show, of course.
If you use well-established companies, say ISPs, that is not likely to happen. An ISP can easily be held accountable if it falsifies search records or queries.
Besides, if it wanted to divert traffic, a rogue ISP could just tamper with the DNS boxes it controls, just as a rogue registrar could tamper with the domains it registers. But they don't do it, because they would immediately become a legal target.
An Aspseek Grid Network, managed like the DNS hierarchy, would be more or less the same scenario.
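To sketch how that accountability could work technically - and this is purely an assumption on my part, not part of the Aspseek proposal - each participating ISP could sign the result records it returns, so any falsified batch is traceable back to the node that produced it. The record format and key handling here are made up for illustration:

import hashlib
import hmac
import json

def sign_record(record: dict, isp_id: str, secret: bytes) -> dict:
    # Attach the ISP's id and an HMAC over the record, so a tampered
    # or falsified result can be attributed to its producer.
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return {"record": record, "isp": isp_id, "sig": tag}

def verify_record(signed: dict, secret: bytes) -> bool:
    # Recompute the HMAC and compare in constant time.
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["sig"])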
Would those people you mentioned really give you a representative sample of the different subject areas of the web?
There were some other distributed computing search projects going on about a year ago, one by a university, but I do not know if they ever went anywhere.
Why not? It all depends on the crawler's algo. If the foundations of the project and the software used match that goal, they will. If they don't want to, they will just not join the project.
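For instance - and this is just one possible way the crawler's algo could aim for representative coverage, not anything the project has specified - the URL space could be partitioned deterministically across the grid nodes, so every host is some node's responsibility and no subject area gets skipped just because nobody volunteered for it. The node names are hypothetical:

import hashlib
from urllib.parse import urlparse

def node_for_url(url: str, nodes: list) -> str:
    # Hash the hostname so every URL on the same host always maps
    # to the same grid node, covering the whole URL space without
    # central coordination.
    host = urlparse(url).hostname or ""
    digest = hashlib.sha256(host.encode()).digest()
    return nodes[int.from_bytes(digest[:4], "big") % len(nodes)]

nodes = ["isp-a.example", "isp-b.example", "isp-c.example"]
print(node_for_url("http://www.webmasterworld.com/forum3/", nodes))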