chewy - 10:37 am on Jun 30, 2012 (gmt 0)
Maybe there will be a new form of xml file - like "linkmap.xml" where you list good and bad links and hook it up to webmaster tools.
It would have to be secure at some level (probably the "VERY" secure level) - security by obscurity wouldn't be good enough. Or better yet, invent a new form of htaccess file that lists bad links and good links.
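Just to sketch the idea - this is entirely hypothetical, the file name, elements, and attributes are all invented here, nothing like it exists in Webmaster Tools today:

```xml
<!-- Hypothetical "linkmap.xml" - purely a sketch, not a real format.
     The idea: a site owner declares which inbound links they vouch for
     and which they disown, and search engines could consume it. -->
<linkmap>
  <good>
    <!-- links the webmaster vouches for -->
    <link href="http://example.com/honest-review.html"/>
  </good>
  <bad>
    <!-- links the webmaster disowns -->
    <link href="http://example.org/spammy-directory/page123"/>
  </bad>
</linkmap>
```

Presumably it would sit at the site root like sitemap.xml and get verified through the existing Webmaster Tools ownership check.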
Would it be all that bad if this file was semi-public in that webmasters in the know could look at the file and learn good / bad links, and copy it for their own purposes?
There are a couple of ways to do this - one might be to "open source" it so everyone who mattered could learn, copy, and reuse it. The other would be to make it super private.
Either way, a whole new series of businesses would be born.
Is this (shouldn't this be) being discussed at an IETF (Internet Engineering Task Force) or IEEE level, by any chance?
>Click here for the newest list of bad links that you should be blocking! :-)