The ruling in the Danish Newsbooster court case about deep linking could impact all search engines and the way they collect links.
Even the most popular of search engines, Google, may be endangered. Last month, Google launched its free "Google News" service, which relies on precisely the same principle that got Newsbooster and others hauled into court.
Funny how the concept of a robots.txt file isn't mentioned. If a crawler were to ignore a robots.txt file, then fair play, litigate against the SE; but if a website isn't professional enough to use robots.txt, then shame on them.
<By sending them straight to the stories, Newsbooster was, according to the argument that prevailed in court, both violating the papers' intellectual-property rights and depriving them of the ad revenue they would receive from making people navigate through multiple pages to retrieve the stories they wanted to read.>
Yes, good point incywincy... a robots.txt (that is adhered to) would prevent the above argument entirely...
Robots.txt wouldn't work, because the papers want the story indexed so it shows in the SERPs; they just want the link to go to the homepage. It's a case of wanting the cake and getting it on your face, too.
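To make the point above concrete: robots.txt is an all-or-nothing lever. A paper could block crawling of its story pages entirely, but the protocol has no directive meaning "index this, but send visitors to the homepage." A minimal sketch (bot name and paths are made up for illustration):

```
# Hypothetical robots.txt for a news site. The only option the
# Robots Exclusion Protocol offers is to block crawling of the
# story pages outright, which also removes them from the SERPs.
User-agent: NewsboosterBot
Disallow: /stories/

# Everyone else crawls freely.
User-agent: *
Disallow:
```

So a compliant crawler would simply skip the stories, which is exactly what the papers don't want.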
I'd suspect that IF anything came of this in the country in question, it'd have to go through the same hoops in the U.S. and other countries. Then, IF it got past all of that (or looked like it was going to), Google would make an announcement stating that they now support a new meta tag called "Deep-Link-To", with a URL as the content of the tag. That way, if you want people to have to go through the BS of finding your page once they've already found it at a search engine, you can. It'd also be nice if Google made those results in the SERPs show some sort of unappealing icky brown. ;)
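For what that hypothetical tag might look like: no such tag exists, and the name and behavior here are purely the speculation above, but the markup would presumably follow the usual meta-tag pattern:

```
<!-- Hypothetical "Deep-Link-To" meta tag, as proposed above.
     Placed on a story page, it would tell the search engine to
     list the story in the SERPs but link to this URL instead. -->
<meta name="Deep-Link-To" content="http://www.example-paper.dk/">
```

That would let each site opt in per page, the same way robots meta tags already work.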