The newspapers argued that using robots.txt is like fighting a hopeless war game. If they start using robots.txt files, the robots will just work around them, so the newspapers would have to use stronger measures to keep them out, and the robots would fight back even harder. A technological war.
We all know that is not the truth. Robots.txt is – even though it's not an official standard – widely respected by all serious players on the Internet, including all the major search engines and NewsBooster.
The problem is that the lawyer NewsBooster had was not as sharp as the ones the newspapers had (with all the money they spent), so she never managed to prove that the argument was wrong. So the judge could do nothing but accept that robots.txt is not an acceptable solution – just the start of a technological nightmare that there would be no legal grounds to ask the newspapers to enter.
There were a lot of other stupid things said, and a few outright lies that were never disproven. If a lie stands unrefuted, all the judge can do is accept it.
Trust me, I was almost SCREAMING in the court room when I saw the newspapers get away with this several times during the day.
Another solution is to place a cookie on every machine when the home page is hit. That cookie can be checked on every hit to the site. If the cookie isn't there or up to date, the server feeds down an error page and does a 302 redirect to the home page. This effectively kills all deep links to the site. It's simple to implement. I could do it to my sites in maybe an hour of programming.
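The scheme above can be sketched in a few lines. This is a minimal, framework-free illustration, not the poster's actual code; the function names, cookie name, and one-hour lifetime are all assumptions for the sake of the example:

```python
# Sketch of the cookie-gated deep-link blocker described above.
# Names (ENTRY_COOKIE, home_page_response, deep_link_response) and the
# one-hour TTL are hypothetical choices, not from the original post.
import time

ENTRY_COOKIE = "entered_via_home"  # hypothetical cookie name
COOKIE_TTL = 3600                  # seconds a cookie counts as "up to date"

def home_page_response(now=None):
    """The home page sets (or refreshes) a timestamped entry cookie."""
    now = now if now is not None else time.time()
    return {"status": 200, "set_cookie": {ENTRY_COOKIE: str(int(now))}}

def deep_link_response(cookies, now=None):
    """Deep hits without a fresh cookie get a 302 back to the home page."""
    now = now if now is not None else time.time()
    stamp = cookies.get(ENTRY_COOKIE)
    if stamp is None or now - int(stamp) > COOKIE_TTL:
        # Missing or stale cookie: kill the deep link.
        return {"status": 302, "location": "/"}
    return {"status": 200, "body": "article content"}
```

A visitor arriving via a deep link is bounced to the home page, which sets the cookie; subsequent hits within the TTL go through. In a real deployment the same check would sit in the web server or application layer on every request.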
I'm getting ticked off by companies suing over something that could be fixed easily with a little work.
For one thing, even though mega-media is increasingly centralized, and a threat to our access to information because they spew propaganda more often than not, the search engines are also super-centralized, and some have very little respect for privacy. Engines such as Google collect everything they can grab, and never erase their data. Google knows much more about me than the New York Times does (even though I once had my picture in the NYT).
The entire robots.txt situation is completely backwards. It ought to be an opt-in situation instead of an opt-out situation. The Google cache should definitely be an opt-in situation.
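For context, the opt-out model works like this: a site is crawlable by default unless its robots.txt says otherwise. A file along these lines (the crawler name is purely illustrative) is all it takes to refuse a particular bot while leaving everything else open:

```
# Refuse one named crawler site-wide (crawler name is illustrative)
User-agent: NewsBot
Disallow: /

# Everyone else remains allowed by default -- the opt-out model
User-agent: *
Disallow:
```

An opt-in model would invert this: nothing crawled or cached unless the site explicitly permits it.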
I feel that the search engines have made an entire range of assumptions about what they can get away with, so that court decisions like this are bound to start cropping up in the U.S. There's a big Ninth Circuit case pending (the oral arguments have already been heard, and a decision is due soon) about sites that take content such as newspaper articles from newspaper sites, and reformat it and put it on their weblog for local access. The Free Republic will lose this case, I predict. This decision from the Ninth Circuit could have an immediate effect on Google's caching policy, as well as on a number of liberal sites that do the same thing.
The recent decision from the Ninth Circuit about framing images was another nail in the coffin of indiscriminate caching. While "lifting" content is admittedly more serious than "deep linking" (for one thing, the owner of copy that was lifted loses control over any technical remedies that may address his objections), some of the arguments remain the same (appropriation of copyrighted content for one's own profit on a massive scale, sidestepping the owner's ad click-throughs, etc.).
The arrogance of the search engines is to blame as much as the arrogance of the media giants.
There are more than a few ways to look at the situation. The two extremes are:
1. All copyrighted content is copyrighted. End of story. Any third party, big or small, national or international, should ask for the rights to use this information, whether it is for profit or not. Search engines included.
2. By publishing content on the internet, a new 'open' medium, the author gives third parties the right to use this information as long as copyright is respected and the source is clearly indicated.
One can make a few comments here.
1. It is clear that the second statement is under very heavy fire from a growing number of corporations, institutions, and governments. The fact that search engines use this second approach when it suits them, and the first when it protects them, is clearly a big threat to the future of search engines if they continue to work as they do now. We will see a wave of similar trials in the very near future. The biggest ones when the time is ripe.
2. The problem is indeed extremely complex, and will become even more so when the internet becomes the nervous system of our enterprises, governments, mobile phones, kitchens, cars, and so on. This is just the beginning. Just an example: what would you think of a 'search engine' crawling the internet-networked GPS system of your car or mobile phone, letting everybody who searches for you know exactly where you are driving your car or using your GSM phone right now? Minute by minute. Just one example of many. :)
That would cut off all traffic from those search engines. People won't be viewing a lot of ads that way either. It seems that the robots.txt would be an acceptable solution, except that then the newspapers would not gain the benefit of all those content pages. It really sounds like the newspapers want to have it all their own way.
In the long run the users will win, I am sure. Just like they did many times before, when new useful technology has been introduced.
The music business was using the exact same arguments about the radio when that was introduced as the newspapers do now: Unless we stop this unrestricted and free use of our copyrighted material, we will not be able to stay in business. The fact today is that most record companies would kill to get their latest release on A-Rotation on the major radio stations. The more time the song is played for free – the more copies they sell. They know that today.
I am worried by the ruling, but not overly so. I think that in the end the policy pursued by the newspapers in this case will lead to their isolation from the new media mainstream.
I can see that organisations are not going to want to link to a home page where they do not know what the content will be from day to day so they will not link to the sites at all, which will starve the sites of traffic. Starved of traffic they will also be starved of advertising revenue and will dwindle.
I haven't read it yet in English (just the Danish one) but the translation is done by an agency that translates legal stuff, so it should be ok.
The really interesting part (you can skip the other parts if you like) is the last part of Section 4. This is where all the crazy arguments are to be found! :)
I would like to comment on a few things:
1. It is argued that NewsBooster is publishing the articles. The fact is that all they do is link to them. NewsBooster does not even show a META description or an extract from the web pages (as almost any other SE would). But for now, the ruling actually states that a link to a web page is the same as publishing the copyrighted material. We don't want that conclusion to spread around the international legal systems, right! :)
2. It is interesting to note the statement from Lasse Bolander (a director in a company that owns a few of the newspapers) where he claims that the more visitors they get directly to relevant pages within their sites, the more money they lose. He actually testified in court that every new visitor to B.T. (one of the newspapers) directly leads to higher losses, so he does not want anyone to market their website. What a great business model! LOL :)
3. I love the description of what NewsBooster's activity is (in legal language):
“Consequently, Newsbooster’s search engine - and therefore not the users - needs to crawl the websites of the Internet media frequently for the purpose of registering headlines and establishing deep links in accordance with the search criteria defined by the users.”
"As a result, Newsbooster repeatedly and systematically reproduces and publishes the Principals’ headlines and articles."
The funny thing is that NewsBooster only crawls every 30-60 minutes, which is slower than e.g. FAST. So according to the interpretation of the law in this case, FAST must be violating it even more – being more “repeatedly” and probably also “systematically”.
4. A very important part of the case has been the fact that NewsBooster is a commercial business trying to make money from the search service it runs. That's almost the worst thing you can do in Denmark: trying to make money. That in itself is almost a criminal activity around here :)
"Newsbooster has a commercial interest in this business."
What search engine on the web today is not trying to make a profit?
Just got a copy of the officially translated ruling.
Just a small comment regarding the ruling. The translation says Bailiff’s Court. I believe this is the official translation, but if I'm not wrong another related English term is The Court of Enforcement.
The ruling is an injunction, but if I've got the "small print" correct a ruling from the Bailiff's Court in Denmark is an interim measure. The plaintiff must follow up with a proper lawsuit within two weeks (unless, I guess, the case is settled out of court). (Disclaimer: IANAL).
In other words, the press coverage seems to be somewhat larger than life. But then, it's the mass media silly season, isn't it? (In Danish the silly season is called the cucumber season.)
The common practice, however, is that a ruling from the Bailiff’s Court is confirmed by the "real" court - in either the form of a "justification case" (not sure if that makes any sense in English!) or a full legal case.
The "game" now is to make sure that the mistakes that were made, and the wrong things that were said, in the Bailiff’s Court are not repeated in the next steps of the case.
That's one of the reasons the coverage and discussions around the world are so important. It is important to get people to understand what it is the newspapers want (not saying that they will get it their way in the end). What they want is not the kind of Internet I want and I am ready to argue for what I believe is right :)
Thanks for the first hand report from the court room though Mikkel.
I totally agree that was a very sad day in Danish Internet history.
From the very beginning of this case, I have felt that the rest of the world must think we are complete morons in Denmark. It's unbelievable that this case tries to force away the very nature (linking) of the web.
The statement from the newspaper B.T. basically says it all - they don't want people at their online site because they lose money. Gimme a break!
Here you have a site that literally sends hundreds of users to your articles every day, and you don't want that? I bet the advertisers on the newspapers' sites love it!
Well, I could go on but it would probably just be one large rant.
There's something rotten in the state of Denmark!
...and it's not the famous pastry.
<-- Going back to beach, which is one of the nice Danish things left (yet)
While the current debate may revolve around issues of "using others' content", the real overriding debate is over the ability of the Web to allow anybody to publish and distribute information - a right previously "owned" by the traditional print and broadcast publishers on the basis of high entry costs and horizontal monopolies. That is why the AOL/Time Warner merger was so significant as a strategy for the latter to maintain their power as publishers in a new medium.
The publishing industry first of all pooh-poohed and failed to see the threat the Internet posed to their industrial oligarchy (a change in the way information is provided that can potentially cut out the middleman - the middleman being the whole raison d'être of publishing).
After a while they did start to see it - their response was to jump onto the Web and try to monopolise it. When that started to be more difficult than expected (though not impossible in the long run), they started to want to change the web and the medium itself using their old legal partners: to change it from a protocol based on the humble hyperlink to one which can be controlled, a mere repository of different organizations' paid-for "spaces". That is a world they can control - it's their familiar sandpit. The sharing, hyperlinked basis of the Web, however, is one they cannot control.
To me part of this is positive. The Web has not yet succumbed to the power of the current information elite.
It seems to be that the key legal point is ".. a violation of the Danish Copyright Act and the Danish Marketing Act, which forbids profiting by use of other companies' products and/or services..."
That does seem a curious law, however. I'm thinking of what changes it would mean, off and on line, if it were law in other countries.
This case signals the dawning realisation on the part of the media industry that they NEED to make money from their websites, and they are now trying to put in place protections on that web income.
You can already see the big players - the real newspaper barons - moving into the web and digital and satellite TV in a big way.
NewsBooster is in itself probably an endangered species, as it really is only using the traditional media - which is in general horizontal in its approach to information - and applying filters to allow each user to create their own vertical news portal. In the longer term you will probably see the providers turning to specialist vertical news portals to provide their customers with the news feeds they want.
Probably doesn't help you much, as the deep-link injunction only refers to §1, and then you need to know the legal definition of "good marketing practices" in Denmark.
The funny thing is that the same paragraph is being used in another situation involving the Danish newspapers: the "Tip a Friend" function that many sites have - including the newspapers' own sites - seems to be illegal according to the same paragraph in the law.
Seeing that banning links is the metaphorical equivalent of moving your office from the high street to a disused field, the switched-on, linked-up web community (yes, us!) stands to inherit masses of ex-megacorp traffic. Hurrah.
The ruling is the latest legal decision in a two-year battle between the German newspaper Mainpost and the German search service NewsClub. Mainpost charges that NewsClub violated the law by searching through and linking directly to Mainpost content.
The interesting thing about this case is that the judge used general European law, not local country legislation:
Recently, linking directly to other websites' content has increasingly come under legal fire in Europe, but most of the cases have centered on a specific country's law. The NewsClub case is based on a law common to the entire European Union.
Well, well, you need to be careful who you link to - at least in Denmark and Germany.
What makes the German case even worse is that the decision was made on a much higher level in the legal system than the NewsBooster case - so far. I am very sure that the Danish newspapers will use the German case in the continuing fight here.
I am not a lawyer, and I suppose the judges know what they are doing (I hope so), so if they interpret the laws we have now the way they do, it's probably OK - but then the laws need to be changed.
I had the pleasure of talking with the IT-spokesman for our governing party in Denmark. He completely agrees that search and almost any kind of linking should be legal – and if it’s not now, he will do what he can to change the laws. He will meet up with the other IT-responsible politicians from the other parties right after the summer.
Maybe this is not a fight we should expect to win in court. Maybe we need to have the laws changed and in that case we better start making some good friends within the governments in our countries to make sure they understand how serious this is. All politicians in most parts of the world today want to focus on IT. They know how important the Internet has become and they want their share of the IT-future. They cannot afford not to focus on IT and the Internet.
It is our job to teach the politicians about the importance of free linking and good search.
Agreed. If they can't figure it out themselves, we have to point them in the right direction.
hystri, I agree. One link won't harm you, but what about, say, 5, 10 or maybe 100? Would that constitute a collection of links and thereby break the law?
It would be very close in the Danish interpretation of the law.