
Forum Moderators: martinibuster


Ruling Handed Down in Danish Deep Linking Case



3:12 pm on Jul 5, 2002 (gmt 0)

10+ Year Member

"Newsbooster.com barred from linking to Danish newspaper Web sites"
AP Story [story.news.yahoo.com]

Comments from Newsbooster (the site involved in the lawsuit)
[newsbooster.com]

Background Here
[wired.com]

Mikkel Svendsen

8:11 pm on Jul 5, 2002 (gmt 0)

10+ Year Member

> what did the judge say about the robots.txt issue ?
That part is interesting!

The newspapers argued that using robots.txt is like fighting a hopeless war game: if they start using robots.txt files, then the robots will just get around them, so the newspapers would have to use stronger measures to keep them out, and the robots would fight back even harder. A technological war.

We all know that is not the truth. Robots.txt is – even though it’s not an official standard – widely respected by all serious players on the Internet, including all the search engines, and NewsBooster.
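For reference, the opt-out the newspapers dismissed is just a few lines of plain text at the site root. A sketch (the NewsBooster user-agent token below is my guess, not their documented string):

```
# robots.txt at e.g. https://www.example-newspaper.dk/robots.txt

# Keep all well-behaved crawlers out of the article archive
User-agent: *
Disallow: /arkiv/

# Block one specific crawler from the whole site
User-agent: NewsBooster
Disallow: /
```

Any crawler that honors the robots exclusion protocol – which, as noted above, all the serious players do – fetches this file first and skips the disallowed paths.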

The problem is that the lawyer NewsBooster had was not as sharp as the ones the newspapers had (with the more money they spent), so she never managed to prove that the argument was wrong. So the judge could do nothing but accept that robots.txt is not an acceptable solution – just the start of a technological nightmare that there would be no legal reason to ask the newspapers to enter.

There were a lot of other stupid things said, and a few outright lies that were never disproven. If a lie stands undisputed, all the judge can do is accept it.

Trust me, I was almost SCREAMING in the court room when I saw the newspapers get away with this several times during the day.


10:39 pm on Jul 5, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

This is a serious problem: going to court over something that could be solved with technology. Robots.txt is one solution.

Another solution is to set a cookie on every machine when the home page is hit, and check that cookie on every hit to the site. If the cookie isn't there or up to date, serve an error page and do a 302 redirect to the home page. This effectively kills all deep links to the site. It's simple to implement; I could do it on my sites in maybe an hour of programming.
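A minimal sketch of that cookie gate as a routing decision (the cookie name and function names here are my own illustration, not from the post):

```python
from http.cookies import SimpleCookie

HOME = "/"
COOKIE_NAME = "visited_home"  # planted when the visitor arrives via the home page


def route_request(path: str, cookie_header: str = ""):
    """Decide how to answer one request under the cookie-gate scheme.

    Returns (status, location_or_path, set_cookie), where set_cookie is the
    Set-Cookie header value to send, or None.
    """
    cookies = SimpleCookie(cookie_header)

    if path == HOME:
        # Hitting the home page always succeeds and plants the cookie.
        return (200, HOME, f"{COOKIE_NAME}=1; Path=/")

    if COOKIE_NAME in cookies:
        # Cookie present: the visitor came through the front door,
        # so serve the deep page normally.
        return (200, path, None)

    # No cookie: refuse the deep link and bounce to the home page
    # (the 302 status would carry a Location: / header).
    return (302, HOME, None)
```

In a real server the 302 branch would emit a `Location: /` header; the same check works as middleware in any framework. It only defeats casual deep linking, of course: a crawler that accepts cookies gets straight through.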

I'm getting ticked by companies suing over something that could be fixed easily with a little work.


11:32 pm on Jul 5, 2002 (gmt 0)

This is a war -- between old tech and new tech. The legal systems of the world are sensitive to political power. It's quite possible that the search engines will be the losers in the end. I'm not completely convinced that this would be bad, even though I have zero sympathy for mega-media.

For one thing, even though mega-media is increasingly centralized, and a threat to our access to information because they spew propaganda more often than not, the search engines are also super-centralized, and some have very little respect for privacy. Engines such as Google collect everything they can grab, and never erase their data. Google knows much more about me than the New York Times does (even though I once had my picture in the NYT).

The entire robots.txt situation is completely backwards. It ought to be an opt-in situation instead of an opt-out situation. The Google cache should definitely be an opt-in situation.

I feel that the search engines have made an entire range of assumptions about what they can get away with, so that court decisions like this are bound to start cropping up in the U.S. There's a big Ninth Circuit case pending (the oral arguments have already been heard, and a decision is due soon) about sites that take content such as newspaper articles from newspaper sites, and reformat it and put it on their weblog for local access. The Free Republic will lose this case, I predict. This decision from the Ninth Circuit could have an immediate effect on Google's caching policy, as well as on a number of liberal sites that do the same thing.

The recent decision from the Ninth Circuit about framing images was another nail in the coffin of indiscriminate caching. While "lifting" content is admittedly more serious than "deep linking" (for one thing, the owner of copy that was lifted loses control over any technical remedies that may address his objections), some of the arguments remain the same (appropriation of copyrighted content for one's own profit on a massive scale, sidestepping the owner's ad click-throughs, etc.).

The arrogance of the search engines is to blame as much as the arrogance of the media giants.


12:07 am on Jul 6, 2002 (gmt 0)

10+ Year Member

I have followed closely more than a few cases similar to this one, and commented in another thread at WMW on the behavior of the Internet Godzillas asking for trouble.

There are more than a few ways to look at the situation. The two extremes are:

1. All copyrighted content is copyrighted. End of story. Any third party, big or small, national or international, should ask for the rights to use this information, whether it is for profit or not. Search engines included.

2. By publishing content on the Internet, a new 'open' medium, the author gives third parties the right to use this information, provided copyright is respected and the source is clearly indicated.

One can make a few comments here.

1. It is clear that the second statement is under very heavy fire from a growing number of corporations, institutions, and governments. The fact that search engines use this second approach when it suits them, and the first one when it protects them, is clearly a big threat to the future of search engines if they continue to work as they do now. We will see a wave of similar trials in the very near future - the biggest ones when the time is ripe.

2. The problem is indeed extremely complex, and will become even more so when the Internet becomes the nervous system of our enterprises, governments, mobile phones, kitchens, cars, and so on. This is just the beginning. Just one example: what would you think of a 'search engine' crawling the Internet-networked GPS system of your car or mobile phone, to let everybody who searches for you know exactly where you are driving your car or using your GSM right now? Minute by minute. Just one example of many. :)

Very interesting...


7:28 pm on Jul 6, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Of course, if the newspapers try suing the search engines, I am sure the newspapers' entire sites, including their index pages, would get delisted from the search engine indexes rather quickly. The engines do not have to list them.

That would cut off all traffic from those search engines, and people won't be viewing a lot of ads that way either. Robots.txt would seem an acceptable solution, except that then the newspapers would not gain the benefit of all those indexed content pages. It really sounds like the newspapers want to have it all their own way.

Mikkel Svendsen

11:18 pm on Jul 6, 2002 (gmt 0)

10+ Year Member

Yes, the key issue in this case is that the newspapers do not understand this new medium at all. They enter a ball game and demand that everyone else change the rules of the game. It’s like joining a football club and then asking the police for a restraining order (I am not sure what you call that kind of thing in English) keeping the other players from getting any closer than 100 yards to you. It might be possible to find some loophole in the law that would make it possible (remember, War of the Roses!) but it would still be a pretty stupid thing to do if you want to play the game.

In the long run the users will win, I am sure. Just like they did many times before, when new useful technology has been introduced.

The music business used the exact same arguments about radio when it was introduced as the newspapers do now: unless we stop this unrestricted and free use of our copyrighted material, we will not be able to stay in business. The fact today is that most record companies would kill to get their latest release on A-rotation at the major radio stations. The more a song is played for free – the more copies they sell. They know that today.


8:54 pm on Jul 7, 2002 (gmt 0)

WebmasterWorld Administrator ianturner is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Many thanks Mikkel for the insights you have provided into this case.

I am worried by the ruling, but not overly so. I think that in the end the policy pursued by the newspapers in this case will lead to their isolation from the new media mainstream.

I can see that organisations will not want to link to a home page when they do not know what the content will be from day to day, so they will not link to the sites at all, which will starve the sites of traffic. Starved of traffic, the sites will also be starved of advertising revenue and will dwindle.

Mikkel Svendsen

11:25 am on Jul 8, 2002 (gmt 0)

10+ Year Member

Just got a copy of the officially translated ruling.

I haven't read the English one yet (just the Danish one), but the translation was done by an agency that translates legal documents, so it should be OK.

Mikkel Svendsen

12:34 pm on Jul 8, 2002 (gmt 0)

10+ Year Member

Now I've had time to read the English translation.

The really interesting part (you can skip the other parts if you like) is the last part of Section 4. This is where all the crazy arguments are to be found! :)

I would like to comment on a few things:

1. It is argued that NewsBooster is publishing the articles. The fact is that all they do is link to them - NewsBooster does not even show a META description or an extract from the web pages (as almost any other SE would). But for now, the ruling actually states that a link to a web page is the same as publishing the copyrighted material. We don’t want that conclusion to spread through the international legal systems, right? :)

2. It is interesting to note the statement from Lasse Bolander (director of a company that owns a few of the newspapers), where he claims that the more visitors they get directly to relevant pages within their sites, the more money they lose. He actually testified in court that every new visitor to B.T. (one of the newspapers) directly leads to higher losses, so he does not want anyone to market their website. What a great business model! LOL :)

3. I love the legal description of what NewsBooster's activity is:

“Consequently, Newsbooster’s search engine - and therefore not the users - needs to crawl the websites of the Internet media frequently for the purpose of registering headlines and establishing deep links in accordance with the search criteria defined by the users.”
"As a result, Newsbooster repeatedly and systematically reproduces and publishes the Principals’ headlines and articles."

The funny thing is that NewsBooster only crawls every 30-60 minutes, which is slower than e.g. FAST. So according to the interpretation of the law in this case, FAST must be violating it even more - being more “repeatedly” and probably also more “systematically”.

4. A very important part of the case has been the fact that NewsBooster is a commercial business trying to make money from the search service they run. That is almost the worst thing you can do in Denmark: trying to make money. That in itself is almost a criminal activity around here :)

"Newsbooster has a commercial interest in this business."

What search engine on the web today is not trying to make a profit?


12:52 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Senior Member nick_w is a WebmasterWorld Top Contributor of All Time 10+ Year Member

That is almost the worst thing you can do in Denmark: trying to make money. That in itself is almost a criminal activity around here

Hahaha, he's not even joking folks, it's true! Criminal, anti-social and very much frowned upon...



1:39 pm on Jul 8, 2002 (gmt 0)

10+ Year Member

Just got a copy of the officially translated ruling

Just a small comment regarding the ruling. The translation says Bailiff’s Court. I believe this is the official translation, but if I'm not wrong, another related English term is Court of Enforcement.

The ruling is an injunction, but if I've got the "small print" correct, a ruling from the Bailiff's Court in Denmark is an interim measure. The plaintiff must follow up with a proper lawsuit within two weeks (unless, I guess, the case is settled out of court). (Disclaimer: IANAL.)

In other words, the press coverage seems to be somewhat larger than life. But then, it's the mass media silly season, isn't it? (In Danish the silly season is called the cucumber season.)

Mikkel Svendsen

1:51 pm on Jul 8, 2002 (gmt 0)

10+ Year Member

Yes, this is not the final ruling in the case - so there is still some hope :)

The common practice, however, is that a ruling from the Bailiff’s Court is confirmed by the "real" court - either in the form of a "justification case" (not sure if that makes any sense in English!) or a full legal case.

The "game" now is to make sure that the mistakes that were made, and the wrong things that were said, in the Bailiff’s Court are not repeated in the next steps of the case.

That's one of the reasons the coverage and discussions around the world are so important. It is important to get people to understand what it is the newspapers want (not saying that they will get their way in the end). What they want is not the kind of Internet I want, and I am ready to argue for what I believe is right :)


3:35 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member

*sigh* Late to the party - been on vacation for a few days. Thanks for spoiling it with this nonsense!

Thanks for the first hand report from the court room though Mikkel.
I totally agree that was a very sad day in Danish Internet history.

From the very beginning of this case, I have felt that the rest of the world must think we are complete morons in Denmark. It's unbelievable that this case tries to legislate away the very nature of the web: linking.

The statement from the newspaper B.T. basically says it all - they don't want people at their online site because they lose money. Gimme a break!
Here you have a site that literally sends hundreds of users to your articles every day, and you don't want that? I bet the advertisers on the newspapers' sites love it!

Well, I could go on but it would probably just be one large rant.

There's something rotten in the state of Denmark!
...and it's not the famous pastry.

<-- Going back to the beach, which is one of the nice Danish things left (for now)


5:34 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Senior Member chiyo is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Mikkel and others, thanks for your very astute analysis. The key point, of course, is that we are still very much in the first stages of a power game: the established publishing oligarchies vs. self-publishing via the Web.

While the current debate may revolve around issues of "using others' content", the real overriding debate is over the ability of the Web to allow anybody to publish and distribute information - a right previously "owned" by the traditional print and broadcast publishers on the basis of high entry costs and horizontal monopolies. That is why the AOL/Time Warner merger was so significant: a strategy for the latter to maintain their power as publishers in a new medium.

The publishing industry at first pooh-poohed the Internet and failed to see the threat it posed to their industrial oligarchy (a change in the way information is provided that can potentially cut out the middleman - the middleman being the whole raison d'être of publishing).

After a while they did start to see it. Their response was to jump onto the Web and try to monopolise it. When that proved more difficult than expected (though not impossible in the long run), they started trying to change the web and the medium itself, using their old legal partners: to change it from a protocol based on the humble hyperlink to one which can be controlled, a mere repository of different organizations' paid-for "spaces". That is a world they can control - it's their familiar sandpit. The sharing, hyperlinked basis of the Web, however, is one they cannot control.

To me part of this is positive. The Web has not yet succumbed to the power of the current information elite.


7:11 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

Wired has posted an interesting followup article:



7:19 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Senior Member chiyo is a WebmasterWorld Top Contributor of All Time 10+ Year Member

interesting link..

It seems to be that the key legal point is ".. a violation of the Danish Copyright Act and the Danish Marketing Act, which forbids profiting by use of other companies' products and/or services..."

That does seem a curious law, however. I'm thinking of what changes it would mean, off and on line, if it were law in other countries.


7:23 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member

The lawsuit and the judgement (as I understand it) are so out of touch with internet realities, it boggles the mind. If they want to be a serious presence on the web, they should be rejoicing with every new link to their content! Where are their marketing people, for heaven's sake????

Mikkel Svendsen

8:17 pm on Jul 8, 2002 (gmt 0)

10+ Year Member

The marketing law is - as far as I understand, not being a lawyer - a special Danish law that will probably not affect the rest of the world much. But the copyright laws are not much different from those used in the rest of the world, so if the newspapers win the final case, that will surely inspire other companies all over the world to try to do the same.


8:30 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Administrator ianturner is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I think the traditional print media is on the way out - how many people under 25 do you see buying newspapers? Personally I haven't bought a paper in months; I no longer need to, since I can get all the information I want from the web.

This case signals the dawning realisation in part of the media industry that they [big]NEED[/big] to make money from their websites, and they are now trying to put in place protections on that web income.

You can already see the big players - the real newspaper barons - moving into the web and digital and satellite TV in a big way.

NewsBooster is itself probably an endangered species, as it really is only taking the traditional media - which is in general horizontal in its approach to information - and applying filters to let each user create their own vertical news portal. In the longer term you will probably see the providers turning to specialist vertical news portals to give their customers the news feeds they want.


1:06 am on Jul 9, 2002 (gmt 0)

10+ Year Member

The marketing law is - as far as I understand, not being a lawyer - a special Danish law

Well, it doesn't look too different from the Norwegian marketing legislation. ;) I bet the Swedes got one too.


4:06 pm on Jul 9, 2002 (gmt 0)

10+ Year Member

Ah, I believe I found an English translation of the Danish Marketing Practices Act [fs.dk]

It probably doesn't help you much, as the deep-link injunction only refers to §1, and then you need to know the legal definition of "good marketing practices" in Denmark.

Mikkel Svendsen

5:58 pm on Jul 9, 2002 (gmt 0)

10+ Year Member

Yes, it is a _very_ broad paragraph - and I am not sure how many other countries have a paragraph like that.

The funny thing is that the same paragraph is now being used against the Danish newspapers themselves: the "Tip a Friend" function that many sites have - including the newspapers' own sites - seems to be illegal according to that same paragraph of the law.


6:09 pm on Jul 9, 2002 (gmt 0)

WebmasterWorld Senior Member nffc is a WebmasterWorld Top Contributor of All Time 10+ Year Member

An interview with the CEO of Newsbooster.com and our very own Mikkel, conducted by WebmasterWorld member Nick_W [webmasterworld.com]



4:10 am on Jul 11, 2002 (gmt 0)

10+ Year Member

Actually, the web community should be overjoyed. The only sites truly stupid enough to refuse deep links will be those owned by MegaCorp, Inc.

Seeing that banning links is the metaphorical equivalent of moving your office off the high street and into a disused field, the switched-on, linked-up web community (yes, us!) stands to inherit masses of ex-megacorp traffic. Hurrah.


11:49 am on Jul 26, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member

Here we go again - it is not over yet. A German court recently prohibited a website from doing the exact same thing:

The ruling is the latest legal decision in a two-year battle between German newspaper Mainpost and German search service NewsClub. Mainpost charges that NewsClub violated the law by searching through and linking directly to Mainpost content.

The interesting thing about this case is that the judge used general European law, not local country legislation:

Recently, linking directly to other websites' content has increasingly come under legal fire in Europe, but most of the cases have centered on a specific country's law. The NewsClub case is based on a law common to the entire European Union.

Source: Wired [wired.com]

Well, well, you need to be careful who you link to - at least in Denmark and Germany.


12:18 pm on Jul 26, 2002 (gmt 0)

10+ Year Member

Well, well, you need to be careful who you link to - at least in Denmark and Germany.

That's not the issue. I've seen no rulings against single deeplinks. The problem seems to be collections of deeplinks.

Mikkel Svendsen

12:35 pm on Jul 26, 2002 (gmt 0)

10+ Year Member

Good you added that one here, Rumbas. Thanks :)

What makes the German case even worse is that the decision was made on a much higher level in the legal system than the NewsBooster case - so far. I am very sure that the Danish newspapers will use the German case in the continuing fight here.

I am not a lawyer, and I suppose the judges know what they are doing (I hope so), so if they interpret the laws we have now the way they do, it's probably correct - but then the laws need to be changed.

I had the pleasure of talking with the IT-spokesman for our governing party in Denmark. He completely agrees that search and almost any kind of linking should be legal – and if it’s not now, he will do what he can to change the laws. He will meet up with the other IT-responsible politicians from the other parties right after the summer.

Maybe this is not a fight we should expect to win in court. Maybe we need to have the laws changed, and in that case we had better start making some good friends within the governments of our countries, to make sure they understand how serious this is. Politicians in most parts of the world today want to focus on IT. They know how important the Internet has become and they want their share of the IT future. They cannot afford not to focus on IT and the Internet.

It is our job to teach the politicians about the importance of free linking and good search.

Mikkel Svendsen

12:37 pm on Jul 26, 2002 (gmt 0)

10+ Year Member

hstyri, in the Danish ruling it is made very clear that a link is the same as making a copy. If the copyright laws we have now can be interpreted that way then they need to be changed :)


2:30 pm on Jul 26, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member

>It is our job to teach the politicians about the importance of free linking and good search.

Agreed. If they can't figure it out themselves, we have to point them in the right direction.

hstyri, I agree. One link won't harm you, but what about, say, 5, 10 or maybe 100? Would that constitute a collection of links and thereby break the law?
It would be very close in the Danish interpretation of the law...

