Forum Library, Charter, Moderators: not2easy

Content, Writing and Copyright Forum

This 35-message thread spans 2 pages.
New York Times Sued by Newspaper Chain Over Linking Practices

 2:14 pm on Dec 24, 2008 (gmt 0)

story [ecommercetimes.com]

I'm not sure how many of these suits have been filed over the years but eventually one will go to trial (instead of being settled out of court) and we will all get some real law on the linking issue.



 2:45 pm on Dec 24, 2008 (gmt 0)

This seems to be about linking alone, not about a text snippet with a link.

If they win, that's bad news for webmasters. Would we have to ASK every site we want to link to for permission?

I don't see Gatehouse winning.


 3:04 pm on Dec 24, 2008 (gmt 0)

This is the interesting part, and -I hope- the sole basis for the suit:
GateHouse's suit also alleges that the New York Times Co. circumvented security measures meant to block it from linking to GateHouse-owned content.

"With the allegation of security protocol circumvention, it would not surprise me to see GateHouse also allege violation of the Digital Millennium Copyright Act..." Van Dyke speculated.

Barring a provable violation of properly-implemented and thorough GateHouse "security measures" preventing NYT access to their sites, plus an accessible and clear Terms of Use page, I hope GateHouse understands that if they don't want links to their content, they should either go to a subscriber-only (login-required) model or simply take their content off the Web. The "fee" for the otherwise-free use of the Web for content publishing is allowing linking, and linking is the whole point of calling it "the Web."

Hey, if they don't want those authoritative inbounds from NYT, I'd be glad to have them!

The above post represents my personal opinion, and not that of WebmasterWorld.



 3:44 pm on Dec 24, 2008 (gmt 0)


I agree that the security protocol is probably the crux of this case. That raises an interesting question of how it could be done, and maybe offers some advice to others on what to avoid. Here are a couple of things they may have done with the link, but I'm sure there are more:

  • Included personal login identifier in the URL
  • Bypassed session ID
  • Linked to the print version


 4:37 pm on Dec 24, 2008 (gmt 0)

Well, it can't *really* be done effectively, which is why I emphasized "properly-implemented and thorough Gatehouse security measures" in my post. They may have blocked the NYT corporate IP address ranges, and then some NYT reporter logged-in from home or from a wi-fi hotspot at the corner coffee shop and got a link... Who knows.
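The IP-blocking scenario can be sketched in a few lines; the ranges below are RFC 5737 documentation networks, not real NYT addresses, and the whole thing is an illustration of why such a block is leaky rather than anyone's actual setup:

```python
import ipaddress

# Hypothetical blocklist of a rival's corporate address ranges.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_blocked(client_ip: str) -> bool:
    """True if the visitor's IP falls inside a blocked range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED_NETWORKS)
```

A reporter on home broadband or a coffee-shop hotspot presents an address outside those ranges, so `is_blocked` returns False and the block is sidestepped without anyone trying.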

The Web is an "open" space, and trying to prevent linking by blocking access is rather futile, IMO. The best they can do is to publish clear and enforceable Terms of Use, and have some prior formal usage agreement with the NYT -- either in writing, or as part of a mandatory user-agreement agreed-to on signing up for access to their sites. Maybe they did that, but such details were missing from the article and it's impossible to tell.

If the "first contact" they've had on this matter is through the courts, then I'd say it's likely just an effort to bolster their finances in hard times. Too bad they apparently don't recognize the value of those inbounds when considering their traffic and revenue...

It will be interesting to hear the details on this case. Without them, it's hard to tell who's right and who's wrong. My interest here is simply to support free linking and Fair Use under existing copyright law.

The above post represents my personal opinion, and not that of WebmasterWorld.



 4:56 pm on Dec 24, 2008 (gmt 0)

Here's info from Chilling Effects on linking [chillingeffects.org].

This wouldn't be the first lawsuit regarding linking, so there are already court precedents.


 7:22 pm on Dec 24, 2008 (gmt 0)

I posted something about this issue in February... [webmasterworld.com] ... and remember thinking how ridiculous it was to worry about getting written permission for links, and wondering if something like that would ever gain traction.

Well, fast-forward to eleven months later, and my firm has tasked me with obtaining signed "linking agreements" from organizations to whose web sites we link.


 6:58 pm on Dec 25, 2008 (gmt 0)

Would basically end Drudge's site..


 4:54 pm on Dec 29, 2008 (gmt 0)

Please correct me if I am wrong... but if you publish an article and make it public for all to read, current copyright laws allow someone to cite and reference your content. There is no law being broken here.

If you want your content private, keep it private. Don't make it public. Maybe we need to offer a class on how the internet works.

Sounds like someone is stirring up trouble to get a quick boost to their traffic. In the long run, this is going to hurt them more than their 15 minutes of fame.


 5:08 pm on Dec 29, 2008 (gmt 0)

Maybe we need to offer a class on how the internet works.

I have often said that people should be required to pass a test to get a license before being allowed to drive on the information superhighway. Such a class should certainly be required before allowing anyone to file an Internet-related lawsuit.


 5:24 pm on Dec 29, 2008 (gmt 0)

The engines should take the preventative measure and drop all such companies who file suits like this from the index.


 6:42 pm on Dec 29, 2008 (gmt 0)

The engines should take the preventative measure and drop all such companies who file suits like this from the index.

Better yet, ISPs should block all access to these sites unless you specifically type the address in to the address bar, and browsers should prevent you from bookmarking these sites. A bookmark is, after all, nothing but a glorified link.

Tongue firmly planted in cheek


 8:14 pm on Dec 29, 2008 (gmt 0)

I'd love to be an expert witness for the NYT in this case as GateHouse Media has done nothing to protect their copyrighted material, regardless of what they claim.

Google and Yahoo have it in CACHE; they aren't using NOARCHIVE, so it's wide open to the world.

If they think robots.txt is blocking the NYT, that's a major fallacy as robots.txt has always been an optional standard to follow, not mandatory.

I'd be real curious what they're doing technically, but their content is scattered all over the place so crying foul when you have set it free doesn't make sense and the case should be tossed on that merit alone.

Slap in some NOARCHIVE meta tags and a few other things to make it look like you're really trying to control your content and maybe it's a better argument.
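For anyone unfamiliar with the tags being referred to, they would look something like this (an illustrative snippet, not GateHouse's actual markup):

```html
<!-- In each article's <head>: ask engines not to serve a cached copy
     or show a text snippet -->
<meta name="robots" content="noarchive, nosnippet">
```

Like robots.txt, these are requests that well-behaved crawlers honor voluntarily, which is part of why they make a better show of intent than a technical barrier.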


 10:56 pm on Dec 29, 2008 (gmt 0)

Wow, almost 2009, and people still operate major companies without a basic knowledge of the tools they use.

If a company wants to have a web presence then they should take the time to decide what they want to make public, and then they need to understand what exactly that means.

Linking sites is common practice, and in the news industry it is a way of referencing sources. Referencing sources has been a standard since publications first came out in print.

I will never agree that you should be able to restrict who links to your PUBLIC pages.


 11:05 pm on Dec 29, 2008 (gmt 0)

I will never agree that you should be able to restrict who links to your PUBLIC pages.

How about when some pedo site links to your day care center?

I had something similar to that happen which cost me a few advertisers once and they didn't come back even after I had the site in question slapped and my references removed.


 11:27 pm on Dec 29, 2008 (gmt 0)

Yes, well, this is a time when the Associated Press and their dying membership are trying to hijack widely accepted fair use principles and recast established law and practice.

Support those companies that have responsible and reasonable management. UPI, Reuters, AHN all good.

AP, McClatchy, and other dead-tree media... all evil.


 11:40 pm on Dec 29, 2008 (gmt 0)

I was searching for some new domain names yesterday and ran across a site that had the following at the bottom of their home page:

Content of the site intended for informative and referential use only. Storage or reproduction of the site in any form and/or the creation of links to and from the site is forbidden without the written consent of [ Company Name ]. All Intellectual Property rights are asserted and the work remains the possession of [ Company Name ] partners and those third parties with whom it was developed. Unauthorised use, distribution, publication, transmission and/or alteration of the site and its content is strictly forbidden.

I wondered how in the hell they plan to enforce this? Do they have a legal leg to stand on if someone links TO their site without 'written consent'?

[edited by: tedster at 1:13 am (utc) on Dec. 30, 2008]
[edit reason] make information anonymous [/edit]


 11:50 pm on Dec 29, 2008 (gmt 0)

Do they have a legal leg to stand on if someone links TO their site without 'written consent'?

Not sure, but they do probably have solid standing in regards to someone copying a big chunk of text from their site without permission.

As I assume you just did?


 12:35 am on Dec 30, 2008 (gmt 0)

Pure speculation and sensationalism on the part of the journalist and editor. Like Winnie the Pooh, this is a bear with very little brain, er... story and substance.

The telling part is the bit which show that the two players are not interested in the deliberate link-baiting and warmongering tone of the piece:

Officials from GateHouse and the New York Times Co. could not be reached for comment.

Not surprised as there aren't any facts quoted in the story, only gossip enough to get some link action.

Is there any decent reportage of the actual case itself?



 9:56 am on Dec 30, 2008 (gmt 0)

How about when some pedo site links to your day care center?

Only the readership of the pedo site would know...

...and the most obvious solution is to 404 anything with a referrer from the pedo site.

I think advertisers who drop you because of who links to you simply need to learn about the web. I do not know of any ad network that does this.

Are you really suggesting we should cripple how the web works in order to deal with the reaction of people who do not get the web to unusual and extreme circumstances? Perhaps we should have to provide proof of identity to every site we visit as well?
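The 404-on-referrer idea mentioned above is easy to sketch; the header handling and blocklist here are illustrative, not anyone's production setup:

```python
# Serve a 404 when the Referer header points at a site you refuse
# to accept click-through traffic from.
BLOCKED_REFERRERS = ("badsite.example",)

def status_for(headers: dict) -> int:
    """Return the HTTP status to serve, given the request headers."""
    referrer = headers.get("Referer", "")
    if any(bad in referrer for bad in BLOCKED_REFERRERS):
        return 404  # pretend the page doesn't exist for these visitors
    return 200
```

Worth noting that Referer is client-supplied and often absent or stripped, so this only deters casual click-throughs; it does nothing about the link itself existing.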


 12:57 pm on Dec 30, 2008 (gmt 0)

For all that has been said, I agree that it comes down to what they mean by "circumvented security measures meant to block it". IncrediBILL seems to suggest that Google and Yahoo have it in CACHE and they aren't using NOARCHIVE, so are they really basing their whole argument, and the above statement about "security measures", on a robots.txt, or is there something more? With the legal advice they are or should be getting, I wonder if there is more to it than we know.


 3:04 pm on Dec 30, 2008 (gmt 0)

I feel like there's a significant missing piece from this article. Other stories on the 'net about this particular lawsuit specifically refer to the fact that Boston.com is linking to GateHouse's internal pages, and GateHouse has a lot of ad inventory on their homepage. That is, the NYTimes is being sued (I believe, from what all other articles have said) for circumventing the primary advertising for the plaintiff's site.

I don't think the practice of linking itself is the problem. I think it is the methodology applied here in terms of this particular suit.


 3:33 pm on Dec 30, 2008 (gmt 0)

Other stories on the 'net about this particular lawsuit specifically refer to the fact that Boston.com is linking to GateHouse's internal pages and GateHouse has a lot of ad inventory on their homepage.

"Deep linking" is the most fundamental concept of the Web. It's the equivalent of academic or authors' citations that point the reader or researcher to a specific page. Unless the courts outlaw the Reader's Guide to Periodical Literature and The Magazine Index, it's hard to see how they'd outlaw deep linking.


 9:24 pm on Dec 30, 2008 (gmt 0)

This is what happens when print laws are applied wholesale to the web world.

If a portion of online publishers want sites to ask for permission and even pay a fee for linking to them, then let them. If they can afford losing "word of blog" traffic, others will be happy to take it off their hands.


 1:31 am on Dec 31, 2008 (gmt 0)

GateHouse Media has done nothing to protect their copyrighted material

From the legal complaint filed by Gatehouse:

Gatehouse implemented certain electronic security measures on Wicked Local, to prevent users with a certain Boston.com Internet Protocol ("IP") address from scraping content from Gatehouse's website. Plaintiff's security measures did not deter defendant in the least - defendant posted original content to the Infringing Website the very next day after they were installed.

This suggests that they did not do nothing, rather that they did something useless.

And as incrediBILL points out, search engine caches remained available.

And, of course, no security measure can stop manual copying.

There are other aspects to the case that may or may not be worthy of debate, but I will be pleasantly surprised if jury members have the mechanics of website "security measures" adequately explained to them.

My prediction: no lawyer will lose.



 2:48 am on Dec 31, 2008 (gmt 0)

Ouch - I hate to think how many "security measures" I circumvent each day, then.


 3:46 am on Dec 31, 2008 (gmt 0)

So, if I'm having a chat w someone, happen to mention I've read something on Gatehouse (maybe never have though; never heard of them till now), am I violating their copyright law? Should ask permission beforehand, eh?

Print citations are closer; citing sources of information, and of further information, is standard practice, but not something GateHouse would like.
- indeed, best they take their content and file it away someplace inside a security vault, deep in the mountains.

And, like jim, if anyone from NYT is reading this thread, I'd be happy to have some links from you. (Pretty please! ...please please please...)


 6:19 am on Jan 1, 2009 (gmt 0)

Quick question: has sanity left the building? I read the original story (dare I say), linked from eCommerce Times, and saw this part:

"Early on, at least, it looks as though GateHouse may have the advantage. "As long as GateHouse isn't publishing content for commercial use," Van Dyke said, "they seem to have the high ground at present -- that is, until the defendants answer the complaint, and we see their side of the story.""

Uhm, no. Who are these people at GateHouse (for that matter who is GateHouse - never heard of them and I have held a press related site since around 1996)? If you don't want people to link to your website, there are several reasonable and accepted approaches to solving the problem.

1) Close the site to members only content and block any web search engine from spidering your oh so precious content.

2) Failing that, simply don't put up a web site. That's right, just take that off ramp from the Information Superhighway and leave the rest of us alone. This crap over the exception dictating to the majority has got to stop.

The previous paragraph reads: "GateHouse is trying to exercise more control over its content than copyright law will probably allow it to do," Collins said, "but, you never know what a judge will find. Certainly, GateHouse is looking to press the limits of its copyright further than anyone else on the Net that I'm aware of."

I think that screams for my earlier stated option 2. From their apparent claim that folks should be paying them to link to them, one would think they should be pretty much banned from any search engine (paid links provision?). On its face, it would appear that this is a case of greed, pure and simple.

I would give the folks at GateHouse another idea: why not simply avoid putting content out on the street entirely? After all, some person might actually want to photocopy an article from their rag. Clearly, based on this current action, they would want to prevent that from happening, so just stop putting out a paper and the problem of content protection goes away.

Most people would agree that our legal system is far too clogged by this kind of dreck, which seems to stink of at least one ambulance-chaser-class "attorney" with far too much time on his hands, testing some weird a-- legal theory that harms far more than it helps, given the more obvious options they could take.

I'm no great fan of the New York Times, but I hope it is awarded both legal fees and damages.


 12:52 pm on Jan 2, 2009 (gmt 0)

docbird and Commerce - GateHouse is a media holding company, mostly middle- and low-market newspaper sites.

Gatehouse is trying to do what the Associated Press is trying to do. Namely recast accepted fair-use principles and practices to prop up dying business models.

Dinosaur old media that would rather litigate than innovate.

You'll recall that not too long ago the AP caused an uproar over fair use. This is nothing different, and it will likely NOT be the last, with dead-tree papers dying a slow and agonizing death.

This is the type of corporate behaviour that makes media companies and mobs like GateHouse and the Associated Press truly evil. Instead of upholding freedom of information and speech, they trample it in the name of profit. Truly evil.


 6:42 am on Jan 4, 2009 (gmt 0)

eventus - Bravo! Brief, informative and on point.

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved