
Professor suggests 24-hour delay before aggregators can link to content

Suggested changes to copyright law, posted on Business Insider by a US university professor and a lawyer, have come under fire for proposing that the direct online reposting of news content from weekly titles be banned for a week following publication.

The article suggests that the newspaper industry’s decline could be halted if intellectual property rights were to undergo a “rethinking”.

Using aggregators like Google and others, I can access essentially in real time the lead paragraphs of almost any story from the New York Times, the Washington Post, or indeed any other major news service. Not surprisingly, traditional print media publications are dying, and not surprisingly their owners’ online dotcom alternatives are generating far too little revenue to pick up the slack; why pay for any content when the essence of everything is available immediately, and free, elsewhere.

The writers, Eric Clemons and Nehal Madhani, add that one solution could be to impose a waiting period before articles can be reposted online by external aggregators, unless the content is used solely for commentary on the work.

A first suggestion would be to provide newspaper and other journalistic content special protection, so that no part of any story from any daily periodical could be reposted in an online aggregator, or used online for any purpose other than commentary on the article, for 24 hours; similarly, no part of any story from any weekly publication could be reposted in an online aggregator, or used for any purpose other than commentary, for one week.

But these proposals have been strongly opposed by online news sites such as Techdirt.com, which said the problems facing newspapers are not the fault of news aggregators.

Revenue from those publications has been in decline for many years — well before Google and the internet existed. The biggest problem many of the bigger publications faced was taking on ridiculous debt loads. On top of that, most of them failed to provide value to their community, as competitors stepped in to serve those communities. That’s not about aggregators.


‘The imperatives of the news cycle’: A licence to steal?

Last week we highlighted some of the criticism being directed at Rolling Stone magazine for its decision to hold off publishing the now notorious General McChrystal article online.

The magazine’s hold-for-the-newsstand tactic led Time.com and Politico to make full PDF copies of the printed article available through their websites – copies which were not provided directly by Rolling Stone, as was first thought, but by third parties.

In the wake of Rolling Stone’s much-derided decision, New York Times’ Media Equation blogger David Carr turns his attention to the behaviour of Time.com and Politico, which later linked back to Rolling Stone’s website when the magazine finally published online.

Publishing a PDF of somebody else’s work is the exact opposite of fair use: these sites engaged in a replication of a static electronic document with no links to the publication that took the risk, commissioned the work and came up with a story that tilted the national conversation. The technical, legal term for what they did is, um, stealing.

Jim VandeHei, executive editor and a founder of Politico, defended the site’s move by claiming that “the imperatives of the news cycle superseded questions of custody”.

Full story at this link…


Going back to the backlink licensing case: NLA’s full statement

June 26th, 2009 | 6 Comments | Posted by in Legal, Online Journalism

This goes back to last week, but it seems worth putting up here anyway. Last Thursday Matt Wardman covered the story for Press Gazette: the Newspaper Licensing Agency’s plan to regulate hyperlinking by commercial agencies and aggregators.

“The NLA will be introducing a new form of licence from 1 September to regulate ‘web aggregator’ services (such as Meltwater) that forward links to newspaper websites and for press cuttings agencies undertaking this type of activity.”

Craig McGill also picked up on it and asked a series of provocative questions. He got a lengthy response from the NLA, including this:

“This is not about bloggers adding links to newspaper sites. Our focus is on professional media monitoring organisations (news aggregators, press cuttings agencies) and their client business who make extensive use of the newspaper content.”

More questions are raised in the comments beneath McGill’s piece, including this one about copyright law.

Last Friday Journalism.co.uk spoke to the NLA who said it was part of their new e-Clips service – ‘a feed of newspapers’ online content direct to cuttings aggregators and press cuttings agencies.’

Here’s the NLA statement in full:

“The Newspaper Licensing Agency (NLA) today [dated June 2009] announced a new business-to-business clippings database for newspaper websites, to launch in January 2010. It also said it will extend its licensing remit to cover newspaper websites from January 2010.

“The new service, called eClips web, will offer a complete feed of newspapers’ online content direct to cuttings aggregators and press cuttings agencies. Powered directly from newspapers’ own content-management systems, eClips web will make web-based media monitoring faster and richer and provide a permanent record for PR and communications professionals.

“The NLA will also extend its licensing remit to cover local and national newspapers’ web content. David Pugh, managing director of the NLA, said: “We have two aims: to contribute to the growth of web monitoring; and to protect the rights of publishers. Research shows that 23 per cent of newspapers’ online content never appears in print and that the internet is growing in influence as a resource for news. So it is vital to have comprehensive monitoring coverage of newspapers’ websites – and vital that the publishers are properly rewarded for their work.”

“From September 2009, web aggregators that charge clients for their services will require an NLA licence and will be charged from January 2010. Press cuttings agencies that either ‘scrape’ content themselves or buy in services from aggregators will also be licensed and charged. Client companies that receive and forward links from these commercial aggregators within their organisation will also require a licence.

“David Pugh added: “We have consulted extensively across the industry – the incremental charges for web cuttings will be low and manageable. I stress this is not about individuals sharing links – we think that’s great for newspapers and promotes their websites and their readership. What we are doing is making sure that newspapers are rewarded fairly for professional use of their web content by businesses.””

Further notes:

“The NLA is owned by the 8 national newspaper publishing houses and generates B2B revenues for 1,300 national and regional publishers through licensing use of their content by press cuttings agencies (PCAs) and their client companies.

“The new licences will cover all local and national titles with the exception of the Financial Times and the News International titles. These will all, however, be included in the eClips web database.”


PhotoAttorney: UK rejects copyright law amendments as a result of UGC / mash-ups

Following up on Out-Law.com’s reports, PhotoAttorney.com looks at the UK government’s decision not to revise its copyright law ‘after reviewing research results in a consultation paper (…) published by the European Commission (EC) on copyright reform’.

“In sum, the EC paper considered the possibility of creating new use exemptions to address the increased creation of ‘user-created content’ (UGC), also known as ‘mashups’, made easier by technological innovations.”

Full post at this link…


Belgian newspapers seeking £39m damages from Google

May 28th, 2008 | 1 Comment | Posted by in Newspapers, Search

A group of Belgian newspapers is seeking up to £39m (€49m) in damages from Google, which the group says published and stored their content without permission or payment.

Last year Google lost a case brought against it by the Copiepresse group – an organisation that represents the French-language press in Belgium – forcing it to remove cached versions of newspaper articles and take down content from its Google News service.

The organisation’s secretary-general told Bloomberg yesterday it had summoned Google to appear again before a Brussels court so that it could decide on the damages. Copiepresse is seeking between €32.8m and €49.1m.

The damages would be in addition to the €25,000 (£20,000) daily fine imposed on Google by the court for each day it kept Copiepresse material on its site.

Google appealed the original court decision of February 2007, which ruled that it could not claim ‘fair use’ – an exemption under copyright law – for using lines of text and linking to the original articles.

A Google spokesperson told Bloomberg that it was still awaiting the results of its appeal and that it had not received notification from Copiepresse of any new court dates.


© Mousetrap Media Ltd.