The 153-year-old US publication, the Atlantic, this week introduced a new standalone site composed of aggregated material: the Atlantic Wire.
Content by columnists and commentators leading the national dialogue will be incorporated into one feed and site, an introductory post said.
“Our team – editors, reporters, researchers, data-crunchers – systematically tracks hundreds of opinion-makers from newspapers, web sites, television, radio, and magazines.”
The post lays out the four parts of the Wire site:
Breaking Opinions: op-eds, blog entries, and other commentary.
Features: Various categories including ‘Spatwatch’, ‘Strange Bedfellows’, ‘Cliché Watch’, and ‘Who Won the Day’.
The Ticker: Twitter-style updates, in 140 characters or fewer.
The Atlantic 50: A list of the most influential columnists and commentators, as decided by the Wire.
Who? Multimedia manager at Hot Tomato Broadcasting Company
What? Currently working as multimedia manager for 102.9FM Hot Tomato, Australia. Her job as part of the multimedia team at Hot Tomato is to provide people with the entertainment and information they would normally get from the station, in digital formats such as the website www.1029hottomato.com.au and on social sites such as Twitter and Facebook.
Just as we like to supply you with fresh and innovative tips every day, we’re recommending journalists to follow online too. They might be from any sector of the industry: please send suggestions (you can nominate yourself) to judith or laura [at] journalism.co.uk; or to @journalismnews.
Adam Tinworth looks at the positives and negatives of covering a live event with Twitter – and comes out with some handy suggestions for any journalists looking to use the tool for live coverage.
“The real time web is important, and significant. But that doesn’t mean that the old web, the archived, static web, isn’t still of value. Twitter coverage is dispersed, and fades away as the moment passes. Archive content has real utility as reference and grist for the conversational mill in the weeks that follow.”
Cast your minds back a couple of months: Lord Drayson, the UK’s science minister, proclaimed that British science journalism was in a pretty good state.
The days when science was blighted by a press interested only in ‘scare stories’ are over, Drayson said, Times Higher Education (THE) reported in July 2009.
Most coverage of science by the media is now balanced, accurate and engaging, Lord Drayson argued, in a debate at the World Conference of Science Journalists.
But not everyone agreed. After Ben Goldacre – Guardian columnist, BadScience blogger/author and medical doctor – aired his conflicting opinion on Twitter, a public discussion was arranged by the Royal Institution. And tonight’s the night. If you haven’t got a ticket, it’s too late (it sold out in 90 minutes, Press Gazette noted) but you can watch the live video here on the THE website:
Courtesy of Silicon Alley Insider’s ‘Business Insider’, a chart showing that 24 per cent of US newspapers do not use any digital delivery platforms to spread their online content.
“The American Press Institute asked 2,400 newspaper executives if their papers ‘provide access to stories or information such as sports scores, headlines, stock quotes, etc.,’ via Twitter, Facebook, Email alerts, Mobile/PDA, YouTube, Kindle, Flickr, e-readers, etc., and told them to ‘check all that apply.'”
24 per cent of all respondents answered ‘None at this time’.
Citizen journalism site www.allvoices.com has introduced live event and location-specific Twitter data into its reports, the site announced in a press release yesterday.
The site will now display the latest tweets relating to news stories by location (city, country and region). Alongside reports from its users, it will include Twitter updates relating to that particular event or news item.
For ‘mainstream’ news reported on the site, Allvoices will now include aggregated tweets relating to those reports to show the conversation around the news.
The addition of the Twitter data will help the site vet stories for authenticity, Allvoices explained. It will be used to provide additional context and rank reports in Allvoices’ ‘Breaking’ and ‘Popular’ news categories. The integration will also add a real-time element to the site’s news stories.
The Twitter data will supplement the existing vetting procedure, which pulls together related content from mainstream news and user-generated sources, like videos, blogs and pictures, to attempt a ‘360-degree view of the news’.
“Twitter alone as a source for news doesn’t have the ability to tell a full story. Allvoices delivers the full story for a report plus a deeper understanding of the conversations going on around that event. What’s great about the system we’ve built is that it can take virtually any data source and apply it to user-generated and mainstream news reports,” said Dr Sanjay Sood, chief technical officer for allvoices.com, in the release.
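The location-based matching described above can be pictured with a small, purely illustrative sketch in Python. The tweet structure, coordinates and radius below are assumptions for illustration, not a description of Allvoices’ actual system.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def tweets_near_story(tweets, story_lat, story_lon, radius_km=50):
    """Keep only tweets geotagged within radius_km of the story's location,
    newest first. Each tweet is assumed to be a dict with 'lat', 'lon',
    'text' and 'created_at' keys -- a simplification of real Twitter data."""
    nearby = [t for t in tweets
              if haversine_km(t["lat"], t["lon"], story_lat, story_lon) <= radius_km]
    return sorted(nearby, key=lambda t: t["created_at"], reverse=True)
```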
Glyn Mottershead teaches newspaper journalism at Cardiff University. He blogs at http://egrommet.net/ and is @egrommet on Twitter.
Journalism will survive – but there’s no simple solution for how it gets there, or who is going to pay for it. That was the key message underpinning the Future of Journalism conference at the Cardiff University School of Journalism, Media and Cultural Studies last week.
Delegates from 42 countries gathered in the city to hear over 100 papers examining the industry from a range of perspectives:
New media technologies, blogs and UGC;
Sources;
Ethics;
Regulation;
Journalism practice;
Global journalism;
Education, training and employment of journalists;
History.
Curran’s plenary focused on different views of the future: the survivalists, the new media romantics and those who believe there is a crisis of democracy afoot.
Being passive is not an option for the industry or for academics, he argued. It is futile to try to predict the future: the focus should be on moulding and shaping it, with industry and academia working together to keep journalism alive.
Bettina Peters of the Global Forum for Media Development questioned whether it was appropriate to try and export business models from the developed world to the developing world. She discussed the need for collaboration between the northern and southern hemispheres. Journalism needs to be looking at mixed funding models, she said.
She too stressed that journalists and educators need to engage in a global discussion to share ideas and solutions, and that the conversation shouldn’t just be about money or tools – two key strands of current industry discussion, both online and offline.
Conference organiser Professor Bob Franklin, of the Cardiff School of Journalism, Media and Cultural Studies, was keen to stress that this wasn’t an academic talking shop – but a key place where journalists and those studying journalism can get together to share research and ideas from around the globe, something crucial given the massive changes taking place in the industry.
His view was that the conference showed there is no single future for journalism. This was echoed in roundtable talks with journalism educators who were finding it difficult to determine what media organisations need, while journalists in the room stated that the media didn’t know what it wants.
Professor Franklin, like many others at the conference, believes the future of journalism depends on platform and location: while newspapers are in decline in Europe and America, they are thriving in India, and there is a rise in daily tabloids in urban South Africa – with a healthy market even in used copies of newspapers.
“The conference was about the future of journalism, and that future looks very different from where you are standing,” said Franklin. “We were talking about possibilities, not about sowing gems of wisdom. There is a future for journalism, but it is a very expansive future.”
Video: Professor Alfred Hermida on the Future of Journalism
On Wednesday (September 9), Paul Bradshaw, course director of the MA Online Journalism at Birmingham City University and founder of HelpMeInvestigate.com, chaired a discussion on data and the future of journalism at the first London Linked Data Meetup. This post originally appeared on the OnlineJournalismBlog.
The panel included: Martin Belam (information architect, the Guardian; blogger, Currybet); John O’Donovan (chief architect, BBC News Online); Dan Brickley (Friend of a Friend project; VU University, Amsterdam; SpyPixel Ltd; ex-W3C); Leigh Dodds (Talis).
“Linked Data is about using the web to connect related data that wasn’t previously linked, or using the web to lower the barriers to linking data currently linked using other methods.” (http://linkeddata.org)
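As a concrete illustration of ‘using the web to connect related data’, here is a minimal sketch using Python’s rdflib; the article URI is invented, and the DBpedia identifier simply stands in for any shared web identifier.

```python
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS, RDF

g = Graph()
article = URIRef("http://example.org/articles/mps-expenses")  # hypothetical article URI

# Describe the article and point its subject at an identifier that already
# exists on the web, so other people's data about that subject can be joined up.
g.add((article, RDF.type, DCTERMS.BibliographicResource))
g.add((article, DCTERMS.title, Literal("MPs' expenses: the full data")))
g.add((article, DCTERMS.subject,
       URIRef("http://dbpedia.org/resource/United_Kingdom_parliamentary_expenses_scandal")))

print(g.serialize(format="turtle"))
```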
I talked about how 2009 was, for me, a key year in data and journalism – largely because it has been a year of crisis in both publishing and government. The seminal point has been the MPs’ expenses story, which demonstrated both the power of data in journalism and the need for transparency from government. For example: the government’s appointment of Sir Tim Berners-Lee, the search for developers to suggest uses for public data, and the imminent launch of Data.gov.uk, all around the same issue.
Even before then, the New York Times and the Guardian had both launched APIs at the beginning of the year, MSN Local and the BBC had both been working with Wikipedia, and we had seen the launch of a number of startups and mashups around data, including Timetric, Verifiable, BeVocal, OpenlyLocal, MashTheState, the open source release of Everyblock, and Mapumental.
Q: What are the implications of paywalls for Linked Data?
The general view was that Linked Data – specifically standards like RDF [Resource Description Framework] – would allow users and organisations to access information about content even if they couldn’t access the content itself. To give a concrete example: rather than linking to a ‘wall’ that simply requires payment, it would be clear what the content beyond that wall related to (e.g. key people, organisations, author, etc.).
Leigh Dodds felt that using standards like RDF would allow organisations to more effectively package content in commercially attractive ways, e.g. ‘everything about this organisation’.
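To make that concrete: a publisher could expose machine-readable metadata about a paywalled article without exposing the article itself. A hedged sketch, again with rdflib; every URI and property choice here is illustrative rather than any publisher’s actual schema.

```python
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS

g = Graph()
article = URIRef("http://example.org/premium/article-123")  # sits behind the paywall

# Freely published description of what the wall is hiding:
g.add((article, DCTERMS.title, Literal("Interview: the chancellor on banking reform")))
g.add((article, DCTERMS.creator, Literal("Jane Reporter")))
g.add((article, DCTERMS.subject, URIRef("http://dbpedia.org/resource/Bank_regulation")))
g.add((article, DCTERMS.accessRights, Literal("Subscription required")))
```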
Q: What can bloggers do to tap into the potential of Linked Data?
This drew some blank responses, but Leigh Dodds was most forthright, arguing that the onus lay with developers to do things that would make it easier for bloggers to, for example, visualise data. He also pointed out that, currently, if someone does something with data it is not possible to track that back to the source; better tools would allow, effectively, an equivalent of pingback for data included in charts (the person who created the data would know it had been used, as would others).
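No ‘pingback for data’ standard exists, but the idea can be sketched: a published chart carries the URL of its source dataset as provenance, and the publisher notifies that source that its data has been reused. The endpoint and payload below are entirely hypothetical.

```python
import requests  # third-party HTTP library, assumed to be installed

def publish_chart(chart_url, dataset_url):
    """Record where a chart's data came from and notify the dataset's
    (imaginary) pingback endpoint that it has been reused."""
    provenance = {"chart": chart_url, "source": dataset_url}
    # Hypothetical endpoint -- no such API is defined by any real standard.
    requests.post(dataset_url.rstrip("/") + "/pingback", json=provenance, timeout=5)
    return provenance
```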
Q: Given that the problem for publishing lies in advertising rather than content, how can Linked Data help solve that?
Dan Brickley suggested that OAuth technologies (where you use a single login identity for multiple sites that contains information about your social connections, rather than creating a new ‘identity’ for each) would allow users to specify more specifically how they experience content, for instance: ‘I only want to see article comments by users who are also my Facebook and Twitter friends.’
The same technology would allow for more personalised, and therefore more lucrative, advertising. John O’Donovan felt the same could be said about content itself – more accurate data about content would allow for more specific selling of advertising.
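The comment-filtering idea (‘only show me comments from people I already follow’) reduces to a simple filter once a reader’s connections are known from their OAuth-linked accounts. A minimal sketch with made-up data structures:

```python
def comments_from_friends(comments, friend_ids):
    """Keep only comments whose author appears in the reader's combined
    friend/follow list (e.g. gathered from Facebook and Twitter via OAuth).
    Each comment is assumed to be a dict with 'author_id' and 'text' keys."""
    friends = set(friend_ids)
    return [c for c in comments if c["author_id"] in friends]

# Illustrative use -- in practice friend_ids would come from the OAuth-connected accounts.
comments = [{"author_id": "alice", "text": "Great piece"},
            {"author_id": "spambot42", "text": "Buy pills"}]
print(comments_from_friends(comments, ["alice", "bob"]))
```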
Martin Belam quoted James Cridland on radio: ‘[The different operators] agree on technology but compete on content’. The same was true of advertising but the advertising and news industries needed to be more active in defining common standards.
Leigh Dodds pointed out that semantic data was already being used by companies serving advertising.
Other notes
I asked members of the audience who they felt were the heroes and villains of Linked Data in the news industry. The Guardian and BBC came out well – The Daily Mail were named as repeat offenders who would simply refer to ‘a study’ and not say which, nor link to it.
Martin Belam pointed out that the Guardian is increasingly asking itself ‘how will that look through an API?’ when producing content, representing a key shift in editorial thinking. If users of the platform are swallowing up significant bandwidth or driving significant traffic then that would probably warrant talking to them about more formal relationships (either customer-provider or partners).
A number of references were made to the problem of provenance – being able to identify where a statement came from. Dan Brickley specifically spoke of the problem with identifying the source of Twitter retweets.
Dan also felt that the problem of journalists not linking would be solved by technology. In conversation previously, he also talked of ‘subject-based linking’ and the impact of SKOS [Simple Knowledge Organisation System] and Linked Data-style identifiers. He saw a problem in that, while new articles might link to older reports on the same issue, older reports were not updated with links to the newer coverage. Tagging individual articles was problematic in that you then had the equivalent of an overflowing inbox.
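Subject-based linking can be pictured briefly: rather than editing old articles, every article is tagged with a shared SKOS concept, and the connection between old and new coverage falls out of a query on that concept. A sketch with rdflib; the concept and article URIs are invented.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS

SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")

g = Graph()
topic = URIRef("http://example.org/concepts/mps-expenses")  # shared subject identifier
g.add((topic, SKOS.prefLabel, Literal("MPs' expenses")))

for article in ("http://example.org/2009/05/expenses-first-report",
                "http://example.org/2009/09/expenses-follow-up"):
    g.add((URIRef(article), DCTERMS.subject, topic))

# Old and new coverage are now linked through the concept,
# without touching the older articles themselves.
related = [str(s) for s, _, _ in g.triples((None, DCTERMS.subject, topic))]
print(related)
```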
Finally, here’s a bit of video from the very last question addressed in the discussion (filmed with thanks by @countculture):