Tag Archives: data

#bbcsms: Use data to inform newsroom decisions, says panel

“Numbers are everything to our business” – this was the message from the Washington Post‘s Raju Narisetti, speaking today at the BBC’s social media summit.

Narisetti outlined the “simple mission” for news organisations: to have more people engage with more of their content. This is achieved through data – both numbers and, importantly, context.

We’ve moved from our anecdotal newsroom to a newsroom where there’s a lot more data, a lot more measurement. Initial measurement was page views, but we very quickly realised we need to move to a world of context.

Data is not just about measuring eyeballs – it is a valuable resource for making decisions. With data, you are able to show “things we can stop doing”, Narisetti said, without making an impact on the readership. This, he said, makes an “accountable newsroom” and creates a far more encouraging environment for digital journalists, who know the impact of their work.

Also speaking on the panel, which covered the cultural challenges for newsrooms trying to encourage the effective use of social media, was the Guardian‘s Meg Pickard.

She revealed that Guardian research has shown that when a journalist gets involved in the conversation online, the need for moderation halves and the tone of the conversation “goes up”. This is a key example of such data being used to support proposals and ideas.

As for newsroom culture, the Guardian wants to focus on people and skills, she said, to “create a fertile medium” across the organisation, and then to trust staff to “act as the intelligent adults that they are” and apply their best knowledge and judgement to the situation.

But, she added, there’s no point in forcing anyone to be active on Twitter from the get-go.

We should not be forcing someone to tweet; it will be obvious, they will be grumpy and won’t know what they’re doing. So I don’t think that on your first day, when you’re handed an email address, you should be told that you’re free to say anything you like about our brand to the world.

Within the first few months I would try and encourage them to do so, but by demonstrating opportunities to build the community and relationship with audience.

Journalism.co.uk’s own digital journalism event news:rewired – noise to signal, which takes place on Friday next week at Thomson Reuters, will dedicate an entire session to the issue of audience data in informing editorial and business decisions for news organisations. You can find out more and buy tickets at this link.

David Higgerson: Journalists must keep pushing for open data

David Higgerson, head of multimedia for Trinity Mirror Regionals, has published the address he made about data journalism at the FutureEverything conference in Manchester last week, making some interesting points.

Higgerson says that for journalists the biggest challenge is going to be to keep “pushing” for data to become available.

Councils have to issue details of all spending over £500 – but some councils have decided to publish all spending because it’s cheaper to do so. As journalists, we should push for that to happen everywhere.

FOI is key here. The more we ask for something under FOI because it isn’t freely available, the greater the chance its release will become routine, rather than requested. That’s the challenge for today’s data journalists: Not creating stunning visualisations, but helping to decide what is released, rather than just passively accepting what’s released.

Read his post in full here…

Journalism.co.uk is running a one-day digital journalism conference looking at data in the news industry next week at Thomson Reuters. news:rewired – noise to signal will take place on Friday 27 May. You can find out more information and buy tickets by following this link.

NPR: Finding stories in a ‘sea of government data’

At the end of last week, NPR’s On The Media show spoke to Texas Tribune reporter Matt Stiles and Duke University computational journalism professor Sarah Cohen about how to find good stories in a “sea of government data”.

Listen to the full interview below:

Journalism.co.uk will be looking at open government data and the skills needed to find stories in datasets at its upcoming news:rewired conference. See the full agenda at this link.

#ijf11: Lessons in data journalism from the New York Times

Follow this link or scroll to the bottom to start by hearing more from New York Times graphics editor Matthew Ericson on what kind of people make up his team and how they go about working on a story.

The New York Times has one of the largest, most advanced graphics teams of any national newspaper in the world. Yesterday at the International Journalism Festival, NYT deputy graphics editor Matthew Ericson led an in-depth two-hour workshop on his team’s approach to visualising some of the data that flows through the paper’s stories every day.

He broke the team’s strategy down into a few key objectives, the four main ones being:

Provide context

Describe processes

Reveal patterns

Explain the geography

Here is some of what Ericson told the audience and some of the examples he gave during the session, broken down under the different headers.

Provide context

Graphics should bring something new to the story, not just repeat the information in the lede.

Ericson emphasised that a graphics team which simply illustrates what the reporter has already told the audience is not doing its job properly. “A graphic can bring together a variety of stories and provide context,” he said, citing his team’s work on the Fukushima nuclear crisis.

We would have reporters with information about the health risks, and some who were working on radiation levels, and then population, and we can bring these things together with graphics and show the context.

Describe processes

The Fukushima nuclear crisis has spurred a lot of graphics work at news organisations across the world, and Ericson showed a few different examples of work on the situation to the #ijf11 audience. Another graphic demonstrated the process of a nuclear meltdown, and what exactly was happening at the Fukushima plant.

As we approach stories, we are not interested in a graphic showing how a standard nuclear reactor works, we want to show what is particular to a situation and what will help a reader understand this particular new story.

Like saying: “You’ve been reading about these fuel rods all over the news, this is what they actually look like and how they work”.

From nuclear meltdown to dancing. A very different graphic under the ‘describe processes’ umbrella neatly demonstrated that graphics work is not just for mapping and data.

Dissecting a Dance broke down a signature piece by US choreographer Merce Cunningham in order to explain his style.

The NYT dance critic narrated the video, over which simple outlines were overlaid at stages to demonstrate what he was saying. See the full video at this link.

Reveal patterns

This is perhaps the objective most associated with data visualisation, taking a dataset and revealing the patterns that may tell us a story: crime is going up here, population density down there, immigration changing over time, etc.

Ericson showed some of the NYT’s work on voting and immigration patterns, but more interesting was a “narrative graphic” that charted the geothermal changes in the bedrock under California created by attempts to exploit energy in hot areas of rock, which can cause earthquakes.

These so-called narrative graphics take what we think of as visualisation close to what we have been seeing for a while in broadcast news bulletins.

Explain geography

The final main objective was to show the audience the geographical element of stories.

Examples for this section included mapping the flooding of New Orleans following hurricane Katrina, including showing what parts of the region were below sea level and overlaying population density, showing where levees had broken and showing what parts of the land were underwater.

Geography was also a feature of demonstrating the size and position of the oil slick in the Gulf following the BP Deepwater Horizon accident, and comparing it with previous major oil spills.

Some of the tools in use by the NYT team, with examples:


Google Fusion Tables
Tableau Public: Power Hitters
Google Charts from New York State Test Scores – The New York Times
HTML, CSS and Javascript: 2010 World Cup Rankings
jQuery: The Write Less, Do More, JavaScript Library
jQuery UI – Home
Protovis
Raphaël—JavaScript Library
The R Project for Statistical Computing
Processing.org

An important formula 

Data + story > data

It doesn’t take a skilled mathematician to work that one out. But don’t be fooled by its simplicity: it underpinned a key message to take away from the workshop. The message is equally simple: graphics and data teams have the skills to make sense of data for their audience, and throwing a ton of data online without adding analysis and extracting a story is not the right way to go about it.

More from Matthew Ericson on the NYT graphics team

I spoke to Ericson after the session about what kind of people make up his team (it includes cartographers!) and how they go about working on a story.

Here’s what he had to say:

Listen!

The BBC’s Peter Horrocks on data journalism

I spoke to Peter Horrocks, who is director of the BBC World Service and the BBC’s global online news operations after the session about his take on data journalism and whether the BBC Global News had ambitions in this direction.

Here’s what he had to say:

Listen!

See the full list of links for Ericson’s session on his blog.

See more from #ijf11 on the Journalism.co.uk Editor’s Blog.

#media140 – Carlos Alonso’s favourite tools to find stories behind the data

Here at Journalism.co.uk we understand that data is one of the buzzwords in journalism at the moment; it is why we have built our news:rewired conference around the topic. Its popularity was certainly clear from the packed room at Media140 today, where journalist and online communications specialist Carlos Alonso spoke on the subject.

Alonso first discussed why the use of data itself is not new, illustrating this with the 1800s use of data to map cholera deaths geographically – which led to the finding that many occurred close to a specific well – and the 1786 mapping of revolutions in Scotland and England to show where conflict was taking place.

The golden age of using data mining was in the 1700s and 1800s. It died out in the 20th century but is coming back again. It is now really strong, but nothing new.

His talk focused on the first stages of the journalistic process: sourcing and processing data to find stories. First you need to start with a question, he said; think about what you’re interested in finding out, and from this you’ll know what data you need.

Once you have the data you must first clean it and figure out what the important data is, we’re looking for what is behind this. So then you need to treat the data, process the data … Now with the computer you can make the data interactive so you can go into greater depth and read behind the story if you want to, the end product can be very different to what you start with.
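Alonso’s cleaning step can be sketched in a few lines of Python. The column names and messy values below are invented for illustration, not drawn from any data set he showed:

```python
import csv
import io

# Invented sample: messy rows as they might arrive from a public data set
raw = """region, deaths , year
North, 12 ,2010
South,,2010
East, 7, 2011
"""

def clean_rows(text):
    """Strip stray whitespace, drop incomplete records, coerce numbers."""
    reader = csv.DictReader(io.StringIO(text))
    cleaned = []
    for row in reader:
        row = {k.strip(): (v or "").strip() for k, v in row.items()}
        if not all(row.values()):
            continue  # drop rows with missing values rather than guess
        row["deaths"] = int(row["deaths"])
        row["year"] = int(row["year"])
        cleaned.append(row)
    return cleaned

rows = clean_rows(raw)
# Two usable records remain; the incomplete "South" row is dropped
```

Only after this kind of tidying does it make sense to look for what is behind the numbers, as Alonso describes.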

So where can you find data?

  1. Public institutions, open data and government data sets, plus private initiatives such as the Open Knowledge Foundation or opengovernmentdata.org. This is verifiable data, he adds, from a reliable source. Telecommunications agencies also publish a huge amount of information that isn’t released as open data but is available on their webpages.
  2. Commercial platforms, e.g. Infochimps, Timetric, Google public data explorer, Amazon Web Services Public Data, Manyeyes by IBM.
  3. Advanced search procedures/searching, e.g. using Google intelligent searching for Filetypes, or performing site searches.
  4. Scraping and APIs, e.g. Scraperwiki, Outwit, Scripts, Yahoo Pipes, Google spreadsheets. These offer “an entry portal to their server so that you can look for the data that you want”, he said.
  5. Direct requests.
  6. Creating your own databases, although this is “a huge amount of work and requires a lot of resources, but you can use the community to help you”, he added.
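As a rough illustration of the scraping idea in point 4, here is a minimal sketch using only Python’s standard library. The HTML is an inline stand-in for a page a real scraper would first download (e.g. with urllib.request), and the supplier names and amounts are invented:

```python
from html.parser import HTMLParser

# Inline stand-in for, say, a council spending page; a real scraper
# would download the page first with urllib.request.urlopen
PAGE = """
<table>
  <tr><td>Supplier A</td><td>1200</td></tr>
  <tr><td>Supplier B</td><td>640</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collect the text of each <td> cell, grouped into table rows."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(PAGE)
payments = [(name, int(amount)) for name, amount in scraper.rows]
```

Dedicated tools such as Scraperwiki or Outwit wrap this pattern up for you; the point is simply that a table on a webpage is already structured data waiting to be extracted.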

Alonso also offered a useful list of what news outlets often look for, and then display, in data: trends, patterns, anomalies, connections, correlations (though it is important not to assume causation), comparisons, hierarchy, localisation and processes.

BetaTales: Can the story of traffic accidents be told in a new way?

BetaTales takes a look at a new project based on traffic accident data from journalists and programmers at Norwegian media house Bergens Tidende.

Accidents are apparently common fare in the Western part of Norway, with frequent news reports of collisions on the region’s narrow, winding roads.

With this in mind, journalists at Bergens Tidende approached the Norwegian Public Roads Administration armed with the Freedom of Information Act, eventually getting access to a database of all road accidents in the country.

The database turned out to be a journalistic goldmine: It contained details about 11,400 traffic accidents all over the country, all neatly arranged in an Excel file. Not only did the database give the exact position of each accident, but it also included numerous details, such as how many were killed and injured, the seriousness of injuries, driving conditions, type of vehicle, type of street, speed limit, time of the day, etc.

Still, most journalists would at this point probably have been happy to take a look at the database, extract some of the relevant accidents and make a couple of news stories based on them. At Bergens Tidende, though, the journalists were instead teamed up with programmers. Within a few weeks all the traffic accidents in the country had been put on a big Google map with endless ways to search the database.

Full story on BetaTales at this link.

“Killing Roads” project from Bergens Tidende at this link (Norwegian).

Bergens Tidende multimedia journalist Lasse Lambrechts talks about “Killing Roads”:

Channel 4 News: Benjamin Cohen’s life torn open by Wired

Benjamin Cohen, technology editor at Channel 4 News, has blogged about the experience of being sent the latest, personalised edition of Wired magazine.

Well, personalised for some. “Opinion formers” around the UK have been sent a copy of Wired, titled “Your life torn open”, with personal information about them splashed over the front cover. Cohen was shocked by the information that they printed – and it is shocking at first. But then, it is all publicly available through Facebook, Twitter, Companies House and the Land Registry.

What’s shocking though is seeing all of this printed in black and white (or yellow in this case). Everything was available from Facebook, Twitter, Companies House and the Land Registry, but it shows the information is so readily available. It also shows how powerful these resources can be for private detectives or government agents.

Read his post in full here…

#cablegate: 7,500 cables tagged ‘PR and Correspondence’ could shed light on media relations

According to WikiLeaks, there are more than 7,500 embassy cables due to be released as part of its latest classified documents leak that have the tag OPRC or “Public Relations and Correspondence”.

Only two with this tag have been published so far – one is a round-up of Turkish media reaction and the other a summary of media reaction to news issues in China, the US and Iran, both sent in 2009.

But it’ll be worth keeping an eye on future cables tagged OPRC for information about diplomats and country leaders’ media relations and communications.

Until the text of these cables is made public, we don’t know just what they contain or how relevant they might be to media outlets. But using the Guardian’s data store of the cables, it is easy to find out how many cables were sent by which embassies during the time period covered by the leak:

The US embassy in Ankara, Turkey is responsible for the largest number of cables tagged OPRC, 1,551, while the American Institute Taiwan in Taipei is behind 1,026 of them. Seventy-five embassies have sent 10 or fewer OPRC-tagged cables.
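A tally like the one above is straightforward once the cable metadata is to hand. This sketch uses Python’s collections.Counter over invented rows standing in for the Guardian’s actual data store:

```python
from collections import Counter

# Invented sample rows standing in for the cable metadata;
# each cable carries an originating embassy and a list of subject tags
cables = [
    {"embassy": "Ankara", "tags": ["OPRC", "TU"]},
    {"embassy": "Ankara", "tags": ["OPRC"]},
    {"embassy": "Taipei", "tags": ["OPRC"]},
    {"embassy": "Beijing", "tags": ["PGOV"]},
]

# Count only the cables tagged OPRC, keyed by embassy
oprc_by_embassy = Counter(
    c["embassy"] for c in cables if "OPRC" in c["tags"]
)
# 'Ankara' -> 2, 'Taipei' -> 1; 'Beijing' never appears
```

The same one-liner scales from four rows to the full quarter-million-cable set; the hard part, as ever, is getting the metadata into a structured form in the first place.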

David Higgerson: ‘Actionable’ news and what it means for data journalism

David Higgerson blogs about the idea of ‘actionable’ news – a phrase that he first heard at last week’s Society of Editors conference from Reuters’ Jodie Ginsberg:

I see actionable news being right at the heart of the idea of data journalism. Information may well be freely available in a way we’ve never seen before, but that doesn’t mean the role of the storyteller has fallen by the wayside. As long as the writer who gets to grips with a spreadsheet of data is also plugged into the community they serve, and knows what they are interested in, then we’ve got actionable news (…) It shouldn’t be a revelation to journalists – after all, newsroom planners have certain data-rich days marked in every year, such as GCSE league tables day. But rather than be dictated to by a government planning calendar, journalists who can marry data access to issues which impact on people’s lives can make their work, and the titles they work for, more relevant to an audience than ever before.

Full post on David Higgerson’s blog at this link…

Guardian: Analysing data is the future for journalists, says Tim Berners-Lee

Speaking in response to recent releases of data by the UK government, Tim Berners-Lee, father of the world wide web, says:

The responsibility needs to be with the press. Journalists need to be data-savvy. It used to be that you would get stories by chatting to people in bars, and it still might be that you’ll do it that way some times.

But now it’s also going to be about poring over data and equipping yourself with the tools to analyse it and picking out what’s interesting. And keeping it in perspective, helping people out by really seeing where it all fits together, and what’s going on in the country.

Agree or disagree?

Full story at this link on Guardian.co.uk…