Explanations here and here on the openpaths.cc website state:
This data represents a unique opportunity to help solve some of the world’s toughest problems. We believe you should have the option of donating your data in an open, secure fashion, while maintaining control of your information and where it goes.
Research requests are received from any and all projects – public, private, commercial, academic, artistic, or governmental. Requests typically look at specific geographical areas or demographic information about their subjects, so research requests include these criteria. Based on this information, users receive monthly updates that list the projects where their data is a good fit, and are offered the opportunity to donate their data.
In return, we ask researchers to provide a small benefit to their data donors. This might be a custom visualization of a donor’s location information, access to the results of the research, or other related benefits.
When researchers revealed that iPhones had been recording location data, concerns were raised about privacy.
Security researchers discovered that Apple's iPhone keeps track of where you go – and saves every detail of it to a secret file on the device which is then copied to the owner's computer when the two are synchronised.
The file contains the latitude and longitude of the phone’s recorded coordinates along with a timestamp, meaning that anyone who stole the phone or the computer could discover details about the owner’s movements using a simple program.
For some phones, there could be almost a year’s worth of data stored, as the recording of data seems to have started with Apple’s iOS 4 update to the phone’s operating system, released in June 2010.
Apple has now released a software update 4.3.3 to fix this. Anyone who wants to make their data available should hold off installing it.
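The "simple program" is genuinely simple: the location file is a SQLite database (widely reported as consolidated.db), so reading it takes only a few lines. The sketch below is a minimal illustration of the idea, assuming a table with Timestamp, Latitude and Longitude columns like the one the researchers described; it builds a tiny stand-in database with invented coordinates so the snippet is self-contained.

```python
import sqlite3

# Build a tiny stand-in for the location database the researchers described
# (assumed schema: a CellLocation table with Timestamp, Latitude, Longitude).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE CellLocation (Timestamp REAL, Latitude REAL, Longitude REAL)"
)
conn.executemany(
    "INSERT INTO CellLocation VALUES (?, ?, ?)",
    [
        (308000000.0, 51.5074, -0.1278),  # hypothetical London fix
        (308003600.0, 51.4545, -2.5879),  # hypothetical Bristol fix, an hour later
    ],
)

# The "simple program": list every recorded fix in time order,
# reconstructing the owner's movements.
for ts, lat, lon in conn.execute(
    "SELECT Timestamp, Latitude, Longitude FROM CellLocation ORDER BY Timestamp"
):
    print(f"{ts:.0f}: {lat:.4f}, {lon:.4f}")
```

Against a real copy of the file, only the connect line would change; everything else is a plain SQL query.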
If you are reporting on the referendum on the voting system, the Scottish, Welsh and Northern Irish assemblies or from one of the 305 town halls across England and Northern Ireland with local elections, how are you going to present the results?
As a text-only story which reports how many seats have been lost or gained by each party? Or are you going to try visualising the results? Here are five free, easy-to-use tools to liven up the results.
OpenHeatMap is a free, very easy-to-use way to visualise your results on a map. You start by creating a spreadsheet and uploading the data; you can then embed the map in your web page.
A. Go to OpenHeatMap (you don’t need a login);
B. Create a spreadsheet. The easiest way to do this is in Google Docs. You must name your columns so OpenHeatMap can understand them. Use ‘UK_council’ for the local council, ‘tab’ for the party and ‘value’ for the number of seats. In this example, the tab column indicates the party with the most seats; the value is the number of seats;
C. Click ‘share’ (to the right-hand side of your Google Doc), ‘publish as a web page’ and copy the code;
D. Paste the code into OpenHeatMap and click to view the map. In this example you will see the parties as tabs along the top which you can toggle between. You can change the colour, zoom in to your county or region and alter the transparency so you can see place names;
E. Click ‘share’ and you can copy the embed code into your story.
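The column layout in step B can be sketched as a plain CSV. The council names and seat counts below are made-up examples; ‘UK_council’, ‘tab’ and ‘value’ are the headings OpenHeatMap expects.

```python
import csv
import io

# Made-up example rows: each row pairs a council with a party ("tab")
# and the number of seats that party holds ("value").
rows = [
    ("UK_council", "tab", "value"),
    ("Kent", "Conservative", 45),
    ("Kent", "Labour", 12),
    ("Medway", "Conservative", 33),
]

# Write the rows out as CSV text, ready to paste into a spreadsheet.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
print(buf.getvalue())
```

In practice you would simply type these columns into Google Docs; the sketch just makes the expected shape of the data explicit.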
Anyone can now join Storify (it used to be by invitation only). It allows you to tell a story using a combination of text, pictures, tweets, audio and video.
A. Sign up to Storify;
B. Create a story and start adding content. If you click on the Twitter icon and search (say for ‘local election Kent’) you can select appropriate tweets; if you click on the Flickr icon you can find photos (you could ask a photographer to upload some); you can also add YouTube videos and content from Facebook. When you find an item you want to include, you simply drag and drop it into your story;
C. The art of a good Storify story is to use your skills as a storyteller. The tweets and photos need to be part of a narrative. There are some fantastic examples of story ideas on Storify;
D. Click to publish;
E. Copy and paste the embed code into the story on your site.
C. The video will be automatically posted live to your Qik profile but you’ll need to add the code to your website before you record (you can also live stream to your Facebook page, Twitter account and YouTube channel).
D. To do this go to ‘My Live Channel’ (under your name). Click on it to get your embed code for your live channel.
E. Paste your embed code in your website or blog, where you want the live player to be.
How did you get on with the five tools? Let us know so that we can see your election stories.
Every journalist needs to know about data. It is not just the preserve of the investigative journalist but can – and should – be used by reporters writing for local papers, magazines, the consumer and trade press and for online publications.
Think about crime statistics, government spending, bin collections, hospital infections and missing kittens and tell me data journalism is not relevant to your title.
If you think you need to be a hacker as well as a hack then you are wrong. Although data journalism combines journalism, research, statistics and programming, you may dabble but you do not need to know much maths or code to get started. It can be as simple as copying and pasting data from an Excel spreadsheet.
You can find out more about getting started and trying your hand at complex data journalism at news:rewired – noise to signal, on 27 May. More details about the event are here and you can order tickets, which cost £156 including VAT, by clicking here.
Here are 10 reasons to give data a go.
1. Everybody loves a list. Did you click on this post as you wanted an easy-to-read list rather than an involved article?
2. Everybody loves a map. Try Quantum GIS (QGIS), a free, open-source tool, or OpenHeatMap, a fantastic, easy-to-use tool as long as your data is categorised by country, local authority, constituency, region or county.
4. Data may need cleaning up. Try using clean-up tools like ScraperWiki, which helps non-technical journalists copy a few lines of code to turn a document such as a PDF into a number-friendly file like a CSV, and Google Refine, which Paul Bradshaw has written some useful posts on over on the Online Journalism Blog.
5. Data of all sorts is increasingly available. The open data movement across the UK is resulting in an increase in the release of data. The possibilities are huge, says Paul Bradshaw on the Guardian’s Datablog. January 2010 saw the launch of data.gov.uk, a fantastic resource for searching for datasets.
6. Data journalism can answer questions. A good place to start in data journalism is to ask a question and answer it by gathering data. Numbers work well. One option is to submit a Freedom of Information request to ask for the numbers. It helps if you ask for a CSV file.
7. You can use the crowd. Crowdsource by asking a question on Twitter or by using a site like Help Me Investigate, an open-source tool people can use to collaborate on investigating questions in the public interest.
8. Data can be personal to every reader. DocumentCloud can highlight and annotate documents to help readers see what is important and learn a document’s back story.
9. “Data journalism is not always presenting the data as journalism. It’s also finding the journalism within the data,” Jay Rosen said in relation to this article on Poynter on how two journalists from the Las Vegas Sun spent two years looking at 2.9 million documents to find out “what’s right, and wrong, about our local health care delivery system”. The result was that the journalists exposed thousands of preventable medical mistakes in Las Vegas hospitals. The Nevada legislature responded with six pieces of legislation.
10. “Data ethics is just as important as ethics in journalism, in fact they are one and the same,” according to this post on Open Data Wire. Consider the BBC’s FoI request which showed a 43 per cent rise in GPs signing prescriptions for antidepressants and the ethics of unquestioningly relating this to the recession. Ben Goldacre has highlighted the problems with seeing patterns in data.
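Point 6 – asking a question and answering it with numbers – can be as small as a few lines of code once an FoI response arrives as a CSV. Below is a minimal sketch with a hypothetical prescriptions-by-year layout and made-up figures (chosen to produce a rise like the 43 per cent one mentioned above).

```python
import csv
import io

# Hypothetical FoI response: antidepressant prescriptions by year.
# These numbers are invented for illustration.
foi_csv = """year,prescriptions
2007,1000000
2010,1430000
"""

# Read the CSV into a table and compute the percentage rise.
rows = list(csv.DictReader(io.StringIO(foi_csv)))
first = int(rows[0]["prescriptions"])
last = int(rows[-1]["prescriptions"])
rise = (last - first) / first * 100

print(f"Rise from {rows[0]['year']} to {rows[-1]['year']}: {rise:.0f}%")
# → Rise from 2007 to 2010: 43%
```

The point is not the arithmetic but the habit: ask for the data in a machine-readable format, and the answer to your question is a few lines away.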
This is a cross post originally published on the news:rewired website. You can get your tickets here.
Nieman Journalism Lab’s Justin Ellis has written an interesting post on the development of Associated Press’ interactive output, which has nearly doubled over the past two years.
Among other things, Ellis touches on the work of the AP Interactive department covering breaking news stories with graphics:
The trick in being able to roll out these features so quickly (and likely another reason the department has increased its output) is the usage of templates, Nessa said. That basic form allows the artists, programmers, and others on staff to publish graphics quickly — and to continuously update them as more information comes in from reporters. That’s why when events like Japan’s earthquake and subsequent tsunami hit, you could find not only breaking reports from the AP, in text, but also incredible photography and interactive graphics that harnessed reporting from correspondents as well as accounts and images from on-the-ground witnesses.
Interactives, graphics and visualisation are among a range of essential topics for modern journalists that will be covered at Journalism.co.uk’s upcoming news:rewired conference. See the full agenda at this link.
The New York Times has one of the largest, most advanced graphics teams of any national newspaper in the world. Yesterday at the International Journalism Festival, NYT deputy graphics editor Matthew Ericson led an in-depth two-hour workshop on his team’s approach to visualising some of the data that flows through the paper’s stories every day.
He broke the team’s strategy down into a few key objectives, the four main ones being:
Provide context
Describe processes
Reveal patterns
Explain the geography
Here is some of what Ericson told the audience and some of the examples he gave during the session, broken down under the different headers.
Graphics should bring something new to the story, not just repeat the information in the lede.
Ericson emphasised that a graphics team which simply illustrates what the reporter has already told the audience is not doing its job properly. “A graphic can bring together a variety of stories and provide context,” he said, citing his team’s work on the Fukushima nuclear crisis.
The Fukushima nuclear crisis has spurred a lot of graphics work at news organisations across the world, and Ericson showed a few different examples of work on the situation to the #ijf11 audience. Another graphic demonstrated the process of a nuclear meltdown, and what exactly was happening at the Fukushima plant.
As we approach stories, we are not interested in a graphic showing how a standard nuclear reactor works, we want to show what is particular to a situation and what will help a reader understand this particular new story.
Like saying: “You’ve been reading about these fuel rods all over the news, this is what they actually look like and how they work”.
From nuclear meltdown to dancing. A very different graphic under the ‘describe processes’ umbrella neatly demonstrated that graphics work is not just for mapping and data.
Dissecting a Dance broke down a signature piece by US choreographer Merce Cunningham in order to explain his style.
The NYT dance critic narrated the video, over which simple outlines were overlaid at stages to demonstrate what he was saying. See the full video at this link.
This is perhaps the objective most associated with data visualisation, taking a dataset and revealing the patterns that may tell us a story: crime is going up here, population density down there, immigration changing over time, etc.
These so-called narrative graphics take what we think of as visualisation close to what we have been seeing for a while in broadcast news bulletins.
The final main objective was to show the audience the geographical element of stories.
Examples for this section included mapping the flooding of New Orleans following hurricane Katrina, including showing what parts of the region were below sea level and overlaying population density, showing where levies had broken and showing what parts of the land were underwater.
It doesn’t take a skilled mathematician to work that one out. But don’t be fooled by its simplicity: it underpinned a key message to take away from the workshop. The message is equally simple: graphics and data teams have the skill to make sense of data for their audience, and throwing a ton of data online without adding analysis and extracting a story is not the right way to go about it.
More from Matthew Ericson on the NYT graphics team
I spoke to Ericson after the session about what kind of people make up his team (it includes cartographers!) and how they go about working on a story.
Here’s what he had to say:
The BBC’s Peter Horrocks on data journalism
I spoke to Peter Horrocks, who is director of the BBC World Service and the BBC’s global online news operations after the session about his take on data journalism and whether the BBC Global News had ambitions in this direction.
There has been a range of session formats at #media140, from in-depth keynote speeches and discussion roundtables, to more jam-packed workshops showcasing some of the latest tools in social technology.
Today I attended one of the latter, a session on visualising data by Spanish design house Bestiario.
While it was, in a way, a whirlwind tour of the company’s information processing platform Impure, delegates managed to get a great overview of what it can produce (on my part only with thanks to my translator!)
The focus of the session was not the written story, but simply visualisation: telling the story with infographics using, in essence, a drag-and-drop technique.
For a more detailed explanation of how to use the tool you can visit the site itself, but in simple terms the platform enables journalists to create data visualisation projects.
Users can import data files (CSV), convert them into a table, pull out specific fields, create different data structures and select from a range of visualisation formats based on the data they are working with.
The final visualisations are publicly published on Impure, and users can also embed the infographics on their own site.
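In code terms, the workflow the session demonstrated – import a CSV, treat it as a table, pull out specific fields, and reshape the data for a visualisation – looks something like the sketch below. This is a generic Python illustration of those steps, not Impure itself, which performs them all by drag and drop; the data is invented.

```python
import csv
import io

# Invented CSV, standing in for a file a user would import into the platform.
raw = """country,year,population
Spain,2010,46.5
Spain,2011,46.7
UK,2010,62.3
"""

# Step 1: import the CSV and convert it into a table (a list of dicts).
table = list(csv.DictReader(io.StringIO(raw)))

# Step 2: pull out a specific field.
countries = [row["country"] for row in table]

# Step 3: build a different data structure suited to a chart --
# here, population keyed by (country, year).
series = {
    (row["country"], row["year"]): float(row["population"]) for row in table
}
print(series[("Spain", "2011")])
```

Each stage corresponds to a module a user wires together on the Impure canvas; the final stage there would be a visualisation module rather than a print statement.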
At the moment the application is free to use, and the company says there will always be “an open version”, in order to build and maintain a community.
As data visualisation has come into the zeitgeist, and we have started using it more regularly in our story-telling, journalists and designers on the specials team have become much more proficient at using basic spreadsheet applications like Excel or Google Docs. We’ve boosted these and other skills through in house training or external summer schools and conferences.
Journalist and computer scientist Jonathan Stray has posted an interesting breakdown of what he calls “computational journalism”, a kind of parent term for data journalism, visualisation, computational linguistics, communications technology, filtering, research and more.
I’d like to propose a working definition of computational journalism as the application of computer science to the problems of public information, knowledge, and belief, by practitioners who see their mission as outside of both commerce and government. This includes the journalistic mainstay of “reporting” — because information not published is information not known — but my definition is intentionally much broader than that.
Open University lecturer, self-proclaimed mashup artist and all-round bright spark Tony Hirst blogs about a new Q&A site designed to help people with open data questions.
GetTheData.org is in “startup/bootstrapping” phase at the moment but already has a fair bit of information up.
The idea behind the site is to field questions and answers relating to the practicalities of working with public open data: from discovering data sets, to combining data from different sources in appropriate ways, getting data into formats you can happily work with, or that will play nicely with visualisation or analysis tools you already have, and so on.