While the individual newspapers involved in WikiLeaks' latest military document release may be considering lessons for next time, Martin Moore from the Media Standards Trust says all news organisations should be preparing for future waves of data from such sources.
Writing on the PBS MediaShift Idea Lab, he says the 'data dump' process is likely to become an increasingly common method of information release as reporters and sources become more experienced in handling such material.
Soon every news organization will have its own “bunker” — a darkened room where a hand-picked group of reporters hole up with a disk/memory stick/laptop of freshly opened data, some stale pizza and lots of coffee.
He proposes five questions for news outlets to consider so they can process leaked material in the best way for the reader, including how to use public intelligence to generate the most stories from the material, how to personalise data for their specific audiences, and how to ensure transparency and trust in the publication of documents.
The expenses files, the Afghan logs, the COINS database (a massive database of U.K. government spending released last month) are all original documents that can be tagged, referenced and linked to. They enable journalists not only to refer back to the original source material, but to show an unbroken narrative flow from original source to final article. This cements the credibility of the journalism and gives the reader the opportunity to explore the context within the original source material. Plus, if published in linked data, the published article can be directly linked to the original data reference.
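To make that last point concrete, here is a minimal sketch of what "publishing in linked data" could look like, using Python's rdflib library; the article and dataset URLs are hypothetical, and Moore does not prescribe any particular tooling:

from rdflib import Graph, URIRef
from rdflib.namespace import DCTERMS

g = Graph()
# Hypothetical URLs for a published article and the leaked dataset it draws on.
article = URIRef("https://example.com/news/afghan-logs-analysis")
dataset = URIRef("https://example.com/data/afghan-war-logs")

# Assert, machine-readably, that the article is derived from the original data.
g.add((article, DCTERMS.source, dataset))

# Serialise as Turtle, a common linked-data format.
print(g.serialize(format="turtle"))

A single dcterms:source statement of this kind is enough for readers, and for other machines, to follow a published article straight back to the raw documents it was built from.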
He adds that preparation will be key to securing future scoops, as “organizations that become known for handling big data sets will have more whistleblowers coming to them”.
See his full post here…