A blog post by The Economist from the South by Southwest (SXSW) event in Texas takes a closer look at a much-retweeted tweet from @robinsloan: “The way to cover big news in 2011 is not ‘here’s what happened.’ It’s ‘here’s how to follow the story.’ http://t.co/sMqGOuh”
You might say that you don’t need to be a journalist to cobble together a list of links. But actually, given the huge proliferation of sources these days, you do. Being able to scan a vast range of material, determine what’s reliable, relevant and sufficiently objective, decide what will actually interest your particular readers and arrange it in a way that they can use are not trivial skills.
The Guardian played host to designers, developers and journalists at the weekend for its “Guardian Hacks SXSW” event. (The raw data reveals that there were 82 developers, 12 girls and 12 ‘full beards’, among other things.)
Guardian information architect Martin Belam takes a look at some of the day’s hacks on his blog:
The hack that appeared to draw the most gasps from the assembled journalists in the room, and consequently won, was Articlr, which was presented by Jason Grant. It was a back-end tool for easily monitoring social media and rival coverage of a story in real-time, and then simply dragging-and-dropping elements from external sites into a story package. With a bit of geo-location goodness thrown in. I fully expect the feature request to be on my Guardian desk by about 11am this morning…
Leor Galil reviews AOL’s recent experiment in covering the music portion of SXSW, the massive US festival and conference: AOL offered 2,000 $50 assignments to create coverage of the event for its music site Spinner. The freelancers were recruited via the freelance content site Seed.com, with the aim of covering all 2,000 bands appearing at the festival.
There’s a certain crassness to AOL’s experiment. The very concept places more weight on quantity than quality, and the setup undermines the very ideals and democratic nature of web publishing and blogging. With blogging, most bloggers pour their blood, sweat, tears, time and love into a little blog that may not get a lot of hits: many see zero monetary gain. It’s a labour of love, and the best content (or most creative, etc) tends to rise to the top and get noticed. And, one hopes, those who are able to create some fantastic content on a consistent basis can begin to establish themselves online and perhaps make some money for their hard work.
From the ‘process journalism’ session at the SXSW Interactive event in Texas comes a discussion about Wikipedia as a news source. ReadWriteWeb reports:
Just like other news aggregation services, Wikipedia takes many sources and puts them into a central location, but with the added benefit of human curation instead of algorithmic collection.
“There’s no real-time reporting going on in Wikipedia, it’s real-time aggregation,” Pantages [Moka Pantages, Wikimedia communications officer] said.
So the very first level of information vetting, which happens at the reporting level, has already taken place by the time it reaches the site. Then the hundreds or thousands of editors continue to scrutinize the information, discussing edits and potential changes in the back channels. The news we read in our daily newspapers, on the other hand, is curated by only a small number of people. Surely, there is the question of qualification, but many of Wikipedia’s contributors and editors are, themselves, professionals.
Some great thinkers in media are leading what I’ll call the ‘context movement’: a push toward giving audiences a more satisfying, deeper understanding of the world in which they live, instead of simply presenting ephemeral, episodic stories as journalists always have.
Among them: Matt Thompson, of NPR and formerly of the Knight Foundation; Jay Rosen, author of PressThink and professor at NYU; and Tristan Harris, CEO and founder of Apture.
From a speech by Steven B. Johnson, co-founder of Outside.in, at the South by Southwest festival:
“I think it’s much more instructive to anticipate the future of investigative journalism by looking at the past of technology journalism (…) It is the old-growth forest of the web. It is the sub-genre of news that has had the longest time to evolve.”