Tag Archives: Web 2.0

Judge bans use of Twitter in Raoul Moat accomplices case

A judge in the case of two men jailed for life this week for helping gunman Raoul Moat reportedly ruled that Twitter could not be used in court.

The Press Gazette and HoldtheFrontPage are reporting that journalists at ncjMedia Ltd, publishers of Newcastle dailies the Evening Chronicle and The Journal, asked to tweet live updates on the trial and verdicts but had their application rejected.

According to HoldtheFrontPage, the judge in the case, Mr Justice McCombe, refused the application because he believed that the interests of justice would be best served by the production of full, balanced reports.

In December, Britain’s most senior judge issued interim guidance stating that journalists could use Twitter in court, but that permission must be granted by a judge on a case-by-case basis.

The week before the Lord Chief Justice issued the guidance, the district judge overseeing the second bail hearing of WikiLeaks’ founder Julian Assange gave journalists and others explicit permission to tweet the proceedings.

The Lord Chief Justice is due to carry out a consultation on the use of Twitter in court reporting shortly.

The headline to this post originally read: Judge bans Twitter despite Lord Chief Justice’s guidance

From alpha users to a man in Angola: Adventures in crowdsourcing and journalism

Yesterday’s Media Standards Trust data and news sourcing event presented a difficult decision early on: Whether to attend “Crowdsourcing and other innovations in news sourcing” or “Open government data, data mining, and the semantic web”. Both sessions looked good.

I thought about it for a bit and then plumped for crowdsourcing. The Guardian’s Martin Belam, meanwhile, put the choice to his Twitter followers.

Belam may then have defied a 4-0 response in favour of the data session, but the exchange reflects how networks like Twitter encourage journalists – and others – to seek out the opinions or knowledge of crowds: crowds of readers, crowds of followers, crowds of eyewitnesses, statisticians, or anti-government protestors.

Crowdsourcing is nothing new, but tools like Twitter and Quora are changing the way journalists work. And with startups based on crowdsourcing and user-generated content becoming more established, it’s interesting to look at the way that they and other news organisations make use of this amplified door-to-door search for information.

The MST assembled a pretty good team to talk about it: Paul Lewis, special projects editor, the Guardian; Paul Bradshaw, professor of journalism, City University and founder of helpmeinvestigate.com; Turi Munthe, founder, Demotix; and Bella Hurrell, editor, BBC online specials team.

From the G20 protests to an oil field in Angola

Lewis is perhaps best known for his investigation into the death of Ian Tomlinson following the G20 protests, during which he put a call out on Twitter for witnesses to a police officer pushing Tomlinson to the ground. Lewis had only started using the network two days before and was, he recalled, “just starting to learn what a hashtag was”.

“It just seemed like the most remarkable tool to share an investigation … a really rich source of information being chewed over by the people.”

He ended up with around 20 witnesses that he could plot on a map. “Only one of which we found by traditional reporting – which was me taking their details in a notepad on the day”.

“I may have benefited from the prestige of breaking that story, but many people broke that story.”

Later, investigating the death of deportee Jimmy Mubenga aboard an airplane, Lewis again put a call out via Twitter and somehow found a man “in an oil field in Angola, who had been three seats away from the incident”. Lewis had the fellow passenger send a copy of his boarding pass and cross-checked details about the flight with him for verification.

But the pressure of the online, rolling, tweeted and liveblogged news environment is leading some to make compromises when it comes to verifying information, he claimed.

“Some of the old rules are being forgotten in the lure of instantaneous information.”

The secret to successful crowdsourcing

From the investigations of a single reporter to the structural application of crowdsourcing: Paul Bradshaw and Turi Munthe talked about the difficulties of building a group, or running a business, around the idea.

Among them were keeping up interest in long-term investigations and ensuring sufficient diversity within the crowd. As WikiLeaks found in its early days, when it struggled to get the general public to crowdsource the verification and analysis of its huge datasets, it is difficult to get people to engage with large, unwieldy data dumps or with painstaking investigations in which progress can be agonisingly slow.

Drawing on his experience with helpmeinvestigate.com, Bradshaw suggested five qualities of a successful crowdsourced investigation:

1. Alpha users: One or a small group of active, motivated participants.

2. Momentum: Results along the way that will keep participants from becoming frustrated.

3. Modularisation: The investigation can be broken down into small parts, making it easier for people to contribute.

4. Publicness: Publicity via social networks and blogs.

5. Expertise/diversity: A non-homogenous group who can balance the direction and interests of the investigation.

The wisdom of crowds?

The expression “the wisdom of crowds” has a tendency to crop up in crowdsourcing discussions. Establishing just how wise – and how balanced – those crowds really are became an important part of the session. Number 5 on Bradshaw’s list, it seems, can’t be taken for granted.

Bradshaw said that helpmeinvestigate.com had tried to seed expert voices into certain investigations from the beginning, and encouraged people to cross-check and question information, but acknowledged the difficulty of ensuring a balanced crowd.

Munthe reiterated the importance of “alpha-users”, citing a pyramid structure that his citizen photography agency follows, but stressed that crowds would always be partial in some respect.

“For Wikipedia to be better than the Encyclopaedia Britannica, it needs a total demographic. Everybody needs to be involved.”

That won’t happen. But as social networks spring up left, right and centre and, along with the internet itself, become more and more pervasive, knowing how to seek out and filter information from crowds looks set to become an increasingly important part of the journalist’s toolkit.

I want to finish with a particularly good example of Twitter crowdsourcing from last month, in case you missed it.

Local government press officer Dan Slee (@danslee) was sitting with colleagues who said they “didn’t get Twitter”. So instead of explaining, he tweeted the question to his followers. Half an hour later: hey presto, he had a whole heap of different reasons why Twitter is useful.

#ds10: Ultraknowledge – search and visualising the news

Why does search have to produce the same set of results that we always get?

That was one of the opening questions from Andrew Lyons, commercial director of Ultraknowledge (UKn), at the Digital Storytelling conference last week, as he talked delegates through UKn’s work with the Independent.

The Independent’s NewsWall, launched in January, is a new way of organising stories and navigating through them. It provides a “visual documentation” of a topic and what’s happened in that subject area. (Similar efforts are being made by Daylife’s technology and the Guardian’s News Zeitgeist.)

When searched, the wall will return 30 picture-led stories as results, and figures for dwell time on the wall are proving interesting, said Lyons.

The next stage, said Lyons, will be the ability for readers to save a search for a topic to their Facebook page and have it update only when it is relevant to them.

UKn can now start to produce sponsored NewsWalls around events such as the forthcoming World Cup or general election. It will also be opening up the archive of content available through the Independent’s NewsWall from two years to the full 23 years of its history.

UKn has already worked with other publishers to create more intelligent and visually organised search results pages, such as those produced by an initial search on Metro.co.uk.

But the firm wants to take this a step further: helping news organisations build topic pages for breaking news items by intelligently tagging and organising archived work, and, through its latest and yet-to-be-launched project, StoryTriggers, helping journalists and news organisations find new leads and spot breaking news trends.

Sometimes the story that you’re after isn’t on your beat, so how do you find it? But when you’re dealing with news it’s changing, fast – how do you SEO for this? How do you tag it and relate it to what’s happened in the past and what’s happening in the future? (…) We want to be an innovation lab for publishers.
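
The tagging-and-topic-page idea Lyons describes can be illustrated with a minimal sketch. The data model and function names below are hypothetical, written in Python simply to show how archived articles might be fanned out across their tags into candidate topic pages, sorted newest first; it is not UKn’s actual implementation.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Article:
        headline: str
        published: str                     # ISO date, e.g. "2010-03-15"
        tags: frozenset = frozenset()      # editorial tags attached to the piece

    def build_topic_pages(archive):
        """Group archived articles by tag, so each tag becomes a candidate topic page."""
        pages = defaultdict(list)
        for article in archive:
            for tag in article.tags:
                pages[tag].append(article)
        # Newest first, so a breaking-news topic page leads with the latest coverage
        for articles in pages.values():
            articles.sort(key=lambda a: a.published, reverse=True)
        return dict(pages)

    archive = [
        Article("Volcanic ash grounds UK flights", "2010-04-15", frozenset({"travel", "volcano"})),
        Article("Airlines count cost of ash cloud", "2010-04-20", frozenset({"travel", "business"})),
    ]
    for topic, articles in build_topic_pages(archive).items():
        print(topic, "->", [a.headline for a in articles])

In practice this grouping would sit on top of a full-text index and far richer metadata, but the basic move – spread archived stories across their tags and order them by recency – is the same.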

ReadWriteWeb: Wikipedia as a breaking news source

From the ‘process journalism’ session at the SXSW Interactive event in Texas comes a discussion about Wikipedia as a news source. ReadWriteWeb reports:

Just like other news aggregation services, Wikipedia takes many sources and puts them into a central location, but with the added benefit of human curation instead of algorithmic collection.

“There’s no real-time reporting going on in Wikipedia, it’s real-time aggregation,” Pantages [Moka Pantages, WikiMedia communications officer] said.

So the very first level of information vetting, which happens at the reporting level, has already taken place by the time it reaches the site. Then the hundreds or thousands of editors continue to scrutinize the information, discussing edits and potential changes in the back channels. The news we read in our daily newspapers, on the other hand, is curated by only a small number of people. Sure, there is the question of qualification, but many of Wikipedia’s contributors and editors are, themselves, professionals.

Full post at this link…

The truth about funding investigative journalism 2.0

A proper bit of digging by the people at online-only news site Business Insider (read about its background here) has led to Nicholas Carlson’s revelations about Mark Zuckerberg and Facebook and, as the site says, “startling new information” about the company’s early days.

But as BI’s Silicon Valley Insider team revealed, this type of work doesn’t make for a sustainable online publishing business model. In a flurry of tweets, Business Insider editor-in-chief and CEO Henry Blodget explained why (you can view them in a gallery at this link):

It’s important. It’s great. But it is also fantastically expensive and time-consuming.

But the truth is, if we tried to do 3 a day, with our staff, we would DROP DEAD. We’d also go bust. Neither being a happy outcome.

(Hat-tip: The Editorialiste.)

NYC 3.0: Kommons – creating a real-time news platform for specific communities

Vadim Lavrusik takes a look at Kommons, a new project from Cody Brown, a New York University senior and founder of local news blog NYU Local.

Brown is developing a platform for local news communities, combining the strengths of Twitter and Wikipedia, to create a flow of real-time news and updates in response to specific questions and queries from users. Kommons will focus on issues of trustworthiness and identity of contributors and be tested on a version for New York University:

A profile page has a stream of status updates, but anything that is @tagged by a user in a status message becomes its own wiki-like page that users can edit and add information to in real-time. This means anything can be tagged and have user-created information attached to it. The function applies to users that are tagged. A user has a short background profile, but other users can attach information about that person. In a lot of ways, the identity is determined by the community of users, similar to how a bio page functions on Wikipedia.

Full story at this link…
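
To make the @tag-to-page mechanic concrete, here is a minimal sketch in Python. The store, function names and regex are hypothetical, chosen to illustrate the idea rather than reflect Kommons’ actual code: every @tag found in a status update gets, or adds to, its own editable, wiki-like page.

    import re
    from collections import defaultdict

    # Matches @tags such as "@NYU" or "@Cody_Brown" in a status message
    TAG_PATTERN = re.compile(r"@(\w+)")

    # Hypothetical store: each @tag maps to a wiki-like page, i.e. an editable list of entries
    wiki_pages = defaultdict(list)

    def post_status(author, text):
        """Record a status update and attach it to the page of every @tag it mentions."""
        for tag in TAG_PATTERN.findall(text):
            wiki_pages[tag.lower()].append(f"{author}: {text}")

    def edit_page(tag, note):
        """Let any user append community-curated information to a tagged entity's page."""
        wiki_pages[tag.lower()].append(note)

    post_status("alice", "Long queues at @Bobst again this morning")
    edit_page("bobst", "Bobst is NYU's main library, on Washington Square South.")
    print(wiki_pages["bobst"])

The design choice worth noting is that the page is created implicitly by the act of tagging, so the community, not an editor, decides which entities deserve a page.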