
Essential journalism links for students

This list is doing the rounds under the headline 100 Best Blogs for Journalism Students… and we’re not on it. Nope, not even a smidgeon of link-love for poor old Journalism.co.uk there.

The BachelorsDegreeOnline site appears to be part of e-Learners.com, but it’s not clear who put the list together. Despite the omission of our content and some rather odd descriptions (e.g. Adrian Monck: ‘Adrian Monck writes this blog about how we inform ourselves and why we do it’), we admit it is a pretty comprehensive list: excellent people and organisations we feature on the site, in our blogroll and in our Best of Blogs mix – including many UK-based ones. There were also some we hadn’t come across before.

In true web 2.0 self-promotional style, here are our own links which any future list-compilers might like to consider as helpful links for journalism students:

And here are some blogs/sites also left off the list which immediately spring to mind as important reading for any (particularly UK-based) journalism students:

Organisations

  • Crikey.com: news from down under that’s not Murdoch- or Fairfax-produced.
  • Press Review Blog (a Media Standards Trust project) – it’s a newbie, but already in the favourites.
  • StinkyJournalism: it’s passionate and has produced many high-profile stories.

Individuals

  • CurryBet – Martin Belam’s links are canny and provocative, and break down the division between tech and journalism.
  • Malcolm Coles – for SEO tips and off-the-beaten-track spottings.
  • Dave Lee – facilitating conversations journalists could never have had in the days before blogs.
  • Marc Vallee – photography freedom issues from the protest frontline.
  • FleetStreetBlues – an anonymous industry insider with jobs, witty titbits and a healthy dose of online cynicism.
  • Sarah Hartley – previously as above, now with more online strategy thrown in.
  • Charles Arthur – for lively debate on PR strategy, among other things.

Writing this has only brought home further the realisation that omissions are par for the course in list-compilation, but it does inspire us to do our own 101 essential links for global online journalists – trainees or otherwise. We’d also like to make our list inclusive of material that is useful for, but not necessarily about, journalists: MySociety, for example.

Add suggestions below, via @journalismnews or drop judith at journalism.co.uk an email.

How Demotix’s contributors have covered Iran election protests

A quick update on the work of the pro-am photo agency and news site Demotix during this week’s election protests in Iran.

  • On Wednesday Demotix reported that one of its contributors had been arrested. Andy Heath, the site’s commissioning editor, told Journalism.co.uk that it is believed the contributor will appear in front of a judge tomorrow [Saturday] and that Demotix is currently seeking more information.

Turi Munthe, its CEO and founder, has made numerous media appearances talking about the use of citizen media during the protests, including on the BBC Radio 4 Today programme, BBC News and the World Service. Reuters is also featuring Demotix content.

Munthe said: “In terms of sales, we have also hit a milestone. Reuters is syndicating our content all over the world. Yesterday [Wednesday] we were the lead image on the front page of the Wall Street Journal’s website (see below).”

“Iran is experiencing events not seen since the 1979 Revolution. Demotix was set up precisely to cover and report this kind of event, and we have been at the very centre of the storm.”

[Screenshot: Demotix image leading the front page of the Wall Street Journal’s website]

Let the expenses data war commence: Telegraph begins its document drip feed

Andy Dickinson from the Department of Journalism at UCLAN sums up today’s announcement in this tweet: ‘Telegraph to drip-publish MP expenses online’.

[Update #1: Editor of Telegraph.co.uk, Marcus Warren, responded like this: ‘Drip-publish? The whole cabinet at once….that’s a minor flood, I think’]

Yes, let the data war commence. The Guardian yesterday released its ‘major crowdsourcing tool’, as reported by Journalism.co.uk at this link. As described by one of its developers, Simon Willison, on his own blog, the Guardian is ‘crowdsourcing the analysis of the 700,000+ scanned [official] MP expenses documents’. It’s the Guardian’s ‘first live Django-powered application’. It’s also the first time the news site has hosted something on Amazon EC2, he says. Within 90 minutes of launch, 1,700 users had ‘audited’ its data, reported the editor of Guardian.co.uk, Janine Gibson.
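Out of curiosity, here is a rough sketch of how a crowdsourced document-review app of this sort might be modelled in Django. The Guardian has not published its schema, so every name below is invented for illustration:

```python
# Hypothetical Django models for a crowdsourced expenses-review app.
# Purely illustrative: the Guardian has not published its actual schema.
from django.db import models

class ExpensePage(models.Model):
    """One scanned page of an MP's expenses claim."""
    mp_name = models.CharField(max_length=100)
    scan_url = models.URLField()  # where the scanned image lives

class Review(models.Model):
    """A single reader's verdict on one scanned page."""
    VERDICTS = [
        ("boring", "Not interesting"),
        ("interesting", "Interesting"),
        ("investigate", "Investigate this!"),
    ]
    page = models.ForeignKey(ExpensePage, on_delete=models.CASCADE)
    verdict = models.CharField(max_length=12, choices=VERDICTS)
    created = models.DateTimeField(auto_now_add=True)
```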

The Telegraph was keeping mum, save a few teasing tweets from Telegraph.co.uk editor Marcus Warren. A version of its ‘uncensored’ data was coming, but the paper would not say what, or how much.

Now we know a bit more. As well as publishing the data in a supplement with Saturday’s newspaper, the Telegraph will gradually release the information online. So far, copies of claim forms have been published using Issuu software, underneath each cabinet member’s name. See David Miliband’s 2005-6 expenses here, for example. From the Telegraph’s announcement:

  • “Complete records of expense claims made by every Cabinet minister have been published by The Telegraph for the first time.”
  • “In the coming weeks the expense claims of every MP, searchable by name and constituency, will be published on this website.”
  • “There will be weekly releases region by region and a full schedule will be published on Tuesday.”
  • “Tomorrow [Saturday], the Daily Telegraph will publish a comprehensive 68-page supplement setting out a summary of the claims of every sitting MP.”

Details of what is and isn’t included in the official data are at this link. “Sensitive information, such as precise home addresses, phone numbers and bank account details, has been removed from the files by the Telegraph’s expenses investigation team,” the Telegraph reports.

So who is winning in the data wars? Here’s what Paul Bradshaw had to say earlier this morning:

“We may see more stories, we may see interesting mashups, and this will give The Guardian an edge over the newspaper that bought the unredacted data – The Telegraph. When – or if – they release their data online, you can only hope the two sets of data will be easy to merge.”

Update #2: Finally, Martin Belam’s post on open and closed journalism (published Thursday 18th) ended like this:

“I think the Telegraph’s bunkered attitude to their scoop, and their insistence that they alone determined what was ‘in the public interest’ from the documents is a marked contrast to the approach taken by The Guardian. The Telegraph are physically publishing a selection of their data on Saturday, but there is, as yet, no sign of it being made online in machine readable format.

“Both are news organisations passionately committed to what they do, and both have a strategy that they believe will deliver their digital future. As I say, I have a massive admiration for the scoop that The Telegraph pulled off, and I’m a strong believer in media plurality. As we endlessly debate ‘the future of news™’ I think both approaches have a role to play in our media landscape. I don’t expect this to be the last time we end up debating the pros and cons of the ‘closed’ and ‘open’ approaches to data driven journalism.”

It has provoked an interesting comment from Ian Douglas, the Telegraph’s head of digital production.

“I think you’re missing the fundamental difference in source material. No publisher would have released the completely unredacted scans for crowdsourced investigation, there was far too much on there that could never be considered as being in the public interest and could be damaging to private individuals (contact details of people who work for the MPs, for example, or suppliers). The Guardian, good as their project is, is working solely with government-approved information.”

“Perhaps you’ll change your mind when you see the cabinet expenses in full on the Telegraph website today [Friday], and other resources to come.”

Related Journalism.co.uk links:

Beatblogging.org: Not-for-profit MinnPost news site raises $13,190 in micro-donations

MinnPost, a non-profit journalism organisation that publishes MinnPost.com, has raised $13,190 after calling on users and supporters to make micro-donations.

While a significant chunk – the remaining $6,595 – was donated by the Harnisch Foundation, MinnPost users themselves gave $6,595, with more opting for the suggested $25 donation than the $10 one.

This isn’t a long-term business model (and MinnPost isn’t setting it up as such), but is this a sign that readers will pay for hyperlocal news?

Full post at this link…

Reportr.net: Orato.com ‘turns its back’ on citizen journalism

Alfred Hermida reports that Orato.com has ‘turned its back’ on citizen journalism with a move to more professionalised content.

“Vancouver-based Orato.com used to describe itself as the ‘only news site in the world dedicated to First Person, citizen-authored journalism.'”

Now, however, changes have been made to ‘further professionalise the site, focus its newsworthy content, create and enforce a viable business model and keep pace with Web 2.0 standards,’ says Orato’s founder, Sam Yehia.

Full post at this link…

Radical Press news site merges with Pits ‘n’ Pots

Radical Press, the independent news site for Staffordshire (featured in this Journalism.co.uk interview with founder Matthew Taylor), has merged with neighbouring local news site Pits ‘n’ Pots to offer more multimedia coverage.

“The Radical Press following has been growing and growing, but running things alone meant that my time was stretched and expanding the site was difficult,” Taylor, who set up RP in April last year, told Journalism.co.uk.

“But with the combination of skills we now have through the amalgamation – the broadcast, photography and technical abilities of the other partners, tied with my print journalism – we have a real opportunity now to create a genuine alternative news site for Stoke-on-Trent.

“The nature of the platform we have now also allows readers to make comments, which means that it is also a forum for debate.”

The new site aims to attract around 3,000 unique visitors per day, generating around 10,000 page views, according to a release.

BBC trials News Radar project – how to see the ‘whole’ BBC news site

An interesting first project prototype from the BBC’s journalism labs – the BBC News Radar: a way to track all BBC news stories as they are published or updated online.

The tagline? “Monitoring low-flying news since 1998”

As Jake MacMullin points out in a blog post introducing the radar:

“One reason you may not be aware of how frequently we publish new content is that until now there has been no one place you can go to see all of the stories we publish on the news site. You can always monitor the front page or subscribe to RSS feeds of each of the various indexes you might be interested in – but you’ll only see a small sub-set of all of the stories we publish.”

Updates to already-published stories are tagged in blue and, in a Twitter timeline/Tweme style, the automatically updating Radar page shows how long ago the new copy was published:

[Screenshot: BBC News Radar]

An RSS feed would be nice – we’ll leave that as feedback; it’s in beta, after all – but it’s a great way of seeing just how much news is being published in real time.
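In the same spirit, here is a minimal Python sketch of the radar idea – polling the public BBC News RSS feed and flagging new or updated stories with their age. This is our guess at the approach, not the BBC’s actual implementation:

```python
# Minimal "news radar" sketch: poll an RSS feed and flag new/updated
# stories with how long ago they changed. Illustrative only.
import time
import feedparser  # third-party: pip install feedparser

FEED_URL = "http://feeds.bbci.co.uk/news/rss.xml"  # public BBC News feed

def radar(poll_seconds=60):
    seen = {}  # story link -> last-known update timestamp
    while True:
        for entry in feedparser.parse(FEED_URL).entries:
            stamp = time.mktime(entry.updated_parsed)
            if seen.get(entry.link) != stamp:
                label = "UPDATE" if entry.link in seen else "NEW"
                age = int(time.time() - stamp)
                print("%-6s %5ds ago  %s" % (label, age, entry.title))
                seen[entry.link] = stamp
        time.sleep(poll_seconds)

if __name__ == "__main__":
    radar()
```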

How to publish Twitter streams on news sites?

As covered earlier on this blog, there are various tools for tracking and engaging in conversations on Twitter, especially where hashtags are used. But how do you publish a themed Twitter stream on your news site or blog, and what other issues are there to consider?

We have experimented with various tools on this blog in order to stream hashtag-themed tweets (posts on Twitter) into a blog post. The last attempt used a heavily modified WordPress plugin from Monittor. None have been completely satisfactory.
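For anyone who wants to experiment, this is the kind of fetch such tools perform under the hood – a minimal Python sketch against the Twitter Search API of the day (search.twitter.com). The endpoint and JSON field names are as we understand them and may well change:

```python
# Minimal sketch: fetch recent tweets for a hashtag via the Twitter
# Search API as it currently stands. Field names may change.
import json
import urllib.request
from urllib.parse import quote

def fetch_hashtag(tag, since_id=0):
    url = ("http://search.twitter.com/search.json?q=%s&since_id=%d&rpp=20"
           % (quote("#" + tag), since_id))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # each result carries 'id', 'from_user', 'text' and 'created_at'
    return data.get("results", [])

for tweet in fetch_hashtag("uksnow"):
    print("@%s: %s" % (tweet["from_user"], tweet["text"]))
```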

But why would journalists want to do this? Well, imagine there is an event on your local beat, like a football match or other sports fixture. People are already Twittering from these events. If they could be persuaded to use the same hashtag, you would have the potential to create a live Twitter stream on your website – a live commentary, but from the point of view of several fans, not just one reporter.

Similarly, it could be used to cover breaking news events, basing the Twitter stream on keywords, rather than a hashtag.

For this to work really well, though, we decided several functions needed to be in place:

1. The ability to place a stream of tweets, based on keyword(s) and/or hashtag(s), onto a web page, and for that stream to update dynamically (i.e. without requiring a page refresh). Ideally the output would be pulled into <div> tags rather than added via a JavaScript insert, to cope with content management systems that reject JS in article bodies (see the sketch after this list).

2. Access to legacy Tweets using pagination. The current tools we use only display the last 10 or so Tweets, with no access on our pages to what has been Tweeted before.

3. The ability for administrators to tag certain Tweets within a themed stream and create a new output on another page. The purpose of this is to allow an editor to easily create a summary of the best Tweets for archive purposes.

4. The ability for moderators to manually exclude certain Tweets from a Twitter stream (for moderation purposes).

5. The ability for users to log in and post directly to a Twitter stream, from the page on which that Twitter stream is published.

6. Threading based on @replies (probably the most complex proposition in this list).
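To make requirements 1 and 4 concrete, here is a minimal sketch of the server side: it filters out moderator-excluded tweets and renders an HTML fragment that a page can poll into a <div> without a refresh. Every name here is hypothetical – one way it could work, not how SocialPlume works:

```python
# Hypothetical sketch of requirements 1 and 4: render a moderated tweet
# stream as an HTML fragment for a page to poll into a <div>.
import html

EXCLUDED_IDS = set()  # tweet IDs a moderator has pulled (requirement 4)

def render_stream(tweets):
    """Return an HTML fragment listing approved tweets."""
    items = []
    for t in tweets:
        if t["id"] in EXCLUDED_IDS:
            continue  # moderated out of the stream
        items.append('<li class="tweet">@%s: %s</li>'
                     % (html.escape(t["from_user"]), html.escape(t["text"])))
    return "<ul>%s</ul>" % "".join(items)

# The page would poll this fragment into a <div> every few seconds
# (e.g. setInterval + XMLHttpRequest), so no JS needs to live in the
# article body itself (requirement 1).
```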

There did not seem to be any existing tools that covered even half these bases, so we put out a call on a local developers’ email list. Amazingly, it transpired that a local company in Brighton, Inuda, is currently working on a tool that will eventually tick almost all of the above boxes.

Called SocialPlume, the product aims eventually to become a modestly priced subscription service. Jonathan Markwell of Inuda was keen to stress that they are still some way off a public launch, but in the meantime they would like to hear from publishers and journalists interested in trialling the service alongside us. DM @journalismnews or @johncthompson if you are interested.

We would also love to hear other ideas and applications for this service that you might have (please leave a comment).