Tag Archives: European Journalism Centre

European Journalism Centre to offer grants for innovative development reporting

Image: euro notes, by CoreMedia Product on Flickr. Some rights reserved.

Journalists can now apply for grants to help them produce innovative and in-depth coverage of “issues related to global development and the United Nations’ Millennium Development Goals”.

According to a release from the European Journalism Centre, it recently received funding from the Bill & Melinda Gates Foundation, which will enable it to offer up to 30 grants this year.

The press release adds:

The Centre will provide a selection of innovative reporting projects with the necessary funds to enable journalists, editors, and development stakeholders to perform thorough research and to develop entirely new and experimental reporting and presentation methods.

They will also be able to use multi-platform approaches and to think laterally across disciplines and techniques of journalistic storytelling.

Grant applications should request at least €8,000, with the centre expecting to award grants of €20,000 on average. The deadline for applications is 15 March. More details on how to apply are on the grants website.

Hannah McLean, community manager at the European Journalism Centre, said it is “looking for new ways of reporting that break outside the lines of the usual story”.

We encourage applicants to use multi-platform approaches and to think laterally across disciplines and techniques of journalistic storytelling.

We want applicants to experiment with new reporting and presentation methods. One of the ways they could accomplish this would be through digital storytelling.

#MozFest – First draft of new Data Journalism Handbook written in 48 hours

The first draft of a handbook to help journalists deal with data was created over the weekend, with plans for it to be published next week.

You can read the table of contents of the Data Journalism Handbook here.

The book was written in 48 hours at the Mozilla Festival in London, with contributions from 55 people, including staff from the BBC, the Guardian and the New York Times. It has six chapters and 20,000 words, and is a response to a challenge set by Mozilla, the non-profit technology organisation, to “assemble a utility belt for data-driven journalists”.

The challenge stated:

There’s increasing pressure on journalists to drive news stories and visualisations from data. But where do you start? What skills are needed to do data-driven journalism well? What’s missing from existing tools and documentation? Put together a user-friendly handbook for finding, cleaning, sorting, creating, and visualising data — all in service of powerful stories and reporting.
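
To give a flavour of the “cleaning and sorting” work the challenge mentions, here is a minimal Python sketch in which messy, hand-entered figures are normalised before analysis. The council names and spend figures are invented for illustration; none of this comes from the handbook itself.

```python
# A minimal data-cleaning sketch (pandas). Sample records are
# invented for illustration; nothing here is from the handbook.
import pandas as pd

raw = pd.DataFrame({
    "council": ["  Hackney", "Camden ", "hackney"],
    "spend":   ["£1,200", "950", "£1,050.50"],
})

clean = raw.assign(
    # unify inconsistently typed names
    council=raw["council"].str.strip().str.title(),
    # strip currency symbols and separators, then coerce to numbers
    spend=(raw["spend"]
           .str.replace("£", "", regex=False)
           .str.replace(",", "", regex=False)
           .astype(float)),
).sort_values(["council", "spend"])

print(clean)
```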

Jonathan Gray of the Open Knowledge Foundation and Liliana Bounegru of the European Journalism Centre hosted sessions at the Mozilla Festival to create the handbook.

A blog post written by Gray lists some of the contributors.

Interested in getting started in data journalism? Kevin Anderson is leading a one-day introductory data journalism training course for Journalism.co.uk in January 2012.

EJC taking responses for data-driven journalism survey

The European Journalism Centre is still collecting responses to its data-driven journalism survey, which will help to inform a future series of training sessions.

The survey, which is being run in collaboration with Mirko Lorenz of Deutsche Welle, features 16 questions asking respondents for their opinion on data journalism, aspects of working with data in their newsrooms and what they are interested in learning more about.

Increasingly, governments, international agencies and organisations such as the Organisation for Economic Co-Operation and Development (OECD) and the World Bank, are publishing online collections of freely available public data. Developing the know-how to use the available data more effectively, to understand it, to communicate and generate stories based on it by using free and open tools for data analysis and visualisation, could be a huge opportunity to breathe new life into journalism. The aim of this survey is to gather the opinion of journalists on this emerging field and understand what the training needs are.
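
To make the opportunity concrete, here is a minimal Python sketch of the workflow the survey describes: a small table of openly published indicator data is turned into a derived figure a reporter could build a story on. The numbers below are made up for illustration, not real OECD or World Bank data.

```python
# A minimal analysis sketch (pandas). Figures are invented for
# illustration, not real OECD or World Bank data.
import pandas as pd

rows = [
    {"country": "A", "year": 2009, "broadband_per_100": 22.1},
    {"country": "A", "year": 2010, "broadband_per_100": 24.8},
    {"country": "B", "year": 2009, "broadband_per_100": 15.3},
    {"country": "B", "year": 2010, "broadband_per_100": 19.0},
]
df = pd.DataFrame(rows)

# Year-on-year change per country: the kind of derived figure
# that often anchors a data-driven story
change = (
    df.sort_values("year")
      .groupby("country")["broadband_per_100"]
      .agg(lambda s: s.iloc[-1] - s.iloc[0])
)
print(change)
```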

You can find the survey here; one participating journalist will be awarded a €100 Amazon voucher.

New Brussels-based WikiLeaks spin-off to target EU

This article was first published by the European Journalism Centre and is reposted here with permission.

WikiLeaks spin-off Brussels Leaks launched out of the blue last Thursday to much excitement in the European capital and Twittersphere beyond.

The European Journalism Centre’s exclusive email interview with an anonymous representative is among the first media contacts with the fledgling European whistleblower organisation.

Image by quinn.anya on Flickr. Some rights reserved.

Why did you feel the need to set up a Brussels/EU focused Wikileaks spin-off? What do you want to achieve?

We have all worked in Brussels for a while and have constantly seen, or heard about, documents floating around which ‘would be great if they could get out in the open’. People didn’t know how to do this most of the time. In our day jobs we did this, using our networks and contacts, but there were a lot of limits. Having personal connections with ‘people in the know’ means their jobs could be on the line if we revealed the information.

Brussels has more than 15,000 lobbyists attempting to have an impact on international decisions. It’s naïve to think things do not happen behind closed doors (such as European Commission President Barroso attending a plastics lobby dinner – weird?).

This isn’t really for the media so much as to help society, particularly civil society, get their hands on the right information to make their jobs easier.

What do you plan to focus on?

Obviously it’s EU focused which is as broad as you can get. At the moment we try the best with what we get, but obviously anything social or environmental takes priority. We’ll see.

Can you give us a clue as to what leaks, if any, you have in the pipeline?

Transport and energy.

What kind of people have/will leak information on the EU to you? What are their motives?

We meet people all the time working for EU institutions, lobby and industry groups and even NGOs who want to get information out there. They’re often good people who see something they know is wrong, and want to get it known whilst keeping hold of their jobs.

Do you have any direct connections/contact with WikiLeaks? Have they or similar whistleblowing/hacker organisations been in contact with you, or given you advice or assistance?

No, not yet but we are very open to advice and assistance.

What has been the response so far to Brussels Leaks from the institutions/organisations you plan to ‘leak’ information about?

Very quiet publicly, but we have heard they have at least half an eye on us.

How do your security and technical capabilities match up to the organisations who may try to stop you?

At the moment, it’s hard to tell. We’re not really anticipating in the short-term anything which would put us under the kind of pressure WikiLeaks witnessed, as many of the leaks we have so far are quite low key. This is Brussels after all. Of course we want to build, improve and develop over time – we have a plan and we won’t overstep our capacities.

Is there anything you would not publish?

We are a small group of people who will try to work to a moral code. We’re not interested in gossip or slander. We are doing this because we want to get important information out in the open, but if it looks to endanger somebody, i.e. lives or jobs, then we will not. We also have high level media contacts outside of this who we can refer leaks onto. We’re not here to get publicity, just to get the information out there.

Are any of you journalists?

Yes, all are either journalists or working in communications capacities in Brussels.

What is your code of ethics?

Obviously as we are staying anonymous we need to build credibility and a reputation. We will always be truthful, accurate, and fair and want to hold everything up to public accountability.

What can people do to get involved with Brussels Leaks?

We particularly need technical help, which is always appreciated. Otherwise, we’d just want people to be patient with us. We’re probably not going to bring down EU global diplomacy or anything like that, so we just need time.

ONA Conference 2010: What’s next in online journalism

This article was originally published by the European Journalism Centre. It is reposted here with permission.

The 2010 conference of the Online News Association (ONA) pushed further the debate on how technology is shaping the future of journalism in the evolving web media landscape. The event was held in Washington DC, the US capital, from 28 to 30 October 2010.

Founded in 1999, ONA now has more than 1,600 professional members, both American and international, who are active in the business of gathering, producing and disseminating news through the Internet.

Video: Jane McDonnell, executive director of the Online News Association (European Journalism Centre on Vimeo).

Since its first edition in 2004, the association’s annual conference has been the premier global arena bringing together highly engaged digital journalists, multimedia producers, content editors, technologists, programmers, designers and newsroom decision-makers from major media markets, independent websites and leading academic institutions.

This year, too, hundreds of participants converged from all around the world to meet and learn about the latest software and hardware tools for content management, search and distribution platforms, to discuss advancements and challenges in the industry, and to network face to face in order to share best practices.

After the official inauguration on 28 October, the following two days featured an intensive marathon of thematic sessions where prestigious speakers reviewed the current state of the art in all aspects of online journalism.

APIs and social networks: The revolution of news distribution

Day 1 opened with the latest trend in technology-driven collaborative journalism: ‘Contents-Sharing through APIs’. This was the title of the panel with Delyn Simons, director of platform strategies at Mashery.com, a leading provider of customised platforms through which online media can enable third parties to re-use and present their content in all kinds of new ways, thus expanding visibility and users.

Delyn outlined case studies of news organisations using Mashery services, such as the New York Times, USA Today and, in particular, the Guardian, which has just launched its Open Platform Webfeed. By logging in with a personal API key, anybody can access and organise data from the British daily, and potentially remix it with their own data, in order to create original online products for either a personal web platform or the Guardian’s website.
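
As an illustration of the kind of access described here, the sketch below queries the Guardian’s Content API, the present-day successor to the Open Platform launch covered at the conference. The endpoint, parameters and response fields reflect the current public documentation and may differ from the 2010-era service; the API key is a placeholder.

```python
# A minimal sketch of querying the Guardian's Content API.
# Endpoint and fields follow the current public docs and may
# differ from the 2010-era Open Platform described above.
import requests

API_KEY = "YOUR-API-KEY"  # placeholder; issued on registration

resp = requests.get(
    "https://content.guardianapis.com/search",
    params={"q": "data journalism", "api-key": API_KEY},
    timeout=10,
)
resp.raise_for_status()

# Each result carries, among other fields, a headline and a URL
for item in resp.json()["response"]["results"]:
    print(item["webTitle"], "-", item["webUrl"])
```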

The parallel session ‘Rethinking Online Commenting’, moderated by Alicia Shepard, ombudsman at National Public Radio site NPR.org, discussed newsrooms’ policies for user engagement. The same topic was covered in more technical detail at the panel ‘Social Media Storytelling’, where Zach Seward, social media editor at the Wall Street Journal (WSJ), unveiled the secrets of using Twitter and Facebook successfully when reporting a story.

“One of the first steps we take is trying to identify what the potential community or audience is. Usually that is as simple as me asking a reporter about groups and existing communities around his or her subject area”, Zach says. “Then it’s figuring out how to get in front of and be a part of that community. That doesn’t mean you have to have a Facebook, Twitter or Digg account for every project or reporter”.

Zach cited the concrete example of the Facebook page created by the WSJ to document a Haitian-American’s mission to rescue his family in Port-au-Prince soon after the earthquake. “Our foreign editor had an idea to tell the story in real time. We thought of the best way to make that happen, and a Facebook page, with its status updates, seemed to be particularly useful”.

How to preserve news quality in the online environment

Besides enhancing content distribution, technology can also help improve content production. One of the most powerful examples is DocumentCloud.org, a new open-source semantic-web platform which makes primary source materials easier to scour, annotate and share.

At the panel named after his own company, Jeremy Ashkenas, lead developer at DocumentCloud, showcased a number of investigations conducted by news outlets across the US using DocumentCloud as a workspace where reporters upload documents, share them with their team and run structured searches and analyses based on extracted entities: the people, places and organisations mentioned in the text of the documents.
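
For readers unfamiliar with entity extraction, the sketch below shows the general technique in Python using spaCy. This is a stand-in for illustration only: it is not DocumentCloud’s own pipeline, and the sample sentence is invented.

```python
# A stand-in sketch of entity extraction of the kind DocumentCloud
# runs on uploaded documents. spaCy is used purely for illustration;
# DocumentCloud's actual pipeline differs.
import spacy

nlp = spacy.load("en_core_web_sm")  # small English model, installed separately

text = (
    "Senator Jane Doe met officials from the World Bank "
    "in Washington to discuss the leaked memo."
)

doc = nlp(text)
for ent in doc.ents:
    # PERSON, ORG and GPE cover the people, organisations and
    # places the panel description mentions
    if ent.label_ in {"PERSON", "ORG", "GPE"}:
        print(ent.text, "->", ent.label_)
```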

In-depth journalism was also the theme of the panel ‘The New Investigative Journalism Ecosystem’, where Charles Lewis and Kevin Davies, respectively founder and CEO of the new Investigative News Network (INN), explained how the number of global non-profit reporting organisations (many of them INN members) has exploded, from three in 1990 to more than 30 today, and how they use web tools and platforms to collaborate and make public interest journalism available to an increasing number of online users.

But how can accurate reporting survive at a time when journalists draw on ever more online sources that are not always reliable? An attempt to answer this challenging question was made by Solana Larsen, managing editor at GlobalVoices, at the panel ‘Tools for Crisis Reporting’.

According to Solana, journalists often fall into two opposite and extreme categories: on the one hand, those who rely too heavily on social networks without doing any background checks or speaking with real people; on the other, those who rely on official sources only and don’t look for unreported local voices scattered across the web.

The GlobalVoices platform intends to fill this gap by helping journalists use alternative sources of information in an appropriate way. How? “Unless you talk to somebody who knows the blogosphere of a given country well enough, you cannot understand whether what is published on a specific blog is representative of a general trend or not”, Solana says. “GlobalVoices aggregates comments on each issue from all local blogs in order to provide a more accurate and diversified picture”.

More HiTech, more news

Day 2 was marked by the panel ‘Ten Tech Trends in ’10’, where Amy Webb, CEO of her own consultancy Webbmedia Group, highlighted the latest digital tools and their application to online journalism.

Let’s start with what is called geofencing. “Networked mobile applications can now literally locate people in a defined space”, Amy says. “That implies a radical change for hyperlocal journalism. Today people go to a website, type a zip code and get local news. Tomorrow, with a geofence, people can run a mobile app which allows their phone to be identified in a given space and automatically receive news updates related to that specific location. Users will no longer follow the news. The news will follow them anywhere they go”.
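
The mechanics behind such a geofence can be sketched in a few lines: the app checks whether the phone’s reported coordinates fall within a circular fence and, if so, triggers a location-tied update. A minimal Python sketch, with invented coordinates, follows.

```python
# A minimal geofence check: is a reported position inside a
# circular fence? All names and coordinates are illustrative.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_geofence(lat, lon, fence_lat, fence_lon, radius_m):
    return haversine_m(lat, lon, fence_lat, fence_lon) <= radius_m

# Hypothetical 500 m fence around a city-centre news beat
if inside_geofence(50.8466, 4.3528, 50.8503, 4.3517, 500):
    print("push hyperlocal news update")
```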

Locating people is also possible through sensor technology. “Just put sensors in clothes and coffee cups to keep track of everything people are doing”, Amy says. “There are a lot of opportunities for reporting, but also a lot of privacy concerns. Data can be uploaded to the web, where reporters can look for them and use them to write their stories”.

Once you have got the information you were looking for, the next step is delivering it to your users according to their specific needs. “Flipboard.com is a dynamic content generation platform which allows users to select Twitter feeds, Facebook accounts and other web sources on their favourite topics, and automatically creates paginated online magazines displaying updates on those topics”, Amy says.
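
The pagination idea at the heart of such a magazine is simple to sketch: aggregated items from the user’s chosen sources are chunked into fixed-size ‘pages’. The toy Python example below illustrates only that chunking step; the item data is invented.

```python
# A toy sketch of the pagination behind a Flipboard-style magazine:
# aggregated items chunked into fixed-size "pages". Item data is
# invented for illustration.
from itertools import islice

def paginate(items, page_size):
    it = iter(items)
    while page := list(islice(it, page_size)):
        yield page

items = [f"story {n}" for n in range(1, 8)]
for number, page in enumerate(paginate(items, page_size=3), start=1):
    print(f"page {number}: {page}")
```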

The last sessions focused on news apps, including those which help make public data available in a more user-friendly way, tools for data visualisation and techniques for video shooting, completing the high-tech gallery that had already covered web design and search engines on Day 1.

Stefano Valentino is an Italian journalist based in Brussels. Since 2008 he has operated his own customised EU online information service, EuroReporter.eu. In 2008 he founded the non-profit association Reporters for an Interactive, Cooperative and Specialised Information (Ricsi).

The EU digital agenda (part I): What is at stake?

This article was originally published on the European Journalism Centre site. It is reposted here with permission.

This is the first of a two-part report on the Digital Agenda Stakeholders Day, an event held by the European Commission in Brussels on 25 October 2010. Part one of The EU’s digital agenda: What is at stake? looks at some of the overarching issues that most areas of information and communication technology (ICT) have in common. Part two (published Wednesday 27 October, 2010) will put the EU’s Digital Agenda into its political context, and will include a review of the actual Stakeholders Day event.

Universal access

Photo: Steve Rhodes on Flickr

The first of the common issues is easy and ubiquitous access to secure and dependable communication networks in the first place.

It is not only internet addicts who suffer from being either disconnected or having only unstable or slow connections at their disposal. Already, many amenities of daily life require you to be online; just think of home banking, online shopping, or real-time news.

But the importance of networks for business and society is even greater. While a private individual can still manage offline – albeit with increasing difficulty – industrial production, transport, trade, banking and political decision-making cannot.

In fact, almost every ‘intelligent’ service requires access to either a comprehensive database, sensors, or supercomputing capabilities, or all of the above: traffic management, on-the-fly speech translations, image recognition or health diagnostics, and that’s just for starters.

It is therefore paramount that the best possible network access is provided literally everywhere at an affordable price; that the quality of the infrastructure does not solely depend on whether building and operating it generates a profit for the respective provider, and that it is always up and running.

However, providing a universal service frequently requires public regulation, as high set-up costs favour monopolistic structures, meaning less densely populated areas would otherwise be left behind.

Network neutrality

The second tenet at stake is network neutrality. Basically, this means that the technical infrastructure carries any information irrespective of its content.

In Internet circles, this is known as the end-to-end principle. It is a bit like public roads which you may use with any type of car, bike, lorry, or as a pedestrian. The street does not care what load you are hauling.

Now imagine if one car manufacturer owned the streets and arranged things so that only their models had priority in clearing traffic jams or passing traffic lights. Or imagine that transporting some products was banned because shipping others was more profitable to the road owners.

On the other hand, there are motorways to complement surface roads, and restrictions for their use apply. Slow-moving vehicles and pedestrians are banned in order to speed up transport and render it safer for all who are allowed to participate.

Few people would really want bicycles on highways. Such is the dilemma of net neutrality: you do not want your provider to slow down Google or BitTorrent to prioritise other services, but at the same time you expect your Skype calls or television programmes to be judder-free no matter what.

As a consequence, net neutrality must follow clear rules. For instance, it must be completely transparent. Customers must know what they are getting before signing up for a subscription, and if there is no variety of providers available, they must have a choice between different, clearly defined plans.

And while the plan that suits the customer best might be a bit more expensive, it must still remain affordable (see ‘universal access’ above).

Also, any kind of network traffic management that amounts to a constriction of pluralism, diversity and equal opportunities in business or social life is unacceptable. Net neutrality regulation must safeguard and support competition on both ends, among network providers and third parties.

Net neutrality is, by the way, also a safeguard against censorship and oppression. Just as the post office is not supposed to read your letters, neither is a technical service provider for Internet access or storage.

The fact that it is pretty easy to monitor content and the path of electronic traffic and to retain telecommunications data does not mean it is all right to do so, irrespective of how tempting it may be, as for instance the German Constitutional Court has ruled. Where necessary, criminal offenses must be investigated at the ends of the communication network, not within it.

A contentious issue in this context is the international ACTA treaty negotiations against counterfeiting of physical products and copyright infringement over the Internet, which may entail that Internet service providers become liable for the content moving through their infrastructure.

In that case providers would be required to closely watch content itself, thus effectively snooping on their customers.

Following earlier criticism by the European Parliament, Trade Commissioner Karel de Gucht recently indicated a more guarded stance of the European Commission in the face of the strict ICT-related regulations demanded mainly by the United States.

Standards and interoperability

The third main factor to be taken into account is interoperability. Remember the time when you could not easily open a document that was created with a Mac on your PC, and vice versa?

While this specific problem has long disappeared, there are myriad other incompatibilities. The traffic updates you find on a website may not work on your particular navigation system; your health record may not be readable once you are abroad; the e-book you have bought with your old reader refuses to appear on your new one; a database that is important for your business cannot be converted into the format you need; and so on.

Substantial new barriers may even be coming up, for instance if Intel adopts Apple’s App Store model to control what kind of software runs on your run-of-the-mill PC.

The huge success of the Internet so far is not least based on its universal standardisation. The same goes in principle for car fuels, the Euro, credit cards, computer operating systems, mobile phone service, and many more. Standards and so-called ‘open APIs’, or easily accessible, transparent interfaces between software solutions or technical appliances, render a single device, website, or application larger than itself because it can interact with others, exchange data, and inspire entirely new uses through innovative combinations of functionality.

Interoperability also encourages competition, allowing users to combine solutions by different manufacturers, or to freely buy third-party peripheral equipment.

Standards must, however, be agreed upon very carefully, as they may freeze a given state of the art and discourage further development. Only intelligently defined standards deliver innovation, dependability, and pervasiveness.

Photo: Manoel Netto on Flickr

Content

A related aspect that could be subsumed under interoperability is the current national fragmentation of markets.

While it has become pretty easy to order physical goods or services across European borders, the same does not hold true for intangible, electronic products such as computer software, or content such as e-books, movies, TV programmes, music, etc.

You can buy a DVD or a book anywhere and bring it back home, but you will rarely be able to legally download that same movie from a website in the very same country. This is not so much a technical problem as a legal and social one – content is still licensed at the national level rather than the European one, and it remains difficult to gain access to different language versions of the same content irrespective of the user’s whereabouts.

Similarly, many cultural items such as books, paintings, sheet music, music recordings, motion pictures, etc. cannot even be accessed domestically (not to mention Europe-wide) since the rights are either unresolved or entirely unaccounted for.

The latter are the so-called ‘orphan works’, which are technically copyrighted but where it is impossible to identify any person who actually holds the rights.

The EU-sponsored Europeana project is a large-scale initiative to overcome these issues by collecting legally cleared digitized cultural content from many (mostly public) Member State organisations or cross-border thematic collaborations, and cross-referencing them by context.

At the same time, online content is increasingly threatened by the Fort Knox problem. Data are aggregated in the hands of an ever smaller number of large-scale organisations such as Google, Apple, or Amazon, to name only a few.

The infamous example of Amazon deleting, of all things, George Orwell’s novel 1984 – because of rights issues – from Kindle devices whose owners had stored a legally acquired copy shows quite alarmingly what might happen. Imagine if one entity could delete all copies of a physical book worldwide at will with a mere mouse click!

However well justified and ultimately inconsequential Amazon’s decision about this particular ebook may have been, the incident goes to show that invaluable data may be lost forever. This may happen because a single authoritarian government orders its erasure for political reasons, or because the keeper of the file suddenly turns ‘evil’, experiences something as trivial as a technical breakdown, or goes bankrupt.

Therefore, content storage and control, particularly of any material that is already in the public domain or destined to go there in the future, must be as widely distributed as possible.

While it is highly laudable for example, that Google systematically scans and stores books from university libraries, none of the participating libraries should let Google hold the only electronic copy of their books.

Security and privacy

In addition to all the above, there are overarching concerns related to security and privacy in the ICT area, and they overlap with the other main tenets – or sometimes even run contrary to them.

Cyber crime and hacker attacks on the infrastructure or individual devices must be combated without compromising the principles of a free network, standards, and interoperability.

Freedom of information must be balanced against the right to privacy, and while the former requires safeguarding that stored data remain accessible, the latter may even entail that information gets intentionally deleted for good.

Security of supply and integrity of the infrastructure need technical provisions which may be at odds with commercial or law enforcement interests. Online communications of importance and sensitive data transfers must be trustworthy and authentic.

Spam, viruses and other nuisances must be neutralised – all without rendering ICT networks and components too inconvenient and cumbersome to use. The list goes on.

Please return for the second installment of this report (published Wednesday 27 October, 2010), where I discuss the Digital Agenda’s background in the European Union’s policy. Part two will be accompanied by a downloadable summary of the actual Digital Agenda Stakeholders Day.

Related articles on Journalism.co.uk:

The campaign to repeal the Digital Economy Act and why journalists should pay attention

Campaigners call for ongoing protest against Digital Economy Act

#picnic10: Watch the Future of Journalism session live

The European Journalism Centre (EJC) has a great line up of speakers for today’s PICNIC conference – “a renowned festival-cum-conference that blurs the lines between creativity, science, technology, business and society”.

The programme features a keynote speech from Jeff Jarvis, director of the interactive programme at City University of New York’s Graduate School of Journalism, hot on the heels of CUNY’s new entrepreneurial journalism plans.

Mark Glaser, executive editor of MediaShift, will focus on the successes and failures of traditional media when it comes to digital; while new City University London lecturer Paul Bradshaw will set out a journalism curriculum for the 21st century.

The full programme is available via the EJC’s event page, and you can watch a live stream of the day’s events on the EJC’s Livestream channel, ejcnet on livestream.com.

Amateur media watchdogs helping keep newspapers in check

While a handful of established groups shoulder the responsibility of holding news and media organisations to account, the internet has fuelled the growth of the individual online watchdog, according to an interesting post on the European Journalism Centre website.

Author Jamie Thunder, an investigative journalism MA student at City University, uses several examples to illustrate the biggest media bloggers within the online community, such as Tabloid Watch, Five Chinese Crackers, Angry Mob and Enemies of Reason.

‘Watchdog’ groups are nothing new to the media. But these blogs are different. There’s no unifying political ideology, and they’re maintained alongside full-time jobs. They’re not run by media theorists or political activists – just individuals stirred to action by the daily iniquities of the press.

He says that while the bloggers accept their impact on the papers themselves will be minimal, their power lies in the online “groundswell” among readers.

We all know the media landscape is shifting, and shifting fast – paywalls, user-generated content, and Wikileaks are just three recent developments. Yet little has been said about the increasing ability for non-journalists to analyse and publicise the press’s problems (…) And as long as newspapers keep misbehaving, they’re not going away.

See his full post here…

Notes from an award-winning blog: the Brit who scooped the European prize

Last week Etan Smallman won the first ever European blogging competition, the European Journalism Centre’s ‘Th!nk About It’. Etan blogs at studentjournalist.wordpress.com. Photos from the finale can be viewed on his Flickr stream at this link.

For a country that is sneered at by almost all of its European neighbours for its remote, aloof and imperious attitude towards the European Union, I, a humble Brit, was pretty proud (not to say totally shocked) to be crowned the winner of the first ever European blogging competition at its finale in Rotterdam this week.

What’s more, I was by no means the only Brit to triumph at the awards ceremony of ‘Th!nk About It’, a competition that aimed to get young people talking about the European elections that took place, to almost no fanfare at all, earlier this month. In all, four out of the five British participants took home awards – not bad for a country that was derided as ‘ignorant’ at the very same event.

When I sent off a brief email in December to apply to take part in the first project of its kind – a pan-European contest that I dubbed ‘the blogging world’s Eurovision song contest’ – I had very little idea of what I was letting myself in for. Five months and 39 self-penned blog posts later, to my surprise and delight, I have won the entire competition, beating 80 other competitors from all 27 EU member states – and collecting a top-of-the-range Mac laptop for my efforts.

In January, we all assembled in Brussels for a free trip to meet each other and launch the contest – organised by the European Journalism Centre (EJC), and part funded by the European Commission. They weren’t doing things by halves, with speakers including the BBC’s venerated Europe editor, Mark Mardell, and the FT’s Brussels bureau chief, Tony Barber.

Four-and-a-bit months on in Rotterdam, there was a mood of celebration. Wilfried Rütten, director of the EJC, said that the competition had achieved so much, he was embarrassed by its success. The EJC said it did not have any expectations at the outset and that the project had helped engage young people in European politics.

But aside from the back-slapping and self-congratulatory Euro-love on display in Rotterdam, how successful has the project actually been? This is where it gets tricky. The hard numbers are certainly impressive; these are a few that have been bandied about:

  • Nearly 600 blog posts
  • 2,316 comments
  • Around 5,000 trackbacks from external websites
  • Over 2.7m hits
  • 14,000 Google links

However, the original figure of 81 bloggers taking part is actually one of the most damning. Despite a better than one-in-three chance of coming away with a prize (ranging from iPhones to laptops and Flip cameras) – and two free trips on offer – a significant minority lost interest as soon as they returned to their home countries. Is that a desperate indictment of the EU and its ability to relate to its citizens? I’m inclined to conclude that it is more of a comment on the level of interest and commitment shown by some, who failed even to complete the minimum of one blog post per month to remain in the competition.

Turnout at the EU elections was horrendously low; but even the most ardent new media enthusiasts would be unlikely to claim that blogging alone could have changed that. A more important question is how many people from outside the Euro bubble actually pressed an ear up against our blogging wall. My fear is that we were just an echo chamber, albeit a large, active and impressively innovative one.

However, compared to numerous EU initiatives in the new media arena, Th!nk About It was a roaring success. Its 2.7m or so hits in four months compare extremely favourably to the EU’s public flop of a European television station, EUTube, which notched up a dismal 2.2m viewers in the two years since its launch.

Charlie Beckett, director of media think tank POLIS, criticised the project several months ago: “Irrelevant of new media, I don’t think it is fair to expect bloggers talking in different languages in different media markets to cross boundaries and change political climates,” he said.

That is perhaps true, but nonetheless, actually getting representatives from every EU country talking together – and about the EU, of all things – is undoubtedly an ambitious start. And it does seem that this is only the start, as a quick glance at the current website will attest.

The site has undergone a quick re-brand since the awards ceremony, and the competition is now branded: ‘Round #1’. This could be just the beginning…

CMS2009: Live no-refresh updates: Twitter chat for MediaGuardian Changing Media Summit 2009

Today is the MediaGuardian’s Changing Media Summit 2009. You can find the programme at this link.

There are a few Twitterers about – possibly including the event’s chair, @ruskin147, aka BBC technology correspondent Rory Cellan-Jones, although he lost out on his Twitter hashtag of choice.

#GCM has been shunned for #changingmedia. You can follow tweets via @journalism_live.

The European Journalism Centre is trying CoverItLive for the first time: visit the conversation here, at this link.

Twitter conversation tracked here. Follow this post with no need for refresh…