Tag Archives: search engines

ACAP answers its critics

The ACAP project launched in November with the hope of providing a technological solution to the clashes between news publishers and search engines over content use.

In addition to the back-slapping and the pomp, the launch brought with it hefty criticism of the new system.

The team behind the project has now attempted to satisfy some of the criticism thrown its way by responding to what it considers the main thrust of the argument against it.

Here is a summary of the main criticisms ACAP has singled out and its responses (full list):

Criticism: “Publishers should not be allowed to control their content”

Response: Well, you would hardly expect us to agree with this…

“This is simply a way for publishers to “lock up” their content”

…Publishers who implement ACAP will have the confidence to make content available much more widely than is currently the case. Few would condone stealing a pile of newspapers from a newsstand and giving them away to passers-by for free, yet there are those who think that this behaviour is completely acceptable – indeed normal – in the online environment…

“Robots.txt works perfectly well”

…We recognise that robots.txt is a well-established method for communication between content owners and crawler operators. This is why, at the request of the search engines, we worked to extend the Robots Exclusion Protocol, not to replace it (although this posed us substantial problems)… ACAP provides a standard mechanism for expressing conditional access, which is what is now required. At the beginning of the project, search engines made it clear that ACAP should be based on robots.txt. ACAP therefore works smoothly with the existing robots.txt protocol…
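
For readers who want to see what “extending, not replacing” looks like in practice, here is a rough sketch of a robots.txt file carrying ACAP-style permissions alongside the classic directives. The User-agent, Allow and Disallow lines are standard Robots Exclusion Protocol; the ACAP-prefixed field names are indicative only (the exact vocabulary is defined by the ACAP 1.0 specification), but the layering principle is the point:

    # Classic Robots Exclusion Protocol directives – unchanged
    User-agent: *
    Disallow: /subscribers/
    Allow: /news/

    # ACAP-style extensions expressing conditional permissions
    # (field names are illustrative; consult the ACAP 1.0 spec for the exact terms)
    ACAP-crawler: *
    ACAP-allow-crawl: /news/
    ACAP-disallow-crawl: /subscribers/

A crawler that only understands plain robots.txt still reads a valid file and simply ignores the extra lines, which is what lets ACAP sit on top of existing deployments rather than replace them.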

“This is just about money for publishers”

No: but no one would deny that it is partly about money.

Publishers are not ashamed about making money out of publishing – that is their business…Business models are changing, and publishers need a tool that is flexible and extensible as new business models arise. ACAP will be entirely agnostic with respect to business models, but will ensure that content owners can adopt the business model of their choice…

“The big search engines aren’t involved so don’t waste your time”

Major search engines are involved in the project. Exalead, the world’s fourth largest search engine, has been a full participant in the project.

Any lack of public endorsement by the major search engines has not meant a lack of involvement – indeed, quite the opposite…

75 per cent of online publishers see vertical search as a way to reclaim online communities from Google, survey claims

Nearly three quarters of online publishers see the benefit of developing vertical search engines as a way to claw back online communities from Google, a study published last month has claimed.

E-consultancy – with Convera – conducted a survey of search behaviours with over 500 professional and business internet users.

(Vertical search report – register here to receive a copy)

As part of the study it asked 116 online publishers what benefits vertical search would bring.

Benefits of vertical search

Nearly 75 per cent of respondents to the question suggested that one advantage of offering vertical search across their websites would be to reclaim online communities from Google. Forty-two per cent felt this would be a major benefit.

Nearly 94 per cent of publishers felt that vertical search would also benefit sites through improving authority and enhancing brand awareness.

Keeping users on site (87 per cent) and the potential to monetise through advertising (83 per cent) also ranked highly as benefits.

The online publishers felt the major disadvantages of vertical search were the hassle of support and maintenance – 71 per cent of respondents saw it as a downside – and that it may point users toward competitors – 69 per cent.

Why the front page is still relevant

When the incremental overhaul of Guardian.co.uk enveloped the site’s homepage earlier this year, there was much talk of the growing irrelevance of newspaper websites having a ‘front’.

Why a front when so many readers/users/visitors/viewers come in through the side door of search and RSS feeds?

Jeff Jarvis quoted figures suggesting that as few as 20 per cent of daily visitors get to see it.

Search engine optimisation – that’s the key, isn’t it? With ubiquitous navigation from all parts of the site? Yes, truly it’s important. But is that the case for every user of a newspaper website?

Well, up to a point, Sir – as Mr Salter might say.

Let’s take that magic 20 per cent (I have to apologise for not knowing what this figure actually relates to, but I’ll use it as a starting point rather than a crux). Why would a fifth of daily users want to go in via the front door?

Perhaps they’re not fans of the Google hegemony, so avoid its referrals like the plague? Or not tech-savvy enough to master RSS feeds? Or pretty much only want news from a single perspective, so rely on just one site as ‘the news’?

But what if accessing the news for them wasn’t as simple as scanning NewsFire or banging a search term into Google and quickly skimming a dozen or so relevant links?

What if navigating all the non-uniform sites linked to from Google News was a cripplingly slow nightmare?

What if the architecture of the sites they visit is as relevant – if not more relevant – than the slant those sites put on the news?

Well, if you’re a blind or partially sighted internet user that’s pretty much how it works.

Over the course of this week Journalism.co.uk is running a series of reports looking at difficulties blind and partially sighted users have accessing leading UK national newspaper websites.

To this end we asked a number of volunteers to show us, first-hand, the common problems they face. During our assessments the value of a homepage became strikingly obvious.

Our volunteers tended to start their internet news searches from the homepage of a favoured news site, rather than a search engine.

Our principal volunteer, John Allnutt, told us that he tended to glean his news from the BBC News site because its navigation was simple, he was used to it, and its accessibility information was easy to find.

Nothing so strange in that. Most people have favourites. But the tendency to surf differing sources of news isn’t common, we found, amongst those with visual impairment.

It became clear that once a user had got used to the unique and sometimes esoteric navigation of a news site using screen reading technology, logic prevailed. It’s easier and quicker just to go to the site where you know all the idiosyncrasies and curios than to get stuck in the frustrating hamster-wheel of figuring out the complexities of other sites.

Furthermore, many news sites don’t have standardised design throughout, making it harder still to jump into a certain section and expect it to be laid out and navigable in the same way as the rest of the site. Easier then just to enter through the home page and to use that as the fulcrum to all your movements around the site.

Our observation isn’t just limited to the individuals we worked with on the project.

Trenton Moss, director of Web Credible, a web usability and accessibility consultancy that helped us in the early part of the project, told us that this is a common phenomenon.

Blind and visually impaired individuals will continue to use these sites in spite of their flaws, he told us, perfecting their use of the imperfect navigation of one site or a few, working from the homepage, to access news online.

There is no ubiquity of design that would allow the blind and visually impaired user to float easily between news sites and utilise search engines as the easy and quick route to the news they want.

Ubiquitous design across a range of news websites isn’t something that’s likely to happen soon, if ever.

It’s because of this that front pages remain important as a point of entry for navigation and an easily accessible summation of all that is important.

@SOE: (Audio) WAN president Gavin O’Reilly slams UK media commentators

Gavin O’Reilly, chief operating officer of Independent News and Media and president of WAN, launched a stinging attack on senior media commentators in the UK while speaking at the Society of Editors meeting in Manchester.

Listen here to some of his concerns and other points of his speech:

Failings of media commentators: too much rhetoric, overly simplistic digital vs print approach…

[audio:http://www.journalism.co.uk/sounds/oreillyone.mp3]

Failures of US print papers, reader apathy as the biggest threat rather than the internet, the real reasons for the state of the UK market – freesheets and enticements – plus search engines and ACAP…

[audio:http://www.journalism.co.uk/sounds/oreillytwo.mp3]

Time to clean up your online image

Reuters today reported on the growing need for candidates to polish not only their CVs but also their online image before applying for a job.

With the growing popularity of social network sites – and the use of search engines like Google – employers are using online searches to check whether the well-rounded image you portray on your CV holds up in reality.

As a result, new companies have been entering the market to provide clean-up services for jobseekers. For a monthly fee, ReputationDefender.com goes beyond search engines into the realms of password-protected areas to report back on your online profile.

We here at Journalism.co.uk have searched online for information about candidates, with some very interesting results. However, unlike for the 40 per cent of employers cited in the article, what we found hasn’t been the deciding factor in whether we hire someone.