Josh Cohen, senior business product manager for Google, helpfully reminds news publishers that they can stop Google from indexing their webpages by using the Robots Exclusion Protocol (REP). Publishers can also set a time limit on indexing, for example if content moves into a paid archive after a certain period.
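As a rough illustration of the controls Cohen is referring to, a robots.txt file can block Google's crawler from a given section of a site (the path below is hypothetical):

```
# robots.txt — keep Googlebot out of a paid-archive section
# (the /archive/ path is an illustrative example, not from the post)
User-agent: Googlebot
Disallow: /archive/
```

For time-limited indexing, Google also supports an `unavailable_after` robots meta tag, e.g. `<meta name="googlebot" content="unavailable_after: 25-Aug-2010 15:00:00 EST">`, which asks Google to drop the page from its index after the stated date.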
Cohen’s comments follow a declaration from the European Publishers Council last week demanding new intellectual property rights protection.
“Some proposals we’ve seen from news publishers are well-intentioned, but would fundamentally change – for the worse – the way the web works,” he writes.
“Our guiding principle is that whatever technical standards we introduce must work for the whole web (big publishers and small), not just for one subset or field.
“There’s a simple reason behind this. The internet has opened up enormous possibilities for education, learning, and commerce, so it’s important that search engines make it easy for those who want to share their content to do so – while also providing robust controls for those who want to limit access.”
Full post at this link…
It’s never going to be a good thing when you have to publish a nine-paragraph correction ahead of an article, but the Las Vegas Sun has done just that after admitting ‘several reporting and editing problems as well as some factual errors’ in an article on a teenage shooting.
Part of the problem, it seems, was the use of online comments from another website to add a racial angle to the piece:
“The problem was that the quotes were anonymous and, because of the way the Web works, could have come from anywhere in the world. Although some people in Summerlin [where the shooting occurred] may hold racist views, these quotes, because of the lack of identity of the writers, in no way proved that possibility.”
The paper says it has learned its lesson and will change its editorial policy: anonymous comments lifted from other websites will no longer be used in its reporting.