The site's pool of ‘editor’ users is no longer growing, even as the number of articles increases, which will make it ever harder to monitor quality.
So how can the site, which its founder describes as part of the infrastructure of the internet, adapt, and does this shift matter?
Craigslist founder Craig Newmark suggests not: “[F]reed from the inherent flaws of publications dominated by a narrow range of interests, Wikipedia could become ‘more reliable than anything we’ve ever seen’.”
Where have you seen any struggle?
Best to read the full FT article, which discusses how Wikipedia, despite being an early pioneer of online information, is proving slow to push through verification and reliability initiatives.
Craig Newmark (of Craigslist) simply doesn’t get it. Wikipedia’s present problems stem precisely from the fact that it has been freed from the constraints of academic fact-checking and editing, and from responsible behavior by identifiable and accountable adults. If you think bias is bad in books and newspapers now, Wikipedia only shows that bias and lying are far worse in anything produced anonymously on the web.
Craigslist deals with short, atomized contributions that are essentially unchecked ads, and even those are subject to heavy, arbitrary deletion whenever others dislike them, for any reason. So what would Mr. Newmark know about constructing a reliable knowledge base on any subject?