How Wikipedia’s volunteers became the web’s best weapon against misinformation

Interesting article from Fast Company that offers insight into the quality of service that volunteers – unpaid staff – can provide:

While places like Facebook, YouTube, and Twitter struggle to fend off a barrage of false content, with their scattershot mix of policies, fact-checkers, and algorithms, one of the web’s most robust weapons against misinformation is an archaic-looking website written by anyone with an internet connection, and moderated by a largely anonymous crew of volunteers…

Wikipedia is not immune from the manipulation that spreads elsewhere online, but it has proven to be a largely dependable resource—not only for the topics you’d find in an old leather-bound encyclopedia, but also for news and controversial current events, too. Twenty years after it sputtered onto the web, it’s now a de facto pillar in our fact-checking infrastructure. Its pages often top Google search and feed the knowledge panels that appear at the top of those results. Big Tech’s own efforts to stop misinformation also rely upon Wikipedia: YouTube viewers searching for videos about the moon landing conspiracy may see links to Wikipedia pages debunking those theories, while Facebook has experimented with showing users links to the encyclopedia when they view posts from dubious websites…

Wikipedia’s lessons in protecting the truth are only growing more valuable.

But all is not well at Wikipedia among the volunteers:

As many of the site’s own editors readily admit in dozens of forums, the community is plagued by problems with diversity and harassment. It’s thought that only about 20% of the editing community is female, and only about 18% of Wikipedia’s biographical articles are about women. The bias and blind spots that can result from those workplace issues are harmful to an encyclopedia that’s meant to be for everyone. Localization is also a concern given Wikipedia’s goal to make knowledge available to the whole world: The encyclopedia currently exists in 299 languages, but the English version still far outpaces the others, comprising 12% of the project’s total articles.

The community has also struggled to retain new blood. Editors often accuse each other of bias, and some argue that its political pages exhibit a center-left bent, though recent research suggests that the community’s devotion to its editorial policies washes that out over time. Less-experienced editors can also be turned off by aggressive veterans who spout Wikipedia’s sometimes arcane rules to make their case, especially around the encyclopedia’s more controversial political pages.

The article’s author took a crack at editing, using WikiLoop Battlefield, a community-built website that lets anyone review a random recent Wikipedia edit for possible vandalism or misinformation. A few days after using it to correct an entry, a message popped up on the author’s Wikipedia user page.

“Congratulations,” it read. “You have been recognized as the weekly champion of counter-vandalism of WikiLoop Battlefield.”

For a moment I felt like a hero.

I wonder how many volunteers can say an organization they support made them feel that way.

And for those of you interested in editing Wikipedia – this Wikipedia Cheat Sheet is essential.
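If you’re curious what the raw stream of edits that patrol tools like WikiLoop Battlefield work from actually looks like, Wikipedia exposes it through the public MediaWiki API. The sketch below is not how WikiLoop itself is built – it’s just a minimal Python illustration (assuming the `requests` package is installed; the field selection and User-Agent string are my own) that pulls a batch of recent human edits and prints a diff link for one of them.

```python
import random

import requests

API = "https://en.wikipedia.org/w/api.php"


def fetch_recent_edits(limit=50):
    """Fetch recent non-bot edits from English Wikipedia."""
    params = {
        "action": "query",
        "format": "json",
        "list": "recentchanges",
        "rctype": "edit",      # only edits, not new pages or log entries
        "rcshow": "!bot",      # skip edits flagged as bot edits
        "rcprop": "title|ids|sizes|user|comment|timestamp",
        "rclimit": limit,
    }
    # A descriptive User-Agent is good etiquette for Wikimedia API clients.
    headers = {"User-Agent": "recent-edit-sampler/0.1 (personal experiment)"}
    resp = requests.get(API, params=params, headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]


def main():
    edits = fetch_recent_edits()
    change = random.choice(edits)  # pick one at random, like a patrol queue item
    delta = change["newlen"] - change["oldlen"]
    diff_url = (
        "https://en.wikipedia.org/w/index.php"
        f"?diff={change['revid']}&oldid={change['old_revid']}"
    )
    print(f"Article:   {change['title']}")
    print(f"Editor:    {change['user']}")
    print(f"Summary:   {change.get('comment', '(none)')}")
    print(f"Size +/-:  {delta:+d} bytes")
    print(f"Diff view: {diff_url}")


if __name__ == "__main__":
    main()
```

Running it prints one recent edit with a link to its diff – essentially a single item from the queue that tools like WikiLoop Battlefield wrap in a friendlier review interface.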
