Your teacher might not trust Wikipedia, the crowdsourced encyclopedia, but it looks like Facebook does. The social media giant thinks the solution to its fake news problem may be in Wikipedia’s digital pages.
Starting today, Facebook is testing a way to give the articles that appear on your News Feed additional context—helping people figure out which stories are based on fact and can be trusted and shared. The experiment is a starting point in Facebook’s battle against fake news.
Now, when Facebook users see an article on their News Feed, they can click on a little “i” button, surfacing a description of the publication pulled from its Wikipedia page. The social network will also show the publication’s Facebook page as well as trending articles or related news about the topic. The company hopes that seeing a publisher’s Wikipedia page will help readers discern whether or not they trust the source of the article. If, say, you read a story claiming that watching NASCAR causes cancer and the source is a Russian llama farmer, perhaps you’ll think twice about how credible it is.
As for why Facebook thinks you should trust Wikipedia, well, the site uses human moderators as well as algorithms to do its best to keep fake news, misleading stories, and outright lies from filling its pages. According to Wikipedia, 133,540 editors (read: humans) have taken it upon themselves to edit entries on the platform at least once every 30 days, helping stem the potential tide of abuse, falsehoods, and fake news in general (…er, mostly). Now, the only question is whether Mark Zuckerberg will make a donation to Wikipedia to stop those incessant pleas for money.