"I will give them a billion dollars if they change their name to Dickipedia."
Tech billionaire Elon Musk’s unlikely strike on Wikipedia and the Wikimedia Foundation – a charitable, not-for-profit organisation that relies on donations – last October took most by surprise, not because of the attacker, but because of the target.
"You can literally fit a copy of the entire text on your phone!" Musk remarked on social media, suggesting that the funds solicited by the non-profit Wikimedia Foundation were used for illicit purposes and not needed for site maintenance.
This was a telling moment, as Musk’s own acquired platform X (formerly Twitter) is still today rife with disinformation, hate speech and endless bot-powered spam, in a year of historic elections.
The match-up is one fit for our times: a tech billionaire who has largely failed to stop disinformation on his own platform (or barely even tried), versus an online collective resource, powered by a global community of volunteers working tirelessly in the shadows to ensure people have access to correct and reliable information online.
The site’s popularity is unquestioned; it even became an internet meme about a year ago, when one online user posted about needing a "Wikipedia boyfriend" – someone "whose third monitor is always on some article like Gulf War." The post went viral on social media (including, ironically, X).
Yet fighting in the online information space can pit non-profit organisations like Wikimedia against powerful and fast-evolving technologies used to peddle disinformation for political ends – whether domestic or international.
'Systems working well'
So in a crucial election year, how does Wikimedia stop disinformation from running rampant?
"Wikipedia works because actual human beings write and protect all the information on the site," Rebecca MacKinnon, Vice President of Global Advocacy at the Wikimedia Foundation, told The Brussels Times. "These volunteers vigilantly defend against information that does not meet the site's policies for what constitutes reliably sourced, encyclopaedic information."
MacKinnon, the author of Consent of the Networked: The Worldwide Struggle For Internet Freedom (2012), is also the co-founder of the citizen media network Global Voices, and founding director of Ranking Digital Rights, a research and advocacy program at New America. From 1998-2004, she was CNN’s Bureau Chief in Beijing and Tokyo.
Today, she is continuing her tireless mission to keep the internet free and democratic through her global advocacy work at Wikimedia. She explains how the content on Wikipedia is created and curated by a community of over 265,000 volunteers from around the world. Content moderation by Wikipedia volunteers is open and transparent and discussed amongst all editors throughout the whole process. All decisions are publicly available on the article's history and talk pages.
In light of the proliferation of fake news online, the Wikimedia Foundation has brought in some countermeasures over recent years. A dedicated Trust and Safety Disinformation team supports global volunteer communities in identifying and countering disinformation campaigns on Wikimedia projects.
During election periods, countering disinformation takes on particular urgency. Administrators on Wikipedia can temporarily "protect" a page from modification by newer and less-experienced users.
"For instance, during the 2020 US Elections, Wikipedia volunteers protected about 2,000 election-related pages," MacKinnon said. "Restrictions were put in place so that many of the most important election-related pages, such as the one about the US 2020 Presidential Election, could be edited only by the most experienced Wikipedia editors."
Experienced editors also use watchlists to keep track of pages that they are interested in. For the 2020 election, more than 56,000 volunteer editors monitored the election-related pages via real-time feeds.
Contentious topic areas that are more prone to persistent disruptive editing than others are managed by the volunteer-based Arbitration Committee, which has built specific rules for them. Some 140 volunteer Wikipedia editors are part of WikiProject Elections and Referendums, established in 2009 to standardise and improve Wikipedia’s coverage of elections.
Ahead of major elections in 2024, a new Disinformation Response Taskforce (DRT) was formed to partner with trusted Wikimedia volunteers and affiliates to identify potential information attacks on Wikipedia.
And it seems to be working. Wikimedia has not uncovered any specific disinformation campaigns – whether private or foreign state-driven – in the run-up to the elections.
"As far as we are aware, Wikipedia's content moderation processes and systems are working well and as normal. We have not been alerted to any unusual activity on EU elections-related pages," MacKinnon said.
When asked about Elon Musk’s accusation that the money raised by Wikimedia is certainly "not necessary" for Wikipedia, MacKinnon responded that Wikipedia is the only website among the top 10 most-visited global websites that is run by a non-profit.
Most of the Foundation’s employees are engineers, software developers, and other technical specialists in the Product and Tech department. The Wikimedia Vice-President said that donations also help pay the salaries of her and her team, who engage with policymakers to ensure that laws and regulations in Europe, the United States, and elsewhere protect and support Wikipedia.
"And don't forget our Trust & Safety team and lawyers who defend volunteers against threats by powerful people and entities around the world who would prefer to control what the public can know – or not know – about them," she continued.
MacKinnon said that they are "grateful to generous individuals" from all over the world who give every year to keep Wikipedia freely available and accessible. The majority of its funding comes from donations (€10 is the average) from people who read Wikipedia.
"We are not funded by advertising, we do not charge a subscription fee, and we do not sell user data. This model is core to our values and our projects, including Wikipedia," the vice-president continued. "It preserves our independence by reducing the ability of any one organisation or person to influence the content on Wikipedia."
In 2022, a group of Nobel Peace Prize winners called on world governments to adopt a technology action plan to tackle the "existential threat" to democracies posed by online disinformation, hate speech and abuse.
While Europe agonises over a possible rise of authoritarianism and greater control of the online information space, organisations like Wikimedia stand as ambassadors of true internet freedom, maintaining access to information via the people's encyclopaedia.