The Top 10 Wikipedia Stories of 2023 (Part 2)
Exploring the most impactful, or at least the most amusing, news and events in Wikipedia and the Wikimedia movement over the past year
If you missed yesterday’s “Part 1”, I recommend clicking here and reading the first installment before returning to this post.
5. Jimmy Wales Takes Another Step Back
To the general public, Jimmy Wales is the King of Wikipedia, but on Wikipedia his influence has receded over the years. In 2023 his standing within the community suffered a potentially irrecoverable setback.
In early April, Wales accused a respected administrator of promoting a long-banned paid editing company, on the basis of a report others later found dubious, and with absolutely no chill: “I have what seems to me a credible report that you have been recommending to people that they use WikiExperts. Is this true? … I am asking you because if so, then you definitely should not be an admin in English Wikipedia. If it is a lie, then fine. But please tell me the truth.”
Wales also didn’t realize the editor had been on a long hiatus after completing service on the Arbitration Committee (“Arbcom”). Facing blowback and a potential Arbcom case, Wales finally posted a statement that was close enough to an apology for most, and showed he was serious by requesting that his administrative permissions be removed. His wish was granted, and an old, unused clause in Arbcom policy listing Wales as a one-man appellate review board was deleted.
Now User:Jimbo_Wales is just another Wikipedia editor—albeit one responsible for the whole thing in the first place.
4. Revising Polish Holocaust History
Perhaps the most alarming major story from the past year was the discovery that Wikipedia articles concerning the Holocaust in Poland had been distorted by editors with Polish nationalist sympathies, downplaying Polish antisemitism, overstating Polish heroism, and shifting blame to the victims themselves.
The controversy kicked off in February with the publication of an essay in The Journal of Holocaust Research titled “Wikipedia’s Intentional Distortion of the History of the Holocaust”. The academics who authored it didn’t capture every nuance of Wikipedia correctly, as The Signpost pointed out, but the case they made could not be ignored: that these editors, largely well-established veterans, had used “dubious sources” and cherry-picked facts to misrepresent Poland’s role in the Holocaust. Among the publications that took notice: Slate, Der Spiegel, and the Times of Israel.
The issue was not unknown to Arbcom, which had heard a precursor case in 2019, but its actions had always been limited: its mandate is to adjudicate editor behavior, not article content. Now, however, a credible pattern had emerged of content judgments crossing into bad behavior. This time around the committee initiated a case on its own, and in mid-May handed down “topic bans” for three editors, forbidding them from editing pages about Polish WWII history. Additionally, a “reliable sourcing restriction” was imposed for the topic, limiting the use of contested sources.
There is no question that other areas of Wikipedia, in English and other languages—as we shall see in #2—are similarly controlled by factions motivated by ideologies beyond Wikipedia’s five pillars. Liberating these topics is possible, but even years of strenuous opposition may not be enough. Sometimes it requires a concentrated effort from a credible outside authority.
3. Foundation Financial Follies
Wikipedia’s annual fundraising banners have arguably overtaken NPR pledge drives as the media’s most familiar, scrutinized, and joked-about fundraising appeals. Wikipedia’s editors take it even more seriously, asking questions for years about Wikimedia Foundation (WMF) fundraising ethics and financial responsibility. In 2023 this long-simmering tension finally had consequences.
In October 2022 a “request for comment” (“RfC”) about the upcoming campaign produced more than 66,000 words, only to arrive at no consensus. But the WMF got the message, and set up a “co-creation page” to seek direct input on appeal language. The campaign wound up running milquetoast banners with language along the lines of: “If you can comfortably afford it this year, please join the readers who donate.” You can guess what happened next: the banners tanked.
In a post on a community forum in January, WMF CEO Maryana Iskander telegraphed changes to come: “Given the reduced revenue from the English campaign, the Wikimedia Foundation has reduced its budget projections for the current year.” Then, without so much as a courtesy back-channel to a friendly tech reporter, the WMF began making layoffs. By April, The Signpost had figured out that approximately 7% of total staff had been let go since the start of the year.
Layoffs rarely feel fair to those laid off, but the logic of nonprofit governance is simple: Wikipedia not only raises a lot of money each year, it also spends almost all of that money each year, and if next year’s money isn’t there, the ax has to fall somewhere. Of course, it missed the volunteers who make the whole thing run, because they were never on the payroll to begin with.
Note: My own firm had a WMF marketing contract (not the COI work we’re best known for, FWIW) come to an abrupt end because of the shortfall. No hard feelings on either side, but better to say so before anyone asks.
2. Putin Squashes Russian Wikipedia
Foreign interference in Wikipedia has made this list several times before, including as the top story in 2015. Some language editions, including Kazakh and Azerbaijani, are understood to be under de facto state control. After years of trying to censor specific pages, China eventually blocked Wikipedia entirely. And the year began with the chilling revelation that two Arabic Wikimedians, jailed since 2020, had been handed decades-long prison sentences.
Focus soon shifted to Russia, which continued its campaign against Wikimedia Russia (WMRU), repeatedly fining it—ultimately at least ten times—for failing to remove articles about the war in Ukraine. The Russian government used the dishonest phrase “fake news” to describe 133 Wikipedia articles it did not like, and a Kremlin spokesman suggested in April that the Russian government should start a competing online encyclopedia.
A month later, none other than the WMRU director took him up on the offer, announcing in a blog post that he had created a fork of Wikipedia. The post cited a number of old-hat criticisms of Wikipedia as motivation, but no one was fooled—this fork is now the frontrunner to receive state approval—and the now ex-director was “banned indefinitely” from all Wikimedia sites.
Then in December, the replacement WMRU director announced in a very different blog post that he had been forced to resign from his position at Moscow State University after the government informed his employers he was to be named a “foreign agent”, and WMRU subsequently disbanded. This doesn’t mean the Russian Wikipedia will disappear anytime soon—unless the Kremlin pulls the plug on it—but it does mean its content can no longer be presumed free of state pressure.
1. The Promise and Peril of AI
It takes a lot to beat out a major government destroying its own language’s Wikipedia edition, but the first full year of wide-scale consumer adoption of artificial intelligence clears that bar. The topic is so large that it’s hard to know where to begin.
On-wiki, editors have experimented with ChatGPT-written articles, wrestled with how best to use AI, considered whether it should be used at all, and argued that it’s actually not very useful. Questions have been raised about whether readers will abandon Wikipedia and seek information from AI chatbots to an extent that harms the project. A new template was created to warn readers about articles containing LLM-generated content (as of this writing, its only use is on the article Monorail).
Off-wiki, at WikiConference North America, it was practically all anyone could talk about. Formal responses have been initiated, with mixed results: efforts to create a new policy on the use of LLM-generated content on Wikipedia have yet to solidify, and likely never will. The WMF created a ChatGPT plugin to attribute specific information in answers to Wikipedia; it had a brief moment, then faded quickly. And The New York Times Magazine ran a 6,600-word article asking if Wikipedia is doomed.
As of the end of 2023, we just don’t know. AI could help editors create more content faster, or it could overrun Wikipedia with garbage content, driving readers away—or even pull them away with a better experience. But the current version of AI is none of these things: it’s nifty but not uniquely helpful, nor is it a serious threat to Wikipedia’s model. You certainly can’t use it to write Wikipedia articles—who knows what it will make up.
Many have also noted amusing parallels between the early responses to ChatGPT and those to Wikipedia. Both debuted quietly, gained early media attention, and became favorites of students and early adopters, only for the olds to warn that they were not to be trusted. Whether these comparisons prove superficial or not, we’ll just have to see what 2024 and beyond brings.
Thank you for reading to the end, unless, of course, you immediately scrolled all the way down to see what #1 was first. In which case, thanks for reading anyway! Before I close out this year’s edition, I want to give a shoutout to some who have continued producing Wikipedia journalism while my focus was elsewhere.
First among these is the editorial staff at The Signpost, whose comprehensive coverage of the Wikimedia movement is unrivaled. Stephen Harrison of Slate has waded into more discussion pages and explained more Wikipedia controversies than anyone else in the professional news media. And Annie Rauwerda, whose Depths of Wikipedia media brand came out of nowhere to build impressive followings across multiple channels, has made Wikipedia accessible to new audiences.
In 2024 I look forward to contributing more of my own observations of Wikipedia’s unusual inner workings, and next week The Wikipedian will return with the next installment of the series “All the News That’s Fit to Post”.
Jimmy Wales photo by Zachary McCune, CC-BY-SA 4.0; Wikimedia fundraising chart by Wikimedia Foundation and Jayen466, CC-BY-SA 4.0; A photo-realistic image representing Wikipedia influenced by artificial intelligence, featuring a multicultural group of Wikipedia editors, by DALL-E and William Beutler