The Top Ten Wikipedia Stories of 2025
As Wikipedia approached the quarter-century mark, 2025 brought political attacks, AI upheaval, and a leadership transition.

It’s time once again for The Wikipedian to carry out its self-assigned task of ranking the ten biggest stories shaping Wikipedia and the Wikimedia movement over the past twelve months. It’s a terribly subjective exercise, of course. Another observer might shuffle the order or swap out a few items based on their own perspective. But as Wikipedia approaches a milestone anniversary—it turns 25 years young next month—there’s little doubt that the uncertainties of technological transformation, unprecedented political pressure, and the unbearable weight of global relevance were defining themes in 2025. Let’s do this thing.
10. From Six to Seven
Before things get serious, let’s take a moment to celebrate a milestone that seemed like it might never arrive: in May, the English Wikipedia crossed the seven million article threshold, albeit several months later than projected. In the race to publish article number seven million, the best-guess lucky winner was a 1958 memoir about schizophrenia, although in retrospect the article about a devastating flood might have been a better choice. Seven million articles is no small feat, even with 24 years to do it. For a volunteer project that skeptics once dismissed as an impossible fantasy, it’s worth pausing to acknowledge the sheer improbability of what this movement has accomplished.
9. Governing the Ungovernable
Wikipedia’s governance has always been more miracle than method, and 2025 put that to the test. First, the Wikimedia Foundation (WMF) Board removed two candidates from the community election ballot, prompting the kind of procedural controversy that Wikipedia specializes in. Meanwhile, the Arbitration Committee—the closest thing Wikipedia has to a supreme court—faced the possibility of being short-staffed after its scheduled election initially failed to attract enough candidates. Neither crisis proved fatal, but both lend credence to the old saw that Wikipedia doesn’t work in theory, only in practice.
8. Temporary Accounts Forever
After years of development, the WMF finally rolled out a new feature called temporary accounts, anonymizing IP addresses for logged-out editors. Now casual contributors can update football scores and filmographies without broadcasting their location to anyone who cares to look. The timing is good: as we’ll see in the entries ahead, 2025 was a year in which Wikipedia faced government demands to reveal user information, and volunteer editors were openly targeted for doxxing. Making it marginally harder to identify contributors was the very least the WMF could do.
7. Wikipedia vs. the World
Nothing says you’ve made it like relentless legal challenges, and these days Wikipedia seems to be in court almost constantly.
In India, the defamation case that topped last year’s list dragged on, with the Delhi High Court continuing to pressure the Wikimedia Foundation to identify specific editors. So-called “age gating” remained in vogue, and the UK’s Online Safety Act moved toward implementation despite WMF protests that it could end up treating Wikipedia like an adult content site.
It wasn’t all bad news: in France, Wikipedia won a pivotal defamation case and the plaintiff will pay its court costs. Whether alarming or merely annoying, Wikipedia’s legal exposure is directly proportional to its international influence.
6. Grok to the Future
In October, Elon Musk’s xAI unveiled Grokipedia, an AI-generated Wikipedia alternative promising freedom from “wokeness”. Early reviews were absolutely brutal, citing its obvious dependence on Wikipedia and the hallucinations that plague all LLMs, alongside some particularly risible biases more glaring than anything Wikipedia has been accused of.
Nevertheless, Grokipedia represents the most credible challenge to Wikipedia’s supremacy since Wikipedia itself dethroned Britannica from its hundred-plus-year reign. Perhaps Grokipedia will fade to irrelevance like Wikipedia’s also-rans, but Musk’s outsize wealth, AI technology, and grievance coalition will keep it afloat for now.
5. AIs Wide Shut
The Wikipedia community took a decidedly oppositional view of generative AI in 2025. It’s not hard to understand why: Wikipedia is built on human judgments about sourcing, accuracy, and neutrality—concepts alien to how LLMs work. And LLMs’ ability to generate content at massive scale could overwhelm a community that reviews everything by hand.
To the rescue comes WikiProject AI Cleanup, started in late 2023 but gaining significant momentum this year. What began as a small group of editors grew into a coordinated, encyclopedia-wide effort to hold back the tide of AI slop. In August, editors added a new speedy deletion criterion for articles obviously generated with minimal human input.
But volunteers also found themselves fending off their own ostensible support system. When the WMF experimented with AI-generated article summaries without consulting them, the backlash echoed past complaints about intrusive top-down impositions on the volunteer project.
As of December 2025, there is still no unified policy on generative AI. The community’s response has been fundamentally defensive: protecting what Wikipedia already is from degradation. It’s a necessary fight, but a reactive one, leaving open the question of how Wikipedia might actually harness this new technology rather than simply resist it.
4. Hoping to Be Eaten Last
If the community saw AI as a threat to be repelled, the WMF saw it as a challenge to be met, and perhaps even an opportunity to be seized.
On the challenge side: AI crawlers now account for up to two-thirds of Wikimedia’s server requests, straining infrastructure while potentially siphoning off human readers, too. On the opportunity side: the WMF released new datasets intended for AI training, pointed companies toward the Wikimedia Enterprise API—a paid tier for heavy commercial users—and launched the Wikidata Embedding Project to make its structured data more AI-friendly.
The logic is reasonable: Wikimedia projects are useful for AI systems, which will use them regardless of WMF’s wishes, so it’s better to cut a deal than get cut out. But critics, especially among volunteer editors, worry this could make Wikipedia a handmaiden to its own downfall: normalizing commercial extraction of volunteer labor while weakening editorial norms and community control. CC-BY-SA is not a suicide pact.
The tension between the WMF and volunteer community on AI reflects a genuine disagreement about Wikipedia’s future. The community sees pollution to be filtered; the Foundation sees a new climate to survive. Other entries in this list also describe outside pressures on Wikipedia, but AI is the one where the community and Foundation aren’t fully aligned.
3. Meet the New Boss
In May, Maryana Iskander announced she would be stepping down as CEO of the Wikimedia Foundation. Her departure was abrupt if not shocking—the job is a pressure cooker—but the timing, amid escalating political attacks, left Wikipedia to choose a new captain while navigating choppy waters.
The Board took until December to name her successor: Bernadette Meehan, a former Obama administration foreign policy official with at least one stint on Wall Street and no Wikimedia background. The choice was greeted with a collective shrug, but could still find critics in a community that views polished résumés with skepticism. Meehan’s diplomatic experience suggests the Board was prioritizing political savvy over movement fluency. Then again, Iskander was also a credentialed outsider and by most accounts did well in the position, so perhaps the Board figured: if it ain’t broke.
The WMF needs a CEO who can handle congressional inquiries, legal battles, and hostile regulators. There’s good reason to think she’ll succeed in that. Whether Meehan can also earn the trust of a volunteer community whose default state is wariness remains to be seen.
2. No Man’s Land
The controversy over Wikipedia’s coverage of the Israel-Gaza conflict, number three on last year’s list, continued apace. And where scrutiny previously focused on alleged bias against Israel, the accusations in 2025 went further: antisemitism, or at least looking the other way.
The naming of the “Gaza genocide” article, formerly “Allegations of genocide”, remained a flashpoint. Critics argued the title prejudged a contested legal and historical question; defenders cited the weight of scholarly and NGO support for the characterization. Jimmy Wales himself waded in, calling the article “particularly egregious” in its framing, a rare public second-guessing of the community’s consensus.
And the stakes rose when a bipartisan group of 23 members of Congress—led by Democrat Debbie Wasserman Schultz—sent a letter to the WMF expressing concerns about antisemitic content and citing the Anti-Defamation League, with which Wikipedia has repeatedly been at odds.
These accusations have some overlap with the right-wing attacks in this year’s top spot but are not identical. Both question Wikipedia’s neutrality; both seek Foundation intervention in editorial decisions traditionally left to the community. The difference is this critique crosses political lines and includes good-faith concerns alongside bad-faith opportunism.
1. Enemy of the State
If there’s a single story that defined Wikipedia’s 2025, it’s the escalation of coordinated attacks from Trump-aligned Republicans who have decided the free encyclopedia is an enemy to be defeated. For the first time, these attacks are backed by threats of government action, aimed at delegitimizing Wikipedia and intimidating its volunteer editors.
Early in the year, a leaked fundraising appeal from the Heritage Foundation suggested it would unmask anonymous Wikipedia editors, and the Media Research Center took aim at Wikipedia’s sourcing standards. Their cheerleaders at the New York Post editorialized that Big Tech should cut ties with Wikipedia until it started citing sources like the New York Post.
Things escalated when interim U.S. attorney for DC Ed Martin sent a letter to the Wikimedia Foundation demanding documents amid vague accusations of bias and political manipulation by foreign actors, hinting at a review of the WMF’s nonprofit tax status. Martin’s nomination was later withdrawn, but in August the House Oversight Committee picked up the baton, as did Ted Cruz in October on behalf of the Senate Commerce Committee, invoking not just concerns about left-wing bias but the antisemitism question again.
Criticism of Wikipedia is fair game. Like any institution, it has blind spots and points of failure. But these are not isolated critics stumbling upon the same complaints. They are participants in a broader movement that has identified Wikipedia—with its billions of pageviews and unmatched credibility—as an obstacle to be removed.
What’s truly new in 2025, though, is the enlistment of government power in the effort to control it. Members of Congress demanding editor identities and activist groups threatening to expose them aren’t engaging in criticism—they’re threatening the conditions that make volunteer participation possible.
The Price of Prestige
If 2025 proved anything, it’s that Wikipedia has become too important to ignore. AI companies need its data. Governments want to know who edits what. Political movements have learned to weaponize accusations of bias. A billionaire launched a competitor. And through it all, the volunteer community and the Foundation that supports it found themselves disagreeing about how to face the future, even as a new CEO prepares to take the helm.
The encyclopedia that anyone can edit has become the encyclopedia that everyone wants to control. Whether that’s a mounting crisis or simply the permanent condition of critical infrastructure may be the defining question of Wikipedia’s next quarter-century.