Wikipedia Got a Letter from the Government the Other Day
A Trump official's risible letter to the Wikimedia Foundation represents a real attempt to intimidate an institution it cannot control
Deep within the Wikimedia Foundation’s Annual Plan for 2023–24 lies a warning that feels eerily prescient early in Trump’s second term:
As Wikimedia projects are increasingly regarded as trusted sources of knowledge across the world, some politicians and governments have made deliberate efforts to discredit Wikipedia through disinformation campaigns in mainstream and social media.
Last week, Ed Martin, acting U.S. Attorney for the District of Columbia, sent a letter to the Wikimedia Foundation, the parent organization of Wikipedia, announcing an investigation into whether alleged political bias and foreign influence—claimed but not substantiated—violate the Foundation’s obligations under Section 501(c)(3) of the tax code. The WMF now has until May 15 to respond to a dozen questions and produce documents outlining its safeguards against foreign propaganda and ideological manipulation, along with any relationships it maintains with search engines and AI companies.
There are two ways to read this letter. One is to take it at face value, and respond to it point by point. The other is to examine the political and ideological motivations behind its creation. Because it’s not good enough simply to call it ridiculous—you also have to explain why—The Wikipedian will do both.
Just a Light Fisking
The letter opens with a polite, almost disarming tone—an official formality that belies what follows:
To Whom It May Concern:
As the United States Attorney for the District of Columbia, I regularly receive requests for information, clarification, and official comment. I regard such inquiries with the seriousness they warrant and respond appropriately through formal correspondence, such as this letter.
From the very first paragraph, Martin makes clear he’s acting on a tip. But from whom? The Wikipedian has a theory. We’ll come back to that.
It has come to my attention that Wikipedia, which operates via its fiscal sponsor, the Wikimedia Foundation, Inc., is engaging in a series of activities that could violate its obligations under Section 501(c)(3) of Title 26 of the United States Code. As a nonprofit corporation incorporated in the District of Columbia, the Wikimedia Foundation is subject to specific legal obligations and fiduciary duties consistent with its tax-exempt status.
The U.S. Attorney’s Office is not the IRS. It does not oversee compliance with tax-exempt status, unless there’s a credible allegation of criminal fraud. Martin makes no such claim anywhere in the letter.
In addition, the public is entitled to rely on a reasonable expectation of neutrality, transparency, and accountability in its operations and publications.
This statement is half true. The public can expect Wikipedia to pursue neutrality and transparency, because it says so itself. But the notion that falling short of this inherently subjective—not to mention impossible—standard justifies government intervention? Martin seems to have invented it to support the rest of the letter.
In its 2023 IRS Form 990, the Wikimedia Foundation describes its mission as, “empower[ing] and engag[ing] people around the world to collect and develop educational content under a free license or in the public domain and to disseminate it effectively and globally. . . [.]”
Here, Martin is on solid ground. Wikipedia does describe itself as an educational project, and it is indeed written by contributors from around the world.
As you know, Section 501(c)(3) requires that organizations receiving tax-exempt status operate exclusively for “religious, charitable, scientific, testing for public safety, literary, or educational purposes. . . [.]” It has come to my attention that the Wikimedia Foundation, through its wholly owned subsidiary Wikipedia, is allowing foreign actors to manipulate information and spread propaganda to the American public.
But with the phrase “foreign actors,” Martin pivots into something else entirely. The term is meant to raise suspicions. Vladimir Putin is a foreign actor. But then, so is Mr. Bean. And what does it mean to “manipulate information”? That is the very definition of editing. And what “propaganda” is he referring to? Martin doesn’t offer any examples, but still states it as a settled fact.
Wikipedia is permitting information manipulation on its platform, including the rewriting of key historical events and biographical information of current and former American leaders, as well as other matters implicating the national security and interests of the United States.
The word “manipulation” here is slippery. On a basic level, modifying information is the quotidian labor of an online encyclopedia. But in this context, Martin clearly intends the more sinister meaning.
Similarly, the fact that Wikipedia articles are constantly being revised—especially on historical topics or public figures—is inherent in the collaborative process.
Martin’s invocation of “national security” echoes the rationale used in the detention of Palestinian activist Mahmoud Khalil, whose presence in the U.S., the government argues, poses “potentially serious foreign policy consequences.”
Masking propaganda that influences public opinion under the guise of providing informational material is antithetical to Wikimedia’s “educational” mission.
Again, he never says what, exactly, this alleged “propaganda” is. Disappointing, really. You know, Ed, a simple Google search for “wikipedia propaganda” would have turned up the Wikipedia community’s own freely available essay on the topic.
In addition, Wikipedia’s operations are directed by its board, which is composed primarily of foreign nationals, subverting the interests of American taxpayers.
Martin may be correct that the board is majority non-American. But as he acknowledges, Wikipedia is a global project. It’s entirely unsurprising that its leadership would reflect that scope. Yet Martin takes this unremarkable fact and uses it as the basis for a far more provocative claim: that it somehow means Wikipedia is “subverting the interests of American taxpayers”—a leap grounded in nothing more than assertion.
Educational content should be directionally neutral; however, information received by my Office demonstrates that Wikipedia’s informational management policies benefit foreign powers.
Martin’s claim that educational material should be “directionally neutral” is so vague it borders on meaningless. More notably, this marks the second time he refers to an unnamed informant, while implying that some portion of Wikipedia’s billions of words somehow benefits “foreign powers.” It’s true, of course, that accurate information may serve the interests of various governments, especially when Wikipedia writes about them accurately.
Moreover, we are aware that search engines such as Google have agreed to prioritize Wikipedia results due to the relationship that Wikipedia has established with these tech platforms.
Martin implies that Wikipedia’s prominence in search results stems from some special arrangement with tech companies. This gets it backward. Google and others rank Wikipedia highly because its structure, interlinking, and content quality are exactly what search algorithms are designed to reward. Whatever agreements may exist between the WMF and tech platforms in 2025, they follow from Wikipedia’s value as a uniquely useful resource, not the other way around.
If the content contained in Wikipedia articles is biased, unreliable, or sourced by entities who wish to do harm to the United States, search engine prioritization of Wikipedia will only amplify propaganda to a larger American audience.
True, as far as it goes. Given the rest of Martin’s letter, perhaps the only surprise here is that this sentence begins with the word “If” and not “Because”.
Lastly, it has come to our attention that generative AI platforms receive Wikipedia data to train large-language models. This data is now consumed by masses of Americans and American teachers on a daily basis. If the data provided is manipulated, particularly by foreign actors and entities, Wikipedia’s relationship with generative AI platforms has the potential to launder information on behalf of foreign actors.
Again, technically true—but far from proven. In fact, Martin understates Wikipedia’s influence on generative AI. Beyond its role in training large language models, Wikipedia is one of the most frequently retrieved sources by consumer-facing chatbots like ChatGPT, Gemini, and Perplexity.
In light of these concerns, my Office seeks information pertaining to Wikimedia’s compliance with the laws governing its tax-exempt status. To assist with our investigation of this matter, I request the following documents and information, covering the period of January 1, 2021 to the present, as soon as possible but no later than May 15, 2025:
For the first time, Martin signals that an investigation has already been opened. And here begins the second half of his letter, a lengthy list of questions and document requests, organized around the WMF’s accountability structures, editorial oversight mechanisms, and information governance practices—all framed to imply a lack of transparency, control, or neutrality.
1. Safeguarding Against Propaganda: What mechanisms does the Wikimedia Foundation have in place to fulfill its legal and ethical responsibilities to safeguard the public from the dissemination of propaganda, particularly in light of its designation as a tax-exempt organization under Section 501(c)(3) of the Internal Revenue Code and its longstanding hands-off policy regarding Trust & Safety (including content moderation and editor misconduct)?
Martin’s question more or less answers itself. The Wikimedia Foundation’s mechanism for safeguarding against misinformation is its Trust & Safety team—part of the legal department—which generally defers to the Wikipedia editor community but steps in under extraordinary circumstances. Perhaps the WMF will simply send him a link to its Trust & Safety page, which outlines these responsibilities, provides contact information, and explicitly references “identifying and countering disinformation campaigns.”
2. Trust & Safety Oversight: Regarding Trust & Safety, what does the Foundation provide in terms of employees and contractors, budget, day-to-day oversight, and enforcement mechanisms, for the purposes of content moderation and addressing editor misconduct (including but not limited to content manipulation, bullying, and off-platform canvassing)?
Martin’s letter does something curious: it shows just enough research to suggest familiarity with terms like “canvassing”, but never once acknowledges the Wikipedia community—the volunteer editors who actually create and moderate content, who predate the Wikimedia Foundation, and whom the Foundation exists to support. Like many critics before him, Martin directs his complaints at the Foundation rather than, say, the relevant Talk page.
As for the information he seeks, it’s readily available. The WMF’s 2023–24 Annual Plan allocated roughly $20.5 million—or 11.6% of its $177 million budget—to “safety and inclusion” efforts, of which Trust & Safety is a part. Martin likely knows this. His ideological allies, including Elon Musk and the New York Post, recently seized on the word “inclusion” to mischaracterize it as a DEI initiative—which is what led Musk to describe Wikipedia as “Wokepedia” in the first place.
3. Transparency and Donor Influence: How does the Foundation ensure transparency and accountability regarding the extent to which its editorial practices and platform governance are influenced by ongoing relationships with donors, sponsors, funders, or other external stakeholders?
The WMF already discloses extensive financial information, including annual reports, IRS Form 990 filings, and audited financial statements. These documents detail revenue sources, expenditures, and major donors, providing clear visibility into the Foundation’s funding.
In addition, twice a year the WMF publishes Transparency Reports, disclosing requests to alter or remove content or provide nonpublic user information. Each report notes the origin of requests, the content they addressed, and how the WMF responded, carefully balancing its commitments to both privacy and transparency.
4. Foreign Influence Protections: What steps has the Foundation taken to exclude foreign influence operations from making targeted edits to categories of content? Who enforces these measures, and how? What foreign influence operations have been detected, and how were they addressed?
Martin could have saved himself the trouble by reading Wired’s “The Hunt for Wikipedia’s Disinformation Moles” (October 2022), which explains attempts by Russian and Chinese state actors to influence Wikipedia content, and how the community and Foundation responded. In one notable case, the WMF banned seven Chinese nationals in 2021 for, to borrow Martin’s phrase, making “targeted edits to categories of content”. The community itself maintains an article, “List of political editing incidents on Wikipedia”, with further examples.
5. Viewpoint Diversity Policy: What policy does the Foundation have in place to ensure that content submissions, editorial decisions, and article revisions reflect a broad spectrum of viewpoints, including those that may conflict with the views of major financial or institutional backers?
This isn’t the role of the Wikimedia Foundation—it’s the responsibility of the Wikipedia community. Among its three core content policies, the first is neutral point of view, or NPOV, which requires contributors to write “fairly, proportionately, and, as far as possible, without editorial bias”.
The WMF, for its part, receives most of its funding from small-dollar donors but also publishes an annual list of its largest supporters. In 2023–24, this included Apple, Google, and the Alfred P. Sloan Foundation. The Wikipedia article Criticism of Apple Inc. runs to 3,800 words. Criticism of Google clocks in at more than 11,000, making it one of the longest articles on Wikipedia. Meanwhile, the entry for the Alfred P. Sloan Foundation is surprisingly poor, and has carried a warning tag about its lack of sourcing since 2018. While this is mere anecdata, it suggests these donations are doing little to sway Wikipedia’s volunteer editors.
6. Addressing Editor Misconduct: What is the Foundation’s official process for addressing credible allegations that editors or contributors have materially misled readers, engaged in bad-faith edits, or otherwise manipulated content? What is the official process for auditing or evaluating the actions and voting patterns of editors, administrators, and committees such as the Arbitration Committee? Detail all instances in which these processes have been used over the last six years.
The Trust & Safety page includes not one, but two—count ’em, two—flowcharts explaining the process for handling editor misconduct. One outlines how to report an incident; the other explains what steps the WMF legal team may take in response.
Martin also references a bit of Wikipedia arcana: the Arbitration Committee, or ArbCom, often, and only somewhat inaccurately, called “Wikipedia’s Supreme Court”. The Wikipedian will give Martin this much: a historical analysis of ArbCom decisions could be interesting—though it’s doubtful the WMF has ever undertaken one. It’s even less likely that it has “audited” editor behavior. Audit what, exactly? Requests for Comment? Good luck with that. RfCs aren’t centralized, and there’s no formal archive. However, with a bit of sleuthing, The Wikipedian has identified a bot-generated list of 4,200 RfCs dating to at least 2012—so if anyone wants to try, this is probably not a bad place to start.
As for “voting patterns”, the phrase raised eyebrows on Wikipedia’s Village pump (WMF) discussion board. Some editors pointed out that, as non-Americans, they’ve never voted in a U.S. election—aha, foreign influence! Others noted that Wikipedia’s own “!vote” philosophy emphasizes consensus over simple majority. Given Martin’s word choices elsewhere, which seem engineered to invite conspiratorial readings, it’s little surprise Wikipedia editors are doing so here.
7. Anti-Discrimination Enforcement: Does the Foundation maintain a public, formally adopted policy explicitly prohibiting hateful content and conduct by editors? If so, what enforcement mechanisms are in place to ensure compliance, and how do they apply across different namespaces and content areas?
In 2020, the WMF adopted the Universal Code of Conduct (UCoC), which prohibits hate speech across all Wikimedia projects. It includes enforcement guidelines, a coordinating committee, and a public archive of current and past cases.
Partial credit to Martin’s staff for using the term “namespaces”—though their use of it suggests they may not fully understand what it means. Namespaces aren’t separate projects; they’re technical categories that distinguish between different kinds of pages on the same project. Articles, user pages, and talk pages all occupy distinct namespaces.
Then again, perhaps The Wikipedian isn’t giving them enough credit. Maybe someone in Martin’s office really is worried that enforcement of hate speech policy is more lax in template documentation than media file metadata.
8. Safeguards Against Ideological Manipulation: What safeguards exist to detect and prevent undue influence by ideologically motivated individuals or coordinated networks? Provide details regarding the actions taken by the Foundation over the last six years and any changes made to these safeguards.
We’re repeating ourselves a bit here. Still, it bears restating: this is primarily the responsibility of the Wikipedia community, not the Wikimedia Foundation. Nevertheless, the Trust & Safety team exists, the Universal Code of Conduct is enforced, and the WMF supports the development of tools the community uses to monitor and protect articles from miscreants, malefactors, and meatheads.
In addition to ArbCom and Requests for Comment, Wikipedia editors have venues like the Administrators’ Noticeboard to flag and discuss problematic behavior, such as attempts at coordinated or ideologically motivated editing.
9. Anonymity and Accountability: In view of public criticisms, including those by Wikipedia Co-Founder Dr. Lawrence M. Sanger, regarding editorial opacity, what justification does the Foundation offer for shielding editors from public scrutiny? How does it reconcile this with broader editorial standards requiring attribution and accountability? What measures does the Foundation take to assess the integrity and competence of senior editors and administrators?
OK, now things get interesting. It’s striking to see Ed Martin cite Larry Sanger, Wikipedia’s lesser-known co-founder—a title Sanger holds over the objections of the much more famous Jimmy Wales. Sanger was originally hired by Wales at his startup, Bomis, to lead the Nupedia project; Wikipedia spun out of that work. After Bomis laid him off amid the dot-com bust in 2002, Sanger resigned from the project entirely.
Sanger’s preference had always leaned toward Nupedia’s expert-driven model over Wikipedia’s open-editing ethos. A few years later, he launched his first and most successful (but still unsuccessful) Wikipedia alternative, Citizendium, which required contributors to register under their real names. Anonymity was Sanger’s first, but not his last, objection to Wikipedia. And that is the context in which Martin invokes him here. But there’s more to the story.
Sanger and Wales originally met on a message board about Objectivism, the radical individualist philosophy of Ayn Rand. Wales’s politics have since moderated to a center-left position, but Sanger’s evolution—or perhaps revelation—has taken him in a hard-right direction. Over the years, Sanger has repeatedly attacked Wikipedia, at one point calling the FBI over concerns about illegal images, and more recently aligning himself with conservative and right-wing figures. He has claimed that Wikipedia has “gone woke”, and is now regularly cited by right-leaning media outlets as the encyclopedia’s chief critic.
In the past few months, Sanger has tagged Donald Trump on social media to suggest an executive order banning federal employees from editing Wikipedia at work. He’s also appealed to Elon Musk, asking that DOGE investigate whether government agencies have hired Wikipedia consultants. Did he also contact Ed Martin? Unless Sanger says so, we’ll probably never know. But there’s a pattern.
10. Prevention of Repeat Offenders: What internal safeguards exist to prevent banned users from creating new accounts and resuming prohibited activities? How does the Foundation address concerns regarding the lack of a robust and transparent process to detect, deter, and exclude repeat offenders?
Plenty—and most of them aren’t the WMF’s responsibility. Still with me? I promise, we’re almost there.
11. Third-Party Contracts: What third-party entities, including AI and LLM companies and search engines, has the Foundation contracted with to use, redistribute, or process Wikipedia content? Please produce all related documents, contracts, amendments, and correspondence.
Technically, the WMF has no obligation to respond to this question. The letter is a request, not a subpoena. The Foundation’s contracts are its own business—and it will be interesting to see how much, if anything, it chooses to disclose.
12. Downstream Content Corrections: When editors or the Foundation delete harmful or illegal content that has already been shared with third parties, what steps are followed to mitigate the downstream effects on search results and LLM training data? What measures ensure that third parties and the broader public are informed of misinformation, bias, or other problems across Wikipedia and related projects?
This is an interesting question... but it’s really one for the AI companies. They’re the ones responsible for how and when their systems update training data or revise outputs based on changes to source material like Wikipedia.
I look forward to your cooperation with my letter of inquiry. Please respond by May 15, 2025. Should you have further questions regarding this matter, please do not hesitate to call my Office or schedule a time to meet in person.
All the best,
Edward R. Martin, Jr.
United States Attorney for the District of Columbia
Thanks, Ed! “All the best”—from the guy threatening to investigate Wikipedia. What a swell guy.
Concern Troll, Esq.
It’s time to just say it: Martin’s letter is a political threat masquerading as a legal inquiry. And it’s no anomaly. It’s part of a coordinated effort by right-wing activists, politicians, and their media enablers, who have grown increasingly hostile to any institution operating beyond their influence. The term “gaslighting” became a buzzword during Trump’s first term, describing manipulative attempts to erode trust in shared reality. It applies here.
From a legal perspective, the letter is untethered from jurisdiction, precedent, and basic constitutional principles. The U.S. Attorney’s Office is not the IRS. There is no law barring non-U.S. persons from contributing to Wikipedia or serving on its board. And several of Martin’s questions appear designed less to gather facts than to chill free speech.
From a Wikipedia perspective, the letter is riddled with errors. It conflates the Wikimedia Foundation’s role as host and steward with editorial control, and implies that any content violating Wikipedia’s own neutral point of view policy might somehow be at odds with federal law.
Martin never states outright why he sent the letter, but The Free Press—a Trump-friendly outlet that got the exclusive—reported that “a person close to Martin” cited Wikipedia’s coverage of the Israel–Hamas conflict. That tracks with a growing pressure campaign aimed at Wikipedia. In recent years, right-wing Jewish groups have criticized Wikipedia’s coverage of Israel, and the site has clashed with the increasingly hardline Anti-Defamation League as well.
Since returning to office, Trump and his allies have escalated a years-long campaign to delegitimize institutions that serve as arbiters of shared reality. Targets include tech companies, media outlets, elite law firms, Ivy League universities, and scientific institutions. While Trump goes after his usual foes—Harvard University, 60 Minutes—Martin has sent similar letters to Georgetown Law School and even to a scientific journal. It was only a matter of time before they came for Wikipedia.
Indeed, elements of the right have been gunning for Wikipedia for years. Besides Sanger’s social media activism, perhaps the clearest precursor to Martin’s letter came in February, when the New York Post editorial board published a jeremiad, “Big Tech must block Wikipedia until it stops censoring and pushing disinformation”, calling for vague but punitive action against Wikipedia. Martin has now stepped up to do exactly that.
Martin is an unlikely standard-bearer for a campaign framed, at least in part, around Wikipedia’s treatment of Israel. As CNN has reported, Martin has had ties to Nazi sympathizers and the white supremacist group VDARE. Donald Trump himself has trafficked in antisemitic tropes and dined with known antisemites.
Martin is also the sort of unlikely public official found throughout Trump’s second administration. A longtime Missouri GOP operative, he saw his brief, chaotic leadership of the late Phyllis Schlafly’s Eagle Forum end in a lawsuit and allegations of sockpuppetry. He had no prosecutorial background prior to this role. And, ironically—given the concerns raised in his letter—Martin appeared on Kremlin-backed media more than 150 times, a fact he failed to disclose in his Senate confirmation questionnaire.
Since taking office, he has pledged loyalty to Trump over his office, floated using his position to go after critics of Musk’s DOGE, defended January 6 rioters, and fired the prosecutors who handled those cases. He also has a bit of a reputation as a Twitter troll. Undermining the polite tone of his letter, Martin posted on X: “Hey @Wikipedia: you can run but you can’t hide!”
The Epicenter of “Fake News”
So why target Wikipedia? Because it embodies everything the MAGA movement resents: expertise, transparency, editorial standards. As others have noted, we are watching a long-running, coordinated effort to dismantle public trust in any institution not fully within MAGA’s grasp. From universities to the press, the judiciary, and now, Wikipedia, the common thread is hostility toward shared reality.
Wikipedia’s popularity makes it threatening. As a project whose mission is to centralize all of the news coverage MAGA hates—and which excludes the bad-faith framing and conspiratorial logic of its favored media—Wikipedia is the center of the “fake news” universe. Wikipedia is also a target because it bridges two worlds MAGA wants to dominate: institutions of knowledge and the public at large. It is, improbably, both elite and populist.
Yet Martin’s attack begins from the position that neither the Wikimedia Foundation nor the Wikipedia community cares about neutrality, accountability, or the public interest. Perhaps this is why it’s so insulting. Wikipedia’s editors and administrators argue constantly. They write, revise, fact-check, and defend their work in the open. The Wikimedia Foundation has entire teams dedicated to safety, privacy, disinformation, legal risk, and community health. The project is far from perfect, but it is constantly being refined by people who care deeply about how knowledge is produced and protected.
Wikipedia operates on consensus. It works because even if participants don’t always agree on what it says, they agree on the values and processes that underpin it. The U.S. political system, we are learning as it unravels, operated more like this than we once appreciated.
A Choice, Not an Echo
The Wikimedia Foundation now faces a defining choice—one of several it has been forced to confront in recent years. Will it bend to a bad-faith political stunt, or stand firm for the values Ed Martin and his allies deride?
Its recent decision to comply with an Indian court order in an unrelated case has raised uncomfortable questions about its willingness to hold the line under pressure. Most Americans don’t track international legal disputes or know much about Narendra Modi. But they do know Donald Trump. And they know Wikipedia.
Few institutions have stood firm through the Trump era. Harvard, now under similar attack, has signaled its intent to fight. Wikipedia has the opportunity—and the obligation—to do the same. All the best, Wikimedia legal team.
He’s not really upset that Wikipedia has to contend with people trying to insert propaganda. He’s just upset that he doesn’t control who is permitted to insert it.