Report: January 6 investigators confirm that social media platforms “bent their rules to avoid penalizing conservatives” ahead of the insurrection
New reporting proves what Media Matters has shown again and again — social media companies helped set the stage for January 6 and enabled election misinformation to spread unchecked
Written by Spencer Silva
On Tuesday, The Washington Post reported that House January 6 committee investigators found — but ultimately did not publish — mountains of evidence demonstrating that social media companies provided “megaphones” for right-wing extremists ahead of the Capitol insurrection and refused to enforce their own content moderation policies out of fear of conservative backlash. This reporting comes as stolen election lies recently fueled similar attacks in Brazil and former President Donald Trump prepares his return to social media, highlighting the need for platforms to effectively respond to misinformation.
As part of the January 6 committee’s investigation, a group of congressional staffers spent more than a year “sifting through tens of thousands of documents from multiple companies, interviewing social media company executives and former staffers, and analyzing thousands of posts.” The report, as summarized by the Post, concluded that “roughly 15 social networks played a significant role in the attack,” and some platforms even “bent their rules to avoid penalizing conservatives out of fear of reprisal” ahead of the insurrection.
From The Washington Post:
Congressional investigators found evidence that tech platforms — especially Twitter — failed to heed their own employees’ warnings about violent rhetoric on their platforms and bent their rules to avoid penalizing conservatives, particularly then-president Trump, out of fear of reprisals. The draft report details how most platforms did not take “dramatic” steps to rein in extremist content until after the attack on the Capitol, despite clear red flags across the internet.
“The sum of this is that alt-tech, fringe, and mainstream platforms were exploited in tandem by right-wing activists to bring American democracy to the brink of ruin,” the staffers wrote in their memo. “These platforms enabled the mobilization of extremists on smaller sites and whipped up conservative grievance on larger, more mainstream ones.”
Reporting of these findings, which were laid out in a 122-page draft memo that was circulated among the committee, comes just weeks after right-wing protesters in Brazil stormed the country’s presidential palace and other federal buildings, fueled by lies about election fraud that spread widely on social media — especially on Twitter, where CEO Elon Musk fired nearly all staff in charge of content moderation in Brazil.
Meanwhile, social media platforms have started to ease the restrictions on Trump that were enacted following the Capitol insurrection. Musk reinstated Trump on November 19 after conducting an unreliable and unscientific Twitter poll. Meta, the parent company of Facebook and Instagram, stopped fact-checking Trump after his 2024 presidential campaign announcement and is expected to decide whether “the risk to public safety” has “receded” enough to allow him back on its platforms. (Trump has not yet posted on his reinstated Twitter account but is reportedly preparing to return to the platform and has petitioned Meta to reinstate his accounts.)
The committee’s findings confirm Media Matters’ multitude of reports showing that the world’s largest social media platforms — Twitter, Facebook, YouTube, and Reddit — failed to stem the tide of election lies and far-right organizing washing over their platforms before the Capitol insurrection. The threat of such misinformation and conspiracy theories has only continued to rise.
Twitter
The committee’s reported findings single out Twitter as a platform that failed to heed internal warnings about potential violence ahead of January 6, 2021. In fact, during a video call the day before the violence at the Capitol, one Twitter employee reportedly argued that “there might be someone getting shot tomorrow” in part because the company had not codified an incitement-to-violence policy that had been drafted by its safety team — seemingly out of fear that it would necessitate deactivating a sitting president’s account.
For years, conservatives have amplified debunked claims of social media platforms’ supposed bias against conservatives, and the committee revealed that “the sheer scale of Republican post-election rage paralyzed decisionmakers at Twitter and Facebook, who feared political reprisals if they took strong action.” One former senior employee of Twitter’s now-skeletal safety policy team testified to congressional investigators that Trump actually “sat above and beyond the rules of Twitter.” According to the Post, employees weren’t even able to view Trump’s tweets using the company’s internal content moderation tools:
Some of what investigators uncovered in their interviews with employees of the platforms contradicts Republican claims that tech companies displayed a liberal bias in their moderation decisions — an allegation that has gained new attention recently as Musk has promoted a series of leaked internal communications known as the “Twitter Files.” The transcripts indicate the reverse, with former Twitter employees describing how the company gave Trump special treatment.
Twitter employees, they testified, could not even view the former president’s tweets in one of their key content moderation tools, and they ultimately had to create a Google document to keep track of his tweets as calls grew to suspend his account.
While other social media companies tout their insufficient measures to contain election misinformation, Musk’s Twitter has moved in the opposite direction. In the past few months, the platform’s billionaire owner has restored the accounts of an assortment of right-wing trolls, including “Stop the Steal” founder Ali Alexander, who spent the months before his reinstatement calling into question the results of the Brazilian election and recently applauded far-right protesters’ efforts to overthrow the country’s government.
Facebook
In the aftermath of the 2020 election, Media Matters and others reported on the scores of Facebook groups and millions of Facebook users who joined groups and events organized to “stop the steal.” The day before the Capitol insurrection, Media Matters warned that users in private Facebook groups were encouraging each other to break Washington, D.C., gun laws ahead of the planned protests.
Despite Facebook’s bold claim that its employees could not have known “whether what we were seeing was a coordinated effort to delegitimize the election,” the committee’s reported findings confirm that “fear of reprisal and accusations of censorship from the political right compromised policy, process, and decision-making” at the company, and that it “did not even try” to “grapple with election delegitimization after the election.”
According to the Post, investigators found that Facebook was wary of removing “Stop the Steal” content from its platform because doing so would have potentially necessitated deactivating the accounts of many conservative influencers and media outlets — an untenable proposition for a company that has given preferential treatment to right-wing media and the platform’s millions of conservative users.
Former Facebook employees who testified to the committee reported their company also resisted imposing restrictions. Brian Fishman, the company’s former head of dangerous organizations, testified that the company had been slow to react to efforts to delegitimize the 2020 election results.
“I thought Facebook should be more aggressive in taking down ‘Stop the Steal’ stuff before January 6th,” Fishman said. He noted, however, that broader action would have resulted in taking down “much of the conservative movement on the platform, far beyond just groups that said ‘Stop the Steal,’ mainstream conservative commentators.”
He said he did not believe such action “would have prevented violence on January 6th.”
More than two years after the insurrection, Facebook still allows its users to post “Stop the Steal” content and misinformation about the events of January 6. As the company now considers whether to reinstate Trump, perhaps the most prolific poster of misinformation of all, other extremist movements continue to thrive on the platform.
YouTube
Congressional investigators also found that much of the “Stop the Steal” content spreading on Twitter, Facebook, and other social media sites had originally been published on YouTube, which did not ban election fraud claims until December 9, 2020 — and as Media Matters found, the platform did not adequately enforce that policy until it was much too late. From The Washington Post:
The investigators also wrote that much of the content that was shared on Twitter, Facebook and other sites came from Google-owned YouTube, which did not ban election fraud claims until Dec. 9 and did not apply its policy retroactively. The investigators found that its lax policies and enforcement made it “a repository for false claims of election fraud.” Even when these videos weren’t recommended by YouTube’s own algorithms, they were shared across other parts of the internet.
“YouTube’s policies relevant to election integrity were inadequate to the moment,” the staffers wrote.
While some conservative influencers have left for alternative platforms with virtually no content moderation, like Rumble, YouTube has continued to struggle with election misinformation. During the 2022 midterm election cycle, Media Matters found that YouTube was filled with videos — in both English and Spanish — that featured conspiracy theories about stolen elections and other election misinformation that seemed to violate the company’s content moderation policies.
Reddit
The draft report also calls out Reddit for dragging its heels before banning the subreddit r/The_Donald, the now-defunct pro-Trump forum that featured a range of bigoted content and calls to violence before it was shut down in June 2020.
The subreddit first spent almost a year “quarantined,” a status that didn’t prohibit its moderators from promoting other fringe subreddits with links to QAnon, the Proud Boys, and other far-right causes. Crucially, that quarantine period allowed r/The_Donald’s moderators to create backup forums off Reddit — including TheDonald.win, which according to the draft report is where “users discussed constructing the gallows that stood ominously in front of the Capitol the day of the attack.”