Meta says it will finally extend its third-party fact-checking program to Threads
The announcement comes five months after Threads launched without a fact-checking program and Meta told Media Matters it was “considering additional ways to address misinformation in future updates”
Written by Natalie Mathes
On December 12, Meta announced that “early next year” it will extend its fact-checking system to allow its third-party partners “to review and rate false content on Threads,” its text-based platform that launched in July. The announcement, which comes after Media Matters previously highlighted the platform’s content moderation gaps, is a positive step for the company, but it is unclear whether it will be enough to prevent the spread of misinformation on Threads.
After Threads debuted in July, Media Matters highlighted that, unlike on sister platforms Facebook and Instagram, original content on Threads was not subject to Meta’s fact-checking program. We found that a number of far-right figures signed up for Threads when it launched and began testing the limits of its content moderation policies, posting claims like “migrants make neighborhoods more dangerous,” “the 2020 election was rigged and everyone knows it,” and “transwomen dont exist.” In response to our July reporting, a Meta spokesperson told Media Matters that the company was “considering additional ways to address misinformation in future updates.”
Five months later, as Mashable reported, “This seems like that update”:
Threads already has a hate speech problem, as Mashable's Chase DiBenedetto reported warnings from civil rights groups in July. At the time, a Meta spokesperson told Mashable and Media Matters for America in a statement, “Our industry leading integrity enforcement tools and human review are wired into Threads. Like all of our apps, hate speech policies apply. Additionally, we match misinformation ratings from independent fact checkers to content across our other apps, including Threads. We are considering additional ways to address misinformation in future updates.” This seems like that update, set to roll out “early next year”.
But Meta’s announcement also indicated a key gap in Threads’ moderation policy: Meta will give Threads users in the U.S. the ability to opt out of part of its fact-checking program. In August, Meta gave Facebook and Instagram users the ability to opt out of a misinformation-reduction measure that automatically limited the visibility of fact-checked content in their feeds. The company apparently intends to bring this option to Threads as well, with the December 12 announcement noting, “We’re also bringing these controls to Threads to give people in the US the ability to choose whether they want to increase, lower or maintain the default level of demotions on fact-checked content in their Feed.”
Threads is also adding features that may increase the speed with which content spreads on the platform, potentially amplifying both accurate and inaccurate information. The company needs to be prepared for the likelihood that these added features will give misinformation on Threads greater reach.
In September, for example, the platform launched its search function but deliberately blocked certain “sensitive” keywords, including “covid” and “long covid.” Meta confirmed to The Washington Post that it was blocking certain terms from search until it was “confident in the quality of the results.” But researchers have expressed concerns about blocking such broad terms. “Censoring searches for covid and long covid will only leave an information gap that will be filled by misinformation from elsewhere,” said Lucky Tran, director of science communication at Columbia University.
On December 8, Meta also launched its new tagging feature globally; tags appear as blue hyperlinks without the “#” symbol. Adding these tags to posts is likely to make them more visible, increasing their reach regardless of whether they contain accurate information. These tags, of which users may add only one per post, might also be used to determine trending topics on the platform, another new feature that Threads is reportedly considering. Researchers and journalists have argued that trending topics features are “gameable” and have often contributed to the spread of misinformation. Facing criticism, Facebook removed the “Trending” news box from its website in 2018.
As Meta adds more features to Threads, the platform could become even more vulnerable to the spread of hateful and false content. Enforcing its policies across its platforms is an ongoing challenge for the company, and Media Matters has repeatedly found that hate and misinformation continue to spread on Meta’s other platforms.