Meta’s fact-checking partners push back on 'censorship' accusations
Meta currently works with 10 independent fact-checking groups to monitor content on its platforms in the United States.
Independent fact-checking groups that have worked with Meta expressed disappointment this week at CEO Mark Zuckerberg’s Jan. 7 announcement that Facebook, Instagram and Threads would no longer use their services. In particular, they took issue with the suggestion, made by Zuckerberg and Meta’s chief global affairs officer Joel Kaplan, that the decision to shift to a Community Notes approach was driven in part by bias and censorship on the part of fact-checkers.
“We've reached a point where it's just too many mistakes and too much censorship,” Zuckerberg said in his announcement. “[Now] we're going to rely on someone reporting an issue before we take action.”
Instead of employing third-party fact-checking groups to monitor content for misinformation, Meta platforms will now use a Community Notes approach, similar to the one used by X under Elon Musk, which relies on users to flag content they consider inaccurate or in need of more context.
While there is no specific date for when the approach will go into effect, Zuckerberg said that it will be first implemented in the U.S., where 10 independent third-party groups have done fact-checking for Meta for years.
One of the groups, PolitiFact, has been in partnership with Meta since 2016 when the company began to ramp up its fact-checking efforts following criticism over its apparent failure to deal with misinformation leading up to that year’s presidential election.
Aaron Sharockman, PolitiFact’s executive director, has worked with Meta for over eight years. He told Yahoo News the team is disappointed in Meta’s decision to end the program and to frame it as if the third-party fact-checker groups were censoring content on the platforms.
“PolitiFact, nor any of the nine other U.S.-based fact-checking partners, censor or remove content,” he said. “It’s on their own website that Facebook states very plainly that the fact-checking program is not a means to remove or censor content.”
Sharockman and Jesse Stiller, the managing editor of Check Your Fact, another third-party fact-checking group Meta has worked with since 2019, told Yahoo News that fact-checkers were not responsible for removing anything from Meta’s platforms.
Instead, the organizations sent fact-checking information to Meta, and Meta then created the rules and policies for how those fact checks were applied to posts on its platforms.
“The reality is if Meta and Facebook were worried about penalties, well, they could have changed [the rules and policies],” Sharockman said. “That wasn’t in our purview. Our purview was to add information and context to claims online that were viral and contained potentially harmful misinformation.”
“We did not remove posts on our own will,” Stiller told Yahoo News about Check Your Fact’s partnership with Meta. “We had no control over what was taken down or what was removed. Anyone who says that has zero clue on the process.”
Lori Robertson, managing editor of FactCheck.org, another Meta partner, said something similar in her public statement, which she shared with Yahoo News.
“Under the Meta program, we provided links to our articles to Meta,” she wrote. “We did not, and could not, remove content. Any decisions to do that were Meta’s.”
Allegations of bias
Part of Meta’s argument for getting rid of independent fact-checkers was that, as Kaplan put it in his statement, “Experts, like everyone else, have their own biases and perspectives.”
“This is ridiculous,” Stiller, of Check Your Fact, said. “We provided unbiased, fair and balanced approaches to all of our stories. Didn't matter if it was left or right, our goal was to talk about the facts and give people a lens to look through.”
While Check Your Fact is funded by the Daily Caller, a right-wing news outlet co-founded by Tucker Carlson, Stiller insisted to Yahoo News that the fact-checkers are “editorially independent.”
“If we were biased, we wouldn't call ourselves ‘fact-checkers,’” he said.
Angie Holan, the director of the International Fact-Checking Network, another group that works with Meta and reportedly held an emergency meeting in the aftermath of Tuesday’s announcement, explained in her public statement that “the fact-checkers used by Meta follow a Code of Principles requiring nonpartisanship and transparency.”
Sharockman explained that, even if a fact-checker in one of the groups was biased, Meta was working with 10 different teams, so any inconsistencies among the fact-checkers and the groups, if they existed, would have made biases apparent from the outset.
“If you have different people working independently, reaching the same conclusion, that’s a good, strong signal that the information isn’t biased,” Sharockman said.
A surprise announcement
The announcement came as a surprise to some of the fact-checking groups that work with Meta. Stiller, for example, said he and his team “never expected” Meta to pivot to Community Notes — especially since, he said, 2024 was “the best year” for Check Your Fact in terms of “coverage and traffic, thanks to the program” with Meta.
But not everyone was totally blindsided by the move. Sharockman said he and his team had been wondering if Meta would end the partnership, especially in light of backlash to particular fact-checking efforts in recent years.
“There’s been constant pressure by outside groups on Facebook over this [fact-checking] program,” Sharockman said, citing a 2020 lawsuit against PolitiFact filed by Children’s Health Defense, Robert F. Kennedy Jr.’s group, over a fact check that labeled as “false” a Facebook post in which the group claimed the flu vaccine was “significantly associated” with an increased risk of COVID-19.
Sharockman said he is not opposed to the idea of Community Notes, pointing to Wikipedia’s reliance on a volunteer editor community to fact-check information as an example of how this can work. However, he objects to Meta’s decision to make Community Notes the sole fact-checking resource for its platforms without testing it first. It also does not help that research has found that X’s Community Notes feature, which was expanded under Musk after he took over in 2022, isn’t always accurate.
“It’s just not ending the way that I would like,” Sharockman said.