Meta, the Facebook owner, is quietly winding down some safeguards designed to thwart misinformation about voting or foreign interference in US elections as the November midterm elections approach.
It’s a sharp departure from the social media giant’s multibillion-dollar effort to boost the accuracy of posts about US elections and restore the trust of lawmakers and the public, who were outraged to learn that the company had exploited people’s data and allowed falsehoods to overrun its site during the 2016 campaign.
The shift is raising alarm about Meta’s priorities and about how some of the world’s most popular social media platforms could be used to spread misleading claims, launch fake accounts, and inflame partisan extremists.
“They don’t talk about it,” said former Facebook policy director Katie Harbath, who is now CEO of the technology and policy firm Anchor Change. “Best case scenario: They’re still doing a lot behind the scenes. Worst case scenario: They’re pulling back, and we don’t know how that will play out in the midterm elections on the platforms.”
Since last year, Meta has shut down an examination of how falsehoods are amplified in political ads on Facebook by indefinitely banning the researchers behind it from the site.
CrowdTangle, the online tool the company provided to hundreds of newsrooms and researchers so they could spot trending posts and misinformation on Facebook and Instagram, is now inoperable on some days.
Public communication about the company’s response to election misinformation has gone quiet. Between 2018 and 2020, the company issued more than 30 statements detailing how it would stifle disinformation about US elections, prevent foreign adversaries from running ads or posts about voting, and subdue divisive hate speech.
Senior executives hosted question-and-answer sessions with reporters about the new policies. CEO Mark Zuckerberg wrote Facebook posts promising to remove false voting information and authored opinion pieces calling for more regulation to tackle foreign interference in US elections via social media.
But this year, Meta has released only a one-page document outlining its plans for the fall election, even as potential threats to the vote remain clear. Several Republican candidates are promoting false claims about US elections on social media. In addition, Russia and China continue to wage aggressive propaganda campaigns on social media aimed at deepening political divisions among the American public.
Meta says elections remain a priority and that policies developed in recent years around disinformation about elections or foreign interference are now well established in the company’s operations.
“With each election, we integrate what we’ve learned into new processes and create channels to share information with government and our industry partners,” said Meta spokesman Tom Reynolds.
He declined to say how many employees will work full-time on protecting US elections this year.
During the 2018 election cycle, the company offered tours and photos of its election response “war room” and publicized its staffing numbers. But the New York Times reported that the number of Meta employees working on this year’s election has fallen from 300 to 60, a figure Meta disputes.
Reynolds said Meta will pull hundreds of employees from 40 of the company’s other teams to monitor the upcoming vote alongside its election team, whose size he would not specify.
The company is continuing many of the initiatives it has developed to reduce election misinformation, such as a fact-checking program begun in 2016 that enlists the help of news outlets to investigate the validity of common falsehoods spread on Facebook or Instagram. The Associated Press is part of the Meta fact-checking program.
This month, Meta also launched a new political ads feature that allows the public to search for details on how advertisers target people based on their interests across Facebook and Instagram.
However, Meta has hampered other efforts to identify election misinformation on its sites.
Meta has stopped making improvements to CrowdTangle, a tool it offered to newsrooms around the world that provides insights into trending social media posts. Journalists, fact-checkers, and researchers have used it to analyze Facebook content, including tracking common misinformation and who is responsible for it.
The tool is now “moribund,” Brandon Silverman, the former CEO of CrowdTangle who left Meta last year, told the Senate Judiciary Committee this spring.
Silverman told the AP that CrowdTangle had been working on upgrades that would, for example, make it easier to search the text of internet memes, which are often used to spread half-truths while evading the notice of fact-checkers.
“There is no real shortage of ways you can organize this data to make it useful for many different parts of the fact-checking community, newsrooms, and broader civil society,” Silverman said.
Not everyone at Meta agreed with this transparent approach, Silverman said. The company hasn’t released any updates or new features for CrowdTangle in over a year, and it’s seen hours of outages in recent months.
Meta has also halted efforts to investigate how disinformation is transmitted through political ads.
The company indefinitely revoked Facebook access for a pair of New York University researchers it said had collected unauthorized data from the platform. The move came hours after New York University professor Laura Edelson said she had shared plans with the company to investigate the spread of misinformation on the platform around the January 6, 2021, attack on the US Capitol, which is now the subject of a House investigation.
“What we found, when we looked closely, is that their systems may have been dangerous to many of their users,” Edelson said.
Privately, former and current Meta employees say revelations about these risks around US elections have created a public and political backlash for the company.
Republicans routinely accuse Facebook of unfairly censoring conservatives, some of whom have been removed from the platform for breaking the company’s rules. Meanwhile, Democrats regularly complain that the tech company hasn’t gone far enough to curb misinformation.
“It’s so politically fraught, they’re more trying to shy away from it than to jump in head first,” said Harbath, the former Facebook policy director. “They just see it as a big old pile of headaches.”
Meanwhile, the possibility of regulation in the United States no longer looms large over the company, with lawmakers failing to reach any consensus about what oversight the multibillion-dollar company should be subject to.
Freed from that threat, Meta’s leaders have devoted the company’s time, money, and resources to a new project in recent months.
Zuckerberg dove into this rebranding and reorganization of Facebook last October, when he changed the company’s name to Meta Platforms Inc. and shifted its focus to the “metaverse,” a version of the internet brought to life, rendered in 3D.
His public Facebook posts now focus on product announcements, praise for artificial intelligence, and photos of him enjoying life. News of election preparedness is announced in company blog posts he did not write.
In a post last October, written after a former Facebook employee leaked internal documents showing how the platform amplifies hate and misinformation, Zuckerberg defended the company. He also reminded his followers that he had pushed Congress to update election-related regulations for the digital age.
“I know it is frustrating to see good work being mischaracterized, especially for those of you who are making important contributions in the areas of safety, integrity, research and product,” he wrote on October 5. “Over the long term, if we keep trying to do what’s right and provide experiences that improve people’s lives, it will be better for our community and our business.”
It was the last time he discussed the Menlo Park, California-based company’s election work in a public Facebook post.
Associated Press technology writer Barbara Ortutay contributed to this report.
Follow the Associated Press’s coverage of misinformation at https://apnews.com/hub/misinformation.