Bloomberg: Meta Pulls Support for Tool Used to Keep Misinformation in Check

On May 17, as several states held their primary elections, Jesse Littlewood searched the internet using a tool called CrowdTangle to spot the false narratives he knew could change perceptions of the results: damaging stories about ballots being collected and dropped off in bulk by unauthorized people, who the misinformation peddlers called “ballot mules.”

Littlewood, the vice president for campaigns with the voter advocacy group Common Cause, easily came across dozens of posts showing a “Wanted” poster falsely accusing a woman of being a ballot mule in Gwinnett County, Georgia. He sounded the alarm with Facebook and Twitter. “This was going to lead to threats and intimidation of this individual who may be an elections worker, and there was no evidence that this person was doing anything illegal,” Littlewood said. “It needed to be removed.”

Meta Platforms Inc.’s Facebook owns the search tool Littlewood used, and the company has for months kept its plans for CrowdTangle a mystery. Meta has been reducing its support for the product. The company is expected to eventually scrap it, and has declined to say when it plans to do so. Not knowing the future of CrowdTangle or what Meta chooses to replace it with, Littlewood said, endangers planning for future elections. The group has thousands of volunteers working in shifts to identify false information online, and CrowdTangle is indispensable to the process.

Common Cause’s work “would be impossible to do without a tool that looks across Facebook,” Littlewood said. CrowdTangle gives insight into posts on Instagram, Twitter and Reddit too. “And we all know that the midterms are testing grounds for 2024, when the level of disinformation will be even higher.”

Researchers rely not only on the tool but also on the companies to act on the harmful-content reports they file. Twitter removed the misinformation Littlewood flagged in May; on Facebook, which didn’t respond to his warning, at least 16 of the posts remained in mid-June. Facebook took them down after media outlets, including ProPublica and Bloomberg News, reached out.