Facebook and TikTok failed to block paid advertisements spreading misinformation about voting in the 2022 midterm elections, according to a new report from the human rights non-profit Global Witness and the Cybersecurity for Democracy (C4D) Team at New York University.
For its test, Global Witness created dummy accounts on Facebook, TikTok, and YouTube and attempted to run ten identical advertisements on each platform in both English and Spanish. Each ad contained "outright false and misleading election misinformation," such as telling voters they needed to be vaccinated to vote in person, or that they would need to vote twice for their votes to count. Other ads simply gave the wrong date for voting. The ads targeted voters in Arizona, Colorado, Georgia, North Carolina, and Pennsylvania – all battleground states. Global Witness then submitted the ads through each company's official ad platform.
All of the submitted ads violated all three platforms' policies.
Yet YouTube was the only platform to block all of them, and the only one to also block the accounts that attempted to run the ads.
"For years we have seen key democratic processes undermined by disinformation, lies and hate being spread on social media platforms – the companies themselves even claim to recognise the problem," said Global Witness Senior Advisor Jon Lloyd in a statement. "But this research shows they are still simply not doing enough to stop threats to democracy surfacing on their platforms.”
"YouTube’s performance in our experiment demonstrates that detecting damaging election disinformation isn’t impossible," C4D co-director Damon McCoy said. "But all the platforms we studied should have gotten an 'A' on this assignment."
According to Global Witness and C4D, Facebook approved 20 percent of the English-language ads and 50 percent of the Spanish-language ads in one test. In a second test, Facebook approved 30 percent of the English ads and 20 percent of the Spanish ads. More alarming still, the ads in that second test were submitted from a dummy account based in the United Kingdom, an account that should not have been able to run political ads in the U.S. at all.
Facebook also deleted only one of the three dummy accounts used to run the misinformation ads.
In a statement provided to researchers, Facebook said "these reports were based on a very small sample of ads, and are not representative given the number of political ads we review daily across the world."
But Facebook did not fare the worst in this experiment. That designation goes to TikTok.
Unlike YouTube and Facebook, the short-form video platform doesn't allow political advertising at all. Yet TikTok approved 90 percent of the ads featuring voting misinformation. The only ads it rejected were those claiming a COVID vaccine was necessary to vote.
In a statement provided to researchers, a TikTok spokesperson said that the platform "is a place for authentic and entertaining content which is why we prohibit and remove election misinformation and paid political advertising from our platform."
However, the TikTok account used to submit the voting misinformation ads was not removed until the researchers reached out to the company.
Notably, none of these ads were actually shown to users on any platform: the researchers deleted the ads after they were approved, before they could run.
It's been well established that social media platforms have increasingly been weaponized to spread election misinformation. Foreign actors spreading falsehoods about U.S. elections on social networks like Facebook was a major story in 2016. And going into 2020, election fraud misinformation ramped up even further on these platforms, alongside the spread of COVID conspiracy theories and dangerous movements like QAnon.
TikTok has grown enormously in just the past two years, and it brings new misinformation problems to the social media mix. Online election misinformation can create real-world problems: just this past week, for example, election fraud conspiracy theorists were staking out ballot drop boxes to intimidate voters.
And while Facebook and YouTube have certainly taken action since those election cycles, it's clearly not enough.
The 2022 midterm elections are just weeks away, on Tuesday, Nov. 8. That doesn't leave much time for social media platforms to get their acts together. But it certainly is plenty of time for misinformation to spread.