Most Americans have little faith that major social media platforms like Facebook and Twitter are doing the right thing when it comes to censorship and policing misinformation, according to new research released Wednesday by Pew.
Two-thirds of Americans said they had little or no confidence in companies "being able to determine which posts on their platforms should be labeled as inaccurate or misleading," Pew said. (In related news, Facebook last week said it had purged 7 million "harmful" posts on COVID-19 between April and June.) Pew's data was based on 4,708 survey respondents from mid-to-late June.
At the same time, Pew found 73% of Americans believe it's "very" or "somewhat" likely tech giants intentionally censor political viewpoints they disagree with. There was a stark divide on this point along party lines: nine out of 10 Republicans said it was at least somewhat likely platforms censored certain political takes, compared with 59% of Democrats.
The results come as Facebook and Twitter have been more willing to censor high-profile accounts like President Trump's in recent months. Twitter, for example, has added fact-check and warning labels to some of the president's tweets since late May. Facebook, meanwhile, pulled down a Trump campaign post in June that included a red triangle, saying it violated the company's policy against "organized hate" because the symbol had been used in Nazi Germany. Trump's campaign pushed back strongly, saying the triangle represented Antifa and that the ad was criticizing the group's recent involvement in protests and riots. And both companies recently required the removal of a Trump post claiming kids are "virtually immune" from COVID-19.
Despite some of its recent decisions, Facebook has looked to differentiate itself from Twitter when it comes to the censorship debate. Last month, Facebook CEO Mark Zuckerberg argued before Congress that his company has "distinguished" itself by supporting free expression more than its tech contemporaries.
Pew's data comes a week after a 20,000-person survey from Piplsay found a majority -- 56% -- of Americans "do not trust social media platforms to combat disinformation and foreign interference ahead of the election."
Facebook in particular was criticized for its inability to weed out Russian trolls during the 2016 election. Kremlin-backed accounts often shared poorly worded memes touching on hot-button topics like gun control and immigration. One meme, shared during a 2018 congressional hearing investigating election interference, claimed "President Hillary Clinton will definitely end private gun ownership in the U.S." You get the idea.
"We were behind in 2016," Zuckerberg said earlier this year. "But by any objective measure, our efforts on election integrity have made a lot of progress." He pointed to the company hiring more people dedicated to catching political misinformation, as well as AI tools to help spot "inauthentic" behavior.
Still, there is little reason to believe bogus Russian ads played a major role in shaping the results of the 2016 election. A study from Oxford University found Russian trolls, from the St. Petersburg-based Internet Research Agency, spent less than $75,000 on Facebook ads between 2015 and 2017. Facebook, for reference, made $7 billion in revenue during the third quarter of 2016.