Facebook can’t allow itself to become an Orwellian “ministry of truth,” according to Alex Stamos, the social network’s recently departed chief security officer. His comments follow Facebook’s decision to ban the digital shock jock Alex Jones.
Stamos, speaking with MSNBC on Sunday, weighed in on his former company’s decision to oust Jones, saying it was prudent of Facebook to take action if Jones was calling out certain individuals and “sending his followers after them.” But Stamos warned of the outsized power and responsibility that comes with determining which voices are allowed to stay on the major tech platforms — alluding to the propaganda machine in George Orwell’s 1984 in the process.
“These companies are incredibly powerful. Some of them are approaching $1 trillion in market capitalization,” Stamos told Kasie Hunt. “And I don’t think a good outcome at this time would be to have the massive tech companies effectively operating as a ministry of truth, of deciding who is a journalist or not, what is outside the acceptable bounds of behavior.”
Jones was booted from Facebook for violating “bullying” and “hate speech” policies earlier this month. The conspiracy theorist — who has made several infamous claims, perhaps most glaringly that the Sandy Hook school shooting was a “hoax” — also had his show kicked off Spotify and Apple at the same time.
Stamos, who recently became a professor at Stanford, distanced himself from the strategy of banning figures like Jones outright. “I think [major tech companies] need transparent rules around the safety of individuals, and to enforce that in a vigorous and, again, transparent way,” said Stamos. “But I don’t think we should rush into the idea that the best way to deal with people like Alex Jones is to have them disappear off the internet. I don’t think that solves some of the root problems that we’re dealing with.”
Stamos also touched on another headache Silicon Valley is scrambling to cure: foreign manipulation of the 2018 U.S. midterms. Facebook was notoriously leveraged by Russian trolls during the 2016 presidential election to spread misinformation to more than 100 million users. Despite Facebook and other tech giants taking several steps to ensure this doesn’t happen again, Stamos said last week’s revelation that Russian and Iranian agents were running political misinformation campaigns in the U.S. showed it remains an ongoing problem.
“The Russian actors have not been deterred,” said Stamos. “The Russian government is still — wants to be involved in the U.S. elections. And that their playbook is going to be picked up by a number of other countries.”