Mark Zuckerberg Is ‘Worried’ About Fake News, and 6 Other Things We Learned From Wired’s Facebook Feature

“I don’t think he sleeps well at night,” one Facebook employee says of Zuckerberg

Talk about a comprehensive look inside “the social network.”

Internal leaks, a decade-long rivalry with Rupert Murdoch, and the fallout from the 2016 U.S. election are all heavily detailed in Wired’s “Inside the Two Years That Shook Facebook — and the World.”

At the center of the action is Facebook CEO Mark Zuckerberg. The 33-year-old wunderkind comes across as a man grappling with the massive responsibility of shepherding a company with 2 billion users while confronting fake news, a problem it was slow to address. Now, Zuckerberg and his team of execs are scrambling to combat disinformation before it drives users and advertisers away for good.

If you don’t have time to sift through the meticulously reported 11,000-word feature, here are seven key moments that stood out.

1) Facebook takes leaks seriously

The report starts off by showing that Facebook doesn’t take leaks lightly. Benjamin Fearnow was a contract worker on Facebook’s “Trending Topics,” which highlighted certain news stories when users opened the site. The Columbia grad sent a few internal memos to Gizmodo reporter Michael Nuñez, including one Zuckerberg sent after an employee crossed out “Black Lives Matter” and replaced it with “All Lives Matter” on a company wall.

“We’ve never had rules around what people can write on our walls,” said Zuckerberg in the memo. But “crossing out something means silencing speech, or that one person’s speech is more important than another’s.”

Fearnow was soon surprised by his bosses, who told him they had a copy of his Gchat messages with Nuñez. He was fired. Another worker on the Trending Topics team was fired for “liking” the Gizmodo story and for being friends with Nuñez on Facebook.

2) Neutrality was a religious tenet 

Not showing political favor was a guiding principle at Facebook leading up to the 2016 election. The social network doubled down on this mantra after Gizmodo published a story in May 2016 in which former Facebook workers said they had “suppressed” conservative news and pushed more left-leaning content in the Trending section. After this, “Facebook became wary of doing anything that might look like stifling conservative news,” wrote Wired.

There was a business reason for this, of course. Section 230 of the 1996 Communications Decency Act safeguarded Facebook against responsibility for what its users post. But “if Facebook were to start creating or editing content on its platform, it would risk losing that immunity — and it’s hard to imagine how Facebook could exist if it were liable for the many billion pieces of content a day that users post on its site,” wrote Wired. This would not only be a logistical nightmare but also force Facebook to increase its spending on content review.

3) Long rivalry with Rupert Murdoch

News Corp. CEO Rupert Murdoch made waves last month when he said Facebook should pay publishers a “carriage fee,” similar to pay-TV. But it wasn’t the first time Murdoch had gone after Facebook. Wired details a July 2016 conference where Murdoch confronted Zuckerberg about the “existential threat to serious journalism” that Facebook and Google had become. Murdoch said Facebook could expect News Corp. execs to be more vocal in their criticism if his concerns weren’t addressed.

The bad blood dates all the way back to 2007. At the time, Facebook was under scrutiny from 49 state attorneys general for doing little to protect young users from sexual predators. But many of the formal complaints were fakes, sent from IP addresses set up by News Corp. workers at Myspace, according to an internal Facebook investigation. This made Murdoch’s skills in the “dark arts” clear to Zuckerberg.

4) FB execs knew Trump used their site better than Clinton

While “almost everyone” on Facebook’s executive team was pro-Hillary Clinton during the 2016 election, they privately admitted Donald Trump was much better at leveraging the social network.

5) FB didn’t know about Russian meddling until months after the election

Facebook was slow to realize Russian trolls used its advertising business to target American voters.

“I would draw a real distinction between fake news and the Russia stuff,” a Facebook exec told Wired. “With the latter there was a moment where everyone said ‘Oh, holy shit, this is like a national security situation.’”

That moment came about six months after the election, after a Time article hinted that Facebook and a Senate investigation hadn’t looked at every variable. Facebook’s security team started to sift through its ad archives, searching for accounts that used Russian as their language setting or had paid for ads in rubles. Facebook ended up finding a “cluster” of accounts paid for by the Kremlin-tied Russian Internet Research Agency. This revelation set off a “crisis” within the company, with execs debating how to best present this information — or whether to present it at all.

In September 2017, Facebook disclosed that about $100,000 worth of ads had been purchased by those Russian accounts. The company eventually told Congressional investigators in November that Russian IRA ads had been seen by more than 100 million users.

6) Zuckerberg’s evolution on fake news

The central theme of Wired’s feature is Zuckerberg’s change in attitude on the fake news issue. He was initially dismissive of its impact, saying it was “pretty crazy” to think it tipped the election one way or another.

One employee told Wired that watching Zuckerberg reminded him of Lennie in “Of Mice and Men,” “the farm-worker with no understanding of his own strength.”

But mounting criticism — and data on Russian interference — has forced Zuckerberg to reexamine his stance. He’s also started to prioritize “time well spent” on Facebook, a term used by one of his biggest critics, tech ethicist Tristan Harris.

Before Thanksgiving last year, he gave a speech at Facebook HQ, telling employees it was a “privilege” and an “enormous responsibility” to have more than one billion users logging in each day.

“I don’t think he sleeps well at night,” one employee said. “I think he has remorse for what has happened.”

7) Where does Facebook stand now?

The optimism that tech would inevitably make a positive impact on the world has been beaten down in the last two years. After being slow to address fake news, Facebook is now working to weed it out. The social network has added tools to check if users have followed Russian IRA-backed pages, reconfigured its News Feed to promote more “meaningful interactions” between friends, and hired former CNN reporter Campbell Brown to spearhead its outreach to journalists.

As for Zuckerberg, Wired says “he has thought deeply; he has reckoned with what happened; and he truly cares that his company fix the problems swirling around it. And he’s also worried.”
