As it faces a slew of damaging media reports about its business practices, Facebook reported a mixed bag of third-quarter earnings on Monday.
The social media giant reported earnings of $3.22 per share — narrowly beating analyst projections of $3.19 — and revenue of $29 billion, compared to the $29.58 billion analysts were looking for. Of that, ad revenue dipped slightly to $28.2 billion this quarter compared to $28.5 billion in Q2.
Facebook’s user growth is slowing. The company reported 2.9 billion monthly active users, up 12% from the year-ago quarter but essentially flat with its most recent quarter. Daily active users came in at 1.9 billion, also flat with the second quarter.
“We made good progress this quarter and our community continues to grow,” said Mark Zuckerberg, Facebook founder and CEO. “I’m excited about our roadmap, especially around creators, commerce, and helping to build the metaverse.”
Zuckerberg acknowledged that revenue was impacted by Apple’s recent iOS changes to ad tracking. Last week, Snap saw its stock plummet 23% after narrowly missing analyst expectations in Q3. The company also cited its ad business taking a hit due to Apple’s move.
“We did experience revenue headwind this quarter,” Zuckerberg said on the earnings call.
The company also faces tougher competition, especially from TikTok and Snap, which are drawing younger users. Zuckerberg said Facebook will continue its investments in video and Instagram Reels, with a particular focus on young adults aged 18 to 29. He called TikTok Facebook’s “most effective competitor” — the video platform hit 1 billion monthly users in September.
Facebook’s earnings come as the social media giant deals with an onslaught of damaging articles published over the weekend and into Monday by a consortium of media outlets — including NBC, CNN and The New York Times — that received leaked internal documents from attorneys representing whistleblower Frances Haugen.
The articles mostly concerned Facebook’s struggles to stem misinformation and radicalization on its platform. For example, in 2019, researchers at Facebook began creating fake accounts as part of an experiment to test how the social media app’s algorithm promoted disinformation and polarization. The result: Facebook ended up with incontrovertible proof that it contributed significantly to radicalization, particularly of conservatives.
Worse, almost as soon as it was happening, Facebook employees were aware that the algorithm was increasing the scope and reach of right-wing lies and conspiracy theories about the 2020 election in the weeks after Joe Biden’s victory over Donald Trump. But they worried that Facebook wasn’t doing remotely enough to stop it.