“People already generally think that we have too much power in deciding what content is good,” CEO Mark Zuckerberg said in a recent interview.
When it comes to moderating content, Mark Zuckerberg probably gets the sense that Facebook is damned if it does and damned if it doesn’t.
The social network has been ridiculed in the past for not doing enough to police its platform — which, despite a scandal-filled 2018, still closed the year with 2.3 billion monthly users.
Now, Facebook is once again being skewered by its critics — despite hiring thousands of content moderators in recent years — after The Verge published a harrowing look at what it’s like to spend all day scanning the worst Facebook has to offer: racism, violence, bestiality, self-harm, and murder.
For the 1,000 moderators at Cognizant, a Phoenix-based vendor used by Facebook, it’s all part...