Moderating Comments of ‘Deranged Psychos’ Takes Toll on Tech Contractors

Thousands of racist and violent posts are checked each day at big tech firms, a new Wall Street Journal report shows

Tech behemoths like Facebook and Google have made it a priority to beef up their content moderation teams, but a new report from the Wall Street Journal shows the mental anguish that comes with reviewing the worst the internet has to offer.

“The first disturbing thing was just burnout, like I’ve literally been staring at porn all day and I can’t see a human body as anything except a possible [terms of service] violation,” a former Google content moderator told WSJ.

But that’s just the tip of the iceberg. Reviewing posts containing racism, violence, and sex abuse is part of the job for the thousands of contractors scanning social media; dealing with images of war carnage or animal mutilation is routine. The Google moderator said reviewing posts depicting child rape was the most jarring, because “the worst part is knowing some of this happened to real people.”

Silicon Valley firms are feeling the heat to be more proactive in dealing with seedy posts. Facebook CEO Mark Zuckerberg said last month he’s “dead serious” about tackling misleading content, after the fallout from Russian meddling in the 2016 U.S. election. The social network is doubling its review team from 10,000 to 20,000, while adding machine learning tools to weed out inappropriate posts. YouTube announced earlier this month that Google, its parent company, would staff 10,000 content reviewers in 2018. Twitter also followed suit, adding moderators and tools to spot nefarious tweets.

Still, many of these workers are contractors, making between $13 and $28 per hour, rather than full-time employees, according to the WSJ. And the job often isn’t worth the pain, with several contractors quitting within their first week. Shaka Tafari, a former moderator for messaging app Whisper, told the WSJ he was caught off guard by the number of rape comments in the messages he reviewed. Pictures of bestiality and dead dogs also added to the mental grind.

“I was watching the content of deranged psychos in the woods somewhere who don’t have a conscience for the texture or feel of human connection,” said Tafari.

The flood of content can be overwhelming. Facebook receives more than a million complaints each day, and one Facebook moderator told the WSJ they review 8,000 posts per day, often with only a few seconds to make a decision on each one. The social network requires PRO Unlimited, the contracting firm that hires many of its moderators, to provide up to three in-person counseling sessions each year.

Read the full report on the “worst job in technology” here.
