YouTube product chief Neal Mohan pushed back against criticism that the video giant promotes “extreme” content to drive more views — and generate more ad revenue — in an interview with The New York Times on Friday.
“It’s not in our business interest to promote any of this sort of content,” Mohan said. “It’s not something that has a disproportionate effect in terms of watch time. Just as importantly, the watch time that it does generate doesn’t monetize, because advertisers many times don’t want to be associated with this sort of content.”
He added it was “purely a myth” to think YouTube deliberately promotes radical videos.
Mohan’s response came after he was asked if YouTube’s algorithm often sent viewers down a “rabbit hole” by recommending increasingly polarizing or “extreme” videos to watch. What constitutes “extreme” content wasn’t explicitly defined, but the Times pointed out the Google-owned site was recently sent scrambling to block new uploads of the Christchurch mosque shooting.
YouTube's recommendation algorithm has been heavily scrutinized over the last two years. An investigation from The Wall Street Journal last year showed its recommendations "often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content." This apparent tech blind spot has turned YouTube into a "radicalization machine for the far-right," where teenagers are inundated with fringe political views, The Daily Beast said last December.
Notably, Boston Celtics guard Kyrie Irving, who said he was a “big conspiracy theorist” in 2017, said last October that his belief that the world was flat was exacerbated by going down a YouTube rabbit hole. (YouTube recently announced its algorithm would cut back on recommending flat-earth conspiracy videos, among other forms of “borderline content.”)
Mohan said Friday that peppering viewers with radical content simply didn't benefit YouTube or increase the time they spent watching videos.
“It is not the case that ‘extreme’ content drives a higher version of engagement or watch time than content of other types,” Mohan said. He added that, while there are times where video recommendations may seem “more extreme,” there are other recommendations “that skew in the opposite direction.”
YouTube has taken steps in recent months to make its platform more "advertiser friendly," but has continued to run into roadblocks along the way. Last month, YouTube purged hundreds of accounts after the exposure of a "soft-core pedophilia ring," in which users routinely shared links to child pornography in the comments sections of videos featuring young children. YouTube ultimately decided to disable comments on nearly all videos featuring minors.