YouTube Removed Millions of Spam Channels and Videos, Says Creators Could See Subscriber Count Decrease

58 million videos and 1.67 million channels were purged from YouTube between July and September

Last Updated: December 13, 2018 @ 2:05 PM

YouTube removed tens of millions of videos and accounts during the third quarter, mostly for violating its policy against spam content, according to a report released by the video giant on Thursday.

From July to September, 1.67 million channels — with 50.2 million combined videos — were kicked off YouTube for violating its rules. Nearly 80 percent of the purged channels were taken down for sharing spam content, according to YouTube’s report; 12.6 percent were removed for nudity or adult content, and another 4.5 percent were taken down for child-safety concerns.

On top of that, another 7.85 million videos were jettisoned for individually violating YouTube’s terms of service. Altogether, YouTube erased 224 million comments during the quarter, most of them for spam.

Separately, YouTube tweeted on Thursday that creators could see a “noticeable” drop in subscribers by the end of Friday, after it identified an issue that had allowed spam subscribers to go unremoved. The Google-owned company said creators will get a notification about any spam subscribers deleted from their channels.

The Thursday report came only days after Google CEO Sundar Pichai testified before Congress — where Rep. Jamie B. Raskin (D-MD) pressed Pichai on a Washington Post report from last week on YouTube’s inability to weed out conspiracy videos.

“This is an area we acknowledge there’s more work to be done,” Pichai said. “We have to look at it on a video-by-video basis, and we have clearly stated policies, so we’d have to look at whether a specific video violates those policies.”

YouTube’s report said that 90 percent of the approximately 10,300 videos removed for violent extremism in September had fewer than 10 views, as did the nearly 280,000 videos removed during the month for child-safety concerns. The company removed 6,195 videos during September for violating its policy against “hateful or abusive” content — accounting for less than 0.5 percent of the videos kicked off that month.