Instagram Apologizes After New TV Service Recommended Suggestive Images of Young Girls, Genital Mutilation

“We care deeply about keeping all of Instagram — including IGTV — a safe place for young people”

IGTV (Photo: Unsplash / Katka Pavlickova)

Instagram apologized Friday after videos of genital mutilation and suggestive videos of children appeared in the “Popular” section of its recently launched TV service, IGTV. The videos were brought to the company’s attention by Business Insider, which spent several weeks monitoring the service.

“We care deeply about keeping all of Instagram — including IGTV — a safe place for young people to get closer to the people and interests they care about,” an Instagram spokesperson told TheWrap via email.

“All the content reported to us by Business Insider has been removed from IGTV,” they added.

While monitoring IGTV over a period of three weeks, Business Insider (BI) found that the company’s algorithm recommended “disturbing and potentially illegal videos.” One video involved a girl, who BI said appeared to be 11 or 12 years old, taking off her top in a bathroom. The video, titled “Hot Girl Follow Me,” ends before she is unclothed. Another video showed what appeared to be an underage girl exposing her stomach while pouting, BI reported.

According to the outlet, the content remained on the site for five days and was removed only after Business Insider contacted the company’s press office. By the time the two videos were taken down, they had accumulated more than 1 million views.

Videos of genital mutilation also surfaced on the service. One showed a penis that appeared to have a metal lug nut stuck around its middle; per the report, the nut was being removed with an electric saw. IGTV quickly removed the video after it was reported as nudity, but the account that posted it remained active.

“We take measures to proactively monitor potential violations of our Community Guidelines and just like on the rest of Instagram, we encourage our community to report content that concerns them,” the spokesperson added. “We have a trained team of reviewers who work 24/7 to remove anything which violates our terms.”

Instagram says that over the past few years it has reassessed priorities, reassigned engineers and researchers, and aligned teams to support a safe community. The company recently announced that it was doubling the number of people working across its safety and security teams for Facebook and Instagram to 20,000 by the end of 2018. This includes growing its team of 7,500 content reviewers — a mix of full-time employees, contractors and companies it partners with.