YouTube on Thursday further cracked down on content tied to conspiracy theories like QAnon, saying it was taking steps to “curb hate and harassment” by removing more content “used to justify real-world violence.”
The Google-owned video giant made the announcement in a blog post. Moving forward, for example, content “that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate,” will be deleted.
“As always, context matters,” the blog post added, “so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up.”
YouTube said it’ll start enforcing the policy immediately and will “ramp up” its crackdown in the weeks ahead.
If you’re not familiar with QAnon, here’s how The New York Times summarized the movement recently:
“QAnon was once a fringe phenomenon — the kind most people could safely ignore. But in recent months, it’s gone mainstream. Twitter, Facebook and other social networks have been flooded with QAnon-related false information about Covid-19, the Black Lives Matter protests and the 2020 election. QAnon supporters have also been trying to attach themselves to other activist causes, such as the anti-vaccine and anti-child-trafficking movements, in an effort to expand their ranks.”
QAnon adherents claim, without evidence, that a "deep state" conspiracy — led by a group running a covert child sex-trafficking ring — is working to take down President Donald Trump.
YouTube has already removed hundreds of channels and tens of thousands of QAnon-related videos over the past few years under its existing policies against hate speech and harassment. The decision to expand those policies comes a week after Facebook said it was banning all QAnon content.