Facebook is introducing a new “one strike” rule to crack down on who can use its livestreaming feature, two months after the deadly mosque shooting in Christchurch, New Zealand, was livestreamed for about 20 minutes on the social network.
Moving forward, Facebook users who violate the company’s “most serious policies” will be hit with a 30-day suspension from using its Facebook Live feature, according to a Tuesday blog post from VP Guy Rosen.
What those policies are, though, remains vague. Rosen pointed to one example in his post, saying “someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time.”
When asked for more examples of policy violations that would lead to a 30-day ban, a Facebook rep pointed to the company’s rules against “dangerous individuals and organizations.” Under the policy, Facebook said it will take action against users involved in any of the following: terrorist activity, organized hate, mass or serial murder, human trafficking, or organized violence or criminal activity.
The rep said the changes, had they been in place in March, would have blocked suspected gunman Brenton Tarrant from broadcasting the shooting, in which 51 people were killed, on Facebook Live. The rep did not specify which rule or rules the gunman violated leading up to the attack. Facebook said it removed 1.5 million attempts to reshare the video in the 24 hours following the shooting.
Facebook’s decision comes as New Zealand Prime Minister Jacinda Ardern is set to lead a meeting, alongside French President Emmanuel Macron, in Paris calling for major tech companies to sign a pledge, called the “Christchurch Call,” to block violent content online.
“There is a lot more work to do, but I am pleased Facebook have taken additional steps today alongside the Call and look forward to a long term collaboration to make social media safer by removing terrorist content from it,” Ardern said on Wednesday, according to Reuters.