New Zealand politicians and businesses are calling on tech giants like Facebook and Google to accept a measure of accountability after last week’s deadly mosque shootings, which killed 50 people and were live streamed and shared on social media.
“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published,” New Zealand Prime Minister Jacinda Ardern told Parliament on Tuesday.
“They are the publisher. Not just the postman,” she added. “There cannot be a case of all profit, no responsibility.”
Ardern’s criticism came shortly after Facebook released more information on Monday night about the video of the deadly shooting.
Facebook deputy general counsel Chris Sonderby said the live stream of the attack was viewed about 200 times and wasn’t reported by anyone watching it; Facebook received its first user report 29 minutes after the broadcast began, or about 12 minutes after the live stream ended. The video was viewed about 4,000 times in total before being taken down, according to Sonderby, and Facebook removed 1.5 million uploads of it in the 24 hours after the attack, including 1.2 million that were blocked at the point of upload.
On Friday, a Facebook rep explained that the company’s artificial intelligence tools scanned the original attack video, allowing its system to immediately spot similar scenes and remove most uploads.
Despite the efforts of Facebook, Google-owned YouTube and Twitter to purge the attack video — with varying degrees of success — from their platforms, New Zealand telecom execs felt the companies could’ve done more. The heads of Vodafone, 2Degrees and Spark in New Zealand signed a joint letter on Tuesday calling for Facebook chief Mark Zuckerberg, Twitter chief Jack Dorsey, and Google chief Sundar Pichai to join an “urgent discussion” on how to completely stop violent videos from being shared.
“Content sharing platforms have a duty of care to proactively monitor for harmful content, act expeditiously to remove content which is flagged to them as illegal and ensure that such material — once identified — cannot be re-uploaded,” the execs said in the letter. They added that the attack video “should never have been made available online.”
That might be easier said than done. In the 24-hour period after the attack, Google and Twitter, like Facebook, leveraged human moderators and AI tools to remove posts sharing the attack video. At its peak in the hours after the shooting, YouTube said on Monday, footage of the attack was being uploaded as fast as one video per second.
The telecom execs concluded by recommending that New Zealand adopt laws similar to those in Europe, where tech companies face fines if certain content isn’t removed within a set amount of time.