Imaginary Person #1: You ready to get into this Logan Paul thing?
Imaginary Person #2: Yep.
1: Alright, first off: Logan Paul making a video where he visits a Japanese forest best known as a place people go to commit suicide, then filming a dead body while he and his crew crack jokes about it — that’s straight-up wrong.
2: Absolutely. Indefensible.
1: So we agree on that. But here’s the thing: there’s nothing YouTube could have done about it.
2: Wait. What?
1: I’m saying, YouTube can’t police videos like the one Logan Paul posted.
2: No, no, nuh-uh. YouTube has a responsibility to not let people use its platform to spread this kind of content.
1: Okay, so how would YouTube go about prohibiting a video like Logan Paul’s?
2: Well, YouTube’s owned by Google, and Google’s got all this artificial intelligence technology. So it should be able to use that technology to figure out that videos of dead bodies shouldn’t be allowed on YouTube.
2: Each time a channel in YouTube’s monetization program uploads a video, YouTube should scan each frame of it for things like a dead body. It might slow upload speeds, but YouTube’s upload speeds used to be really slow anyway. It’d just be a return to the way things were.
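(For the curious, here’s roughly what Person #2 is proposing, sketched in Python. The `looks_like_dead_body` classifier is a made-up stand-in for whatever image model Google would actually bring to bear, and the sampling rate and confidence cutoff are invented too.)

```python
import cv2  # OpenCV: the one real library here, used just to pull frames

def looks_like_dead_body(frame) -> float:
    """Made-up stand-in for a Google-grade image classifier.
    Returns a confidence in [0, 1] that the frame shows a corpse."""
    # Person #1's objection lives in this function: a sleeping person
    # and a mannequin look an awful lot like the thing it's hunting for.
    return 0.0

def scan_upload(path: str, threshold: float = 0.9, every_nth: int = 30) -> bool:
    """True means block the upload. Samples every Nth frame so uploads
    merely slow down instead of crawling."""
    cap = cv2.VideoCapture(path)
    i = 0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:       # ran out of video; nothing tripped the scan
                return False
            if i % every_nth == 0 and looks_like_dead_body(frame) >= threshold:
                return True  # refuse to post it
            i += 1
    finally:
        cap.release()
```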
1: Mmmkay. Except how is YouTube supposed to be able to tell the difference between a dead body and a regular, alive-but-asleep body or even a mannequin?
2: Paul says the words “dead body.” YouTube could also scan the video’s audio for those words.
1: It could. But then a lot of videos include those words, in totally different contexts. A news video might mention them. A song might include them in its lyrics. I mean, a vlogger could offhandedly say “blah blah over my dead body.” You want all those videos getting penalized too?
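(Person #1’s point, shrunk to a few lines. A naive keyword scan over a speech-to-text transcript flags all three of these made-up snippets, and only one of them is anything like the Logan Paul video.)

```python
KEYWORDS = ("dead body", "corpse")

def audio_scan(transcript: str) -> bool:
    """Naive keyword match over a (hypothetical) speech-to-text transcript."""
    text = transcript.lower()
    return any(kw in text for kw in KEYWORDS)

# Made-up snippets standing in for real transcripts:
samples = {
    "vlog in the forest": "...guys, I think that's an actual dead body...",
    "nightly news":       "...police say the dead body was found Tuesday...",
    "random vlogger":     "...you'll take my channel over my dead body, bro...",
}

for name, transcript in samples.items():
    print(name, "->", "FLAGGED" if audio_scan(transcript) else "clear")
# All three get FLAGGED; the scan can't tell a vlog from a newscast
# from a figure of speech.
```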
2: So you’re telling me there’s no way for YouTube to have a computer tell when a video stars a corpse and prevent that video from being uploaded?
1: I’m not saying that. It probably could, but I don’t know how accurate it would be. Think about how heavy-handed YouTube already is with flagging videos for demonetization. Now you want them to not even let the video post in the first place?
2: Fine, computers are dumb. But people aren’t. People could flag these videos to YouTube, then YouTube could have the 10,000 people it’s hiring to police videos see for themselves if a video has a dead body.
1: But then it’s up to whichever of those 10,000 people pulls the flag to decide that 1) this video has a dead body and 2) this video having a dead body is problematic. Some might come to that conclusion, but others might not. One might get a look at zombie Taylor in the “Look What You Made Me Do” video and take it down. Then all the Swifties out there start flaming YouTube for being part of some Kardashian conspiracy. Or one might check Logan Paul’s video, actually believe it’s some sort of suicide-prevention PSA, and give it a pass.
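(The sticking point, in miniature: the same flagged video can come back with different verdicts depending on who reviews it. The videos, the “how graphic” scores, and the reviewers’ thresholds are all invented for illustration.)

```python
def review(video: str, reviewer_threshold: float) -> str:
    """One human's call on one flagged video. In reality "how graphic"
    is the reviewer's gut read; here it's a fixed number per video."""
    how_graphic = {
        "logan_paul_vlog": 0.55,
        "lwymmd_zombie_taylor": 0.45,  # Swifties, stand down: purely illustrative
    }[video]
    return "TAKE DOWN" if how_graphic >= reviewer_threshold else "LEAVE UP"

for video in ("logan_paul_vlog", "lwymmd_zombie_taylor"):
    strict = review(video, reviewer_threshold=0.5)
    lenient = review(video, reviewer_threshold=0.6)
    print(f"{video}: strict reviewer says {strict}, lenient says {lenient}")
# logan_paul_vlog: strict reviewer says TAKE DOWN, lenient says LEAVE UP
```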
2: Okay, age-gating.
1: YouTube already has that. Creators can enable age restrictions when uploading videos, and viewers can turn on “Restricted Mode” to filter out videos that YouTube or creators flag as mature content.
2: No, this would be better. People could still use “Restricted Mode,” but there’d also be a version of “Restricted Mode” that would be on by default for everyone, logged in or not. Then YouTube would preface age-restricted videos with an alert saying that the video is for mature audiences only and ask people to confirm that they’re of age to view it.
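(Person #2’s version of “Restricted Mode” as a decision rule. The default-on flag and the interstitial are the proposal; none of this is YouTube’s actual logic.)

```python
from dataclasses import dataclass

@dataclass
class Viewer:
    logged_in: bool = False
    opted_out_of_restricted: bool = False  # the proposed default-on mode, reversible
    confirmed_of_age: bool = False         # clicked through the interstitial

def what_happens(video_is_age_restricted: bool, viewer: Viewer) -> str:
    if not video_is_age_restricted:
        return "play"
    # Default-on restriction: anyone who hasn't logged in and opted out
    # hits the mature-content interstitial first.
    if viewer.logged_in and viewer.opted_out_of_restricted:
        return "play"
    return "play" if viewer.confirmed_of_age else "show mature-content warning"

print(what_happens(True, Viewer()))                       # show mature-content warning
print(what_happens(True, Viewer(confirmed_of_age=True)))  # play; people will lie, and that's fine
```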
1: People will lie.
2: For sure. But that’s okay. The point is to let people know if a video is considered offensive and then leave the decision of whether they should be able to watch it up to them.
1: So the responsibility lands on the viewer, not YouTube.
2: Exactly. Because when someone catches their kid watching a “Game of Thrones” sex scene, they don’t get mad at HBO for letting it on TV. They get mad at their kid for putting it on their TV.
1: So instead of people getting mad at YouTube for distributing Logan Paul’s video, they should get mad at Logan Paul for making it and others for watching it?
2: Pretty much.
1: Cool cool. But what if the creator doesn’t age-restrict their video because they want as many views — and ad dollars — as possible, and YouTube can’t detect that it should be age-restricted because it doesn’t know what a dead body looks like and doesn’t think the words “dead body” are offensive, so the video never gets slapped with the warning that’s meant to underline the viewer’s responsibility for what they choose to watch? What happens then?
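(What happens then, traced end to end. Every check below is one of the hypothetical ones from earlier in this conversation, and every one of them waves the video through.)

```python
def creator_age_restricted(video) -> bool:
    return False  # the creator wants the views and the ad dollars

def frame_scan_flags(video) -> bool:
    return False  # the classifier doesn't know what a dead body looks like

def audio_scan_flags(video) -> bool:
    return False  # the words "dead body" alone aren't treated as offensive

def viewer_sees_warning(video) -> bool:
    return (creator_age_restricted(video)
            or frame_scan_flags(video)
            or audio_scan_flags(video))

print(viewer_sees_warning("suicide_forest_vlog"))
# False: no warning, no nudge toward viewer responsibility, nothing.
```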