Medium CEO Outlines Publishing Platform’s ‘Aggressive’ Content Moderation Policies

“If someone is harassing someone on Twitter and they post on Medium, we take them off,” Evan Williams explains

Evan Williams addresses a developers conference (Getty Images)

Medium founder and CEO Evan Williams outlined the publishing platform’s content moderation policies at CodeCon 2019, describing his company’s practices as “rather aggressive.” Williams addressed how Medium’s policies differ from those of Twitter, the company he co-founded and served as CEO of.

“Our content moderation is rather aggressive in terms of what we are willing to take down,” Williams explained at the annual technology conference on Tuesday. “Having served on the board of Twitter… I’ve always seen these companies as having different roles.”

“With Medium, I don’t think we ever saw our role as being for all voices. There’s other places to do that,” he explained. “We’ve been fairly aggressive about taking content off, including the behaviors of people who wrote it off-platform. So if someone is harassing someone on Twitter and they post on Medium, we take them off even if they otherwise abided by our policies.”

In other words, a contributor’s conduct off the platform can affect their ability to publish on it. Williams added that every story promoted by Medium is reviewed by human eyes.

“We actually curate all of Medium,” he explained. “It’s unusual what we do as an open platform. We have dozens of people who spend time going through stories and unless they hit ‘yes,’ it doesn’t get into our content recommendation system. So it can be hosted, but it won’t get the benefit of our distribution.”

“Everything we also recommend on Twitter — if we recommend it, if it gets into our algorithm, it first has to be approved by a human,” Williams added.


Williams argued that the platforms have different models and need different moderation.

“Our value exchange with independent writers is if you want to be distributed on Medium, we will put you in the curation pool. And if we curate you, we’re going to put you behind the paywall,” he explained. “That’s our exchange instead of, ‘Now we’re going to put ads on your piece but it’s going to have to meet our standards and now we’re going to have to monetize it.’”

Medium’s rules can be found in full here, along with the publisher’s definitions of controversial, suspect and extreme content. Twitter’s rules can be found here.

Content moderation has been a contentious topic at CodeCon. On Monday, YouTube chief Susan Wojcicki defended the platform’s decision not to remove conservative commentator Steven Crowder after he made homophobic and racist remarks on his channel about Vox reporter Carlos Maza, who is gay and Latino. Wojcicki acknowledged the decision was “hurtful” to the LGBTQ community but said it was the “right” one.

Wojcicki outlined the three ways YouTube determines violations: context, whether the person being criticized is a public figure, and whether the comments are “malicious with the intent to harass.” She added that the company is “working hard” to clean up the video platform, but said scale is an issue. “It’s just from a policy standpoint we need to be consistent. If we took down that content, there would be so much other content we would need to take down.”
