Facebook Is Now Less of an Echo Chamber. Twitter Is More of One

Donald Trump’s victory over Hillary Clinton has led to changes at both social media giants


If you were stunned that Donald Trump won the election, we have bad news: You may be one of the millions of Americans who occupy an internet echo chamber.

Trump’s win over Hillary Clinton has led to changes at Facebook and Twitter as the two social media platforms examine the roles that fake news stories and online abuse may have played in the election. The problems are separate but closely related: in both cases, the root cause is people trying to see things they like and avoid things they don’t.

In some ways, the companies are going in opposite directions: Facebook is trying to prevent users from seeing only stories that pleasantly confirm their biases. Twitter, meanwhile, is trying to protect people from seeing ugly, abusive messages from people whose views diverge strongly from their own. Both companies are struggling with how to find a middle ground where users are exposed to the truth in a civil way.

Adding to their problems: Not everyone wants to know the truth. Or be civil.

Here’s a look at the problems faced by both companies, and what changes they’ve made since Trump’s shocking (to some) election.

Facebook

On Monday, hours after a top Clinton campaign staffer spoke out against Facebook, the Wall Street Journal reported that Mark Zuckerberg’s company would block fake news sites from using its advertising network to generate revenue. The decision came soon after Google did the same thing.

“We vigorously enforce our policies and take swift action against sites and apps that are found to be in violation,” a Facebook spokesperson said in a statement to the WSJ. “Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance.”

Many Hillary Clinton supporters lay some of the blame for her defeat on a Facebook algorithm that feeds false stories to low-information voters — the kind of people who could be tricked into believing baseless claims, for example, that President Obama was born in Kenya. When they click on one bit of false information, they’re led down a rabbit hole of more conservative conspiracy theories, until it seems like all anyone ever talks about are Bill Clinton’s supposed secret son and Benghazi.

Because Facebook posts aren’t vetted for accuracy the way stories on mainstream news sites, newscasts or newspapers are, users see more of the kinds of stories that might not make it into the mainstream news media. Many conservatives say that’s good, because it means previously ignored stories can’t be censored. But it also means those stories aren’t fact-checked by professional journalists and could be flat-out hoaxes.

One meme going around this week, for example, says Donald Trump won the popular vote. He didn’t. On Sunday night, HBO’s “Last Week Tonight” host John Oliver slammed Facebook as a “cesspool of nonsense” and pointed to a fake story about the Pope endorsing Trump that was shared almost 1 million times.

James Shamsi, a consultant for the social media company Chameleon.LA, thinks Facebook’s echo chamber “works perfectly for anything other than politics” because it “shows you the type of content, news and events that align with your taste and views.”

He said Facebook should simply “give an option to turn off algorithmic content distribution for anything related to politics,” which would essentially solve the problem.

“Facebook can also make it so that while you watch videos related to certain candidates, the next video suggested is one about the other candidate. There’s a bunch of things they can do, for instance also just simply implementing a minimum ratio of the opposing views content,” Shamsi told TheWrap. “The fact that barely any of my friends know anyone that voted for Trump, or saw any news that was positive about Trump, shows you how much the algorithm inadvertently divides and shelters us from opposite thoughts.”
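To make Shamsi’s “minimum ratio” suggestion concrete, here is a minimal sketch of how a feed could be re-ranked so that some fixed share of political stories opposes the viewer’s inferred lean. Everything here is hypothetical illustration: the function name, the lean labels and the 25 percent default are assumptions, not anything from Facebook’s actual systems.

```python
from collections import deque

def mix_feed(ranked_stories, min_opposing_ratio=0.25):
    """Re-rank a political feed so at least min_opposing_ratio of the
    stories shown oppose the viewer's inferred political lean.

    ranked_stories: list of (story, lean) tuples already ranked by
    engagement, where lean is "agrees" or "opposes" relative to the
    viewer. All names are hypothetical, not Facebook's real API.
    """
    agrees = deque(s for s, lean in ranked_stories if lean == "agrees")
    opposes = deque(s for s, lean in ranked_stories if lean == "opposes")

    feed, shown_opposing = [], 0
    while agrees or opposes:
        total = len(feed) or 1
        below_quota = shown_opposing / total < min_opposing_ratio
        # Pull an opposing-view story when we're under quota (or out
        # of agreeing stories); otherwise keep the engagement ranking.
        if opposes and (below_quota or not agrees):
            feed.append(opposes.popleft())
            shown_opposing += 1
        else:
            feed.append(agrees.popleft())
    return feed
```

A real system would face the harder problems this sketch skips: inferring each story’s lean in the first place, and deciding what counts as “political” at all.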

On Saturday, Mark Zuckerberg ducked the question of whether Facebook spreads misinformation by calling it a “pretty crazy idea.” But he later said the company must work harder to “flag hoaxes and fake news.”

That isn’t good enough for a group of “renegade employees” who have formed a task force to examine Facebook’s role in spreading fake news prior to the election, according to BuzzFeed.

“It’s not a crazy idea. What’s crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season,” one Facebook employee told BuzzFeed.

Clinton campaign chief digital strategist Teddy Goff told Politico on Monday that while “everyone has the right to say what they want,” Facebook needs to stop publishing work from content providers with “a record of making stuff up.”

Twitter

People of every political stripe use Twitter to share quick takes, stories and news. A Twitter moment titled “Day 1 In Trump’s America” recently went viral by presenting a “collection of tweets about racist episodes [people of color] are facing now that Trump is our President Elect.”

Trump, meanwhile, has famously used the platform to promote his campaign and mock opponents. Three times in the past week, he has turned his ire on the New York Times.

But the open nature of Twitter has made it easy to harass, insult and threaten total strangers. Trolling has become such an issue that Trump’s wife, Melania, plans to make cyberbullying a key issue when she becomes first lady in January.

Twitter announced changes on Tuesday designed to reduce the “amount of abuse, bullying and harassment” to which users are exposed. It expanded its “mute” function to allow people to “mute keywords, phrases, and even entire conversations you don’t want to see notifications about.”
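Conceptually, the expanded mute is a filter applied to incoming notifications before they reach the user. Here is a minimal sketch of that idea; the field names and data shapes are assumptions for illustration, since Twitter has not published its implementation.

```python
def filter_notifications(notifications, muted_keywords, muted_conversations):
    """Drop notifications that mention a muted keyword or phrase, or
    that belong to a muted conversation thread.

    notifications: iterable of dicts like
        {"text": "...", "conversation_id": "123"}
    All field names are hypothetical; Twitter's real data model is
    not public.
    """
    muted_keywords = [k.lower() for k in muted_keywords]
    visible = []
    for note in notifications:
        if note["conversation_id"] in muted_conversations:
            continue  # user muted this entire conversation
        text = note["text"].lower()
        if any(keyword in text for keyword in muted_keywords):
            continue  # a muted keyword or phrase appears in the text
        visible.append(note)
    return visible
```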

It also announced it had “retrained all of our support teams on our policies, including special sessions on cultural and historical contextualization of hateful conduct.”

But racist and hateful encounters go both ways: some people see their own insults as productive political expression and others’ comments as attacks. With the new changes, Twitter will be safer than ever, from insults but also from “entire conversations.”

If you choose not to see opposing views, you’ll have only yourself to blame next time an election catches you off guard.
