Twitter on Friday said it’s making changes to reduce the spread of misinformation and viral tweets heading into the 2020 U.S. election, including adding warning prompts before users retweet information.
The company, in a blog post from general counsel Vijaya Gadde and head of product Kayvon Beykpour, said users will start seeing a screen when they go to retweet, asking them to “add their own commentary.” The prompt will add “some extra friction,” Gadde and Beykpour said, with the “hope it will encourage everyone to not only consider why they are amplifying a tweet, but also increase the likelihood that people add their own thoughts, reactions and perspectives to the conversation.”
Some Twitter users will see the change starting Friday, before it rolls out to all users on October 20.
On top of that, Twitter said it will start adding “additional warnings and restrictions” to tweets that have misleading information from U.S. political figures, with specific measures to address the upcoming election. (This includes political candidates, campaign accounts, U.S. accounts with more than 100,000 followers, and those that “obtain significant engagement.”)
“People on Twitter, including candidates for office, may not claim an election win before it is authoritatively called,” Gadde and Beykpour said. “To determine the results of an election in the US, we require either an announcement from state election officials, or a public projection from at least two authoritative, national news outlets that make independent election calls. Tweets which include premature claims will be labeled and direct people to our official US election page.”
And starting next week, users who go to retweet information that violates Twitter’s misinformation rules will be pointed to “credible information” on the topic.
Twitter’s update comes a few days after Facebook said it would block political ads from running once polls close on November 3, in order to prevent politicians from prematurely claiming victories.