Facebook on Tuesday said it removed 7 million “harmful” posts containing misinformation about COVID-19 between April and June.
The posts, according to a Facebook spokesperson, were taken down for pushing bogus preventative measures or cures that the CDC and other health officials told the company were dangerous. Another 98 million posts had warning labels attached to them, the spokesperson said. These figures also include posts on Instagram, which Facebook owns.
The announcement comes after Facebook and Twitter removed a video shared by President Trump last week for violating their policies on COVID-19 misinformation.
President Trump’s video, which was shared on his Facebook and Twitter accounts, was a clip from a phone interview he did with Fox News, where he called for schools to reopen, said children were “virtually immune” from COVID-19 and also repeated his claim that the virus would just “go away.”
“It doesn’t have an impact on them,” Trump said of people under the age of 18 in the clip. “They are virtually immune from this problem and we have to open our schools.”
Since the beginning of March, 576 children under the age of 18 have been hospitalized due to COVID-19, the CDC said last week. About 56.6 million students attended elementary, middle and high schools in the U.S. last fall, according to government data.
Trump’s video included “false claims that a group of people is immune from COVID-19 which is a violation of our policies around harmful COVID misinformation,” a Facebook spokesperson told TheWrap last week. Facebook also said it was the first time the company had taken down a post from the president due to COVID-19 misinformation.
Facebook also shared on Tuesday that it had removed 22.5 million posts for violating its hate speech rules during the second quarter.