A nonprofit that Elon Musk sued earlier this year over alleged lost ad dollars released a report Tuesday saying X has failed to remove antisemitic, Islamophobic, and anti-Palestinian content on its platform inspired by the Israel-Hamas conflict.
The Center for Countering Digital Hate says 98% of 200 posts it flagged to the social media platform were still visible to users two weeks later. All the posts in the sample violated X's rules against hateful content, dehumanization, or hateful imagery.
The CCDH study comes amid a significant rise in hate speech and misinformation proliferating across social media since the outbreak of war in the Middle East on Oct. 7.
The nonprofit organization found that the posts still up on the platform included the incitement of violence against Muslims, Palestinians, and Jewish people, references to Palestinians in Gaza as “animals,” Holocaust denial, the glorification of Nazism, and much more.
These posts have garnered over 24 million views in total. Additionally, 43 of the 101 accounts in the sample are paid verified accounts, whose posts are algorithmically boosted, giving them greater reach.
“After an unprecedented terrorist atrocity against Jews in Israel, and the subsequent armed conflict between Israel and Hamas, hate actors have leapt at the chance to hijack social media platforms to broadcast their bigotry and mobilize real-world violence against Jews and Muslims, heaping even more pain into the world,” CCDH CEO Imran Ahmed said in a statement.
“X has sought to reassure advertisers and the public that they have a handle on hate speech – but our research indicates that these are nothing but empty words,” the CEO continued.
Ahmed claimed that “Musk has created a safe space for racists, and has sought to make a virtue of the impunity that leads them to attack, harass and threaten marginalized communities.”
In August, X Corp. sued CCDH for publishing unfavorable research that Musk claimed substantially hurt the platform's advertising business.