Last week, Elon Musk completed his $44 billion acquisition of Twitter. Soon after closing the deal, Musk fired Twitter’s top executives, including its CEO, CFO, General Counsel, and Head of Legal Policy, Trust and Safety. More layoffs are expected in the coming days.
Musk has announced that Twitter will form a content moderation council and will not reinstate accounts or make major content moderation decisions before it is convened. Musk’s prior statements and use of the platform have raised serious concerns that any proposed changes to Twitter’s content moderation policies and enforcement practices could result in an increase in disinformation, hate speech, and a proliferation of bad actors on the platform. In fact, less than 12 hours after Musk took over the platform, there was a dramatic spike in the use of homophobic, transphobic, and racist slurs.
Statement of Yosef Getachew, Common Cause Media and Democracy Program Director
“Musk’s recent actions and statements following his acquisition of Twitter raise serious red flags about the potential for harassment, intimidation, and disinformation that targets vulnerable communities and undermines our democracy. Twitter, a platform with over 300 million monthly active users, requires effective content moderation not just to protect its individual users but also to safeguard against broader threats to public safety and our democracy. Content that attacks people of color, LGBTQIA+ groups, and other marginalized communities should have no place on the platform.
“Twitter has established civic integrity and trust and safety teams with the intention of enforcing the platform’s rules around content designed to harass users, incite violence, suppress votes, or otherwise disrupt our democracy. Any reduction to these teams or the elimination of existing policies would open the door to a drastic increase in the amount of disinformation, hate speech, and other harmful content on one of the most widely used social media platforms. Content moderation policies are only effective if people are there to enforce them and systems are in place to ensure they are being enforced.
“Regardless of the change in leadership, Twitter has significant deficiencies in how it currently addresses election disinformation. Our own research confirmed Twitter stopped enforcement around the ‘Big Lie,’ and whistleblower documents show the many shortcomings in how the platform addresses election disinformation. That is why over 120 organizations have called on Twitter and other major platforms to take greater steps to consistently enforce and expand their existing policies, introduce friction to prevent the amplification of disinformation, and provide transparency into their policies and practices.
“Disinformation agents will push Musk to make rapid changes to Twitter that would open the floodgates for harmful content that suppresses the right to vote, further incites real-world violence, and sows distrust in our institutions. Rather than cave in to conspiracy theorists and propaganda peddlers, we urge Musk to ensure Twitter’s rules and enforcement practices reflect our values of democracy and public safety.”