
Twitter’s Lack of Moderation Leaves Room for Harmful Communities


The month after his Twitter takeover, Elon Musk promised to make the removal of child exploitation material from Twitter a priority. However, he has since axed a majority of the moderation team responsible for child safety, leaving room for exploitative material and other harmful content. Twitter's regulations are lenient and poorly enforced, and the site has become a refuge for harmful internet communities that have been removed from other platforms.

The issue lies in Musk's layoffs late last year. As of December, fewer than 10 people were dedicated to moderating problematic material, and according to SFist, “Twitter took an average of 1.6 days to respond to 98 notices; last December, after Mr. Musk took over the company, it took 3.5 days to respond to 55.” Some of the most harmful content on the site includes child pornography, “Eating Disorder Twitter” and “Self-Harm Twitter.” Other sites that host this level of harmful content, such as Pornhub or 4chan, typically restrict access to users 18 and older; Twitter, by contrast, admits children as young as 13 years old without any age verification process.

Despite Musk’s claim that the removal of child sexual exploitation was his first priority, a video posted to Twitter of a young boy being sexually assaulted garnered over 120,000 views in February, with the site suggesting accounts that posted similar material. Authorities consider child exploitation and similar material to be easy to detect and eliminate.

Twitter has also become home to a “pro-anorexia,” or “pro-ana,” community that was banned from Tumblr. The pro-anorexia community ran rampant on Tumblr until the site’s moderators noticed a clear pattern of glorification and promotion of self-harm in 2012. Eleven years later, the same content runs rampant on Twitter with no clear plan for its removal.

Within the “pro-anorexia” community, Twitter users also participate in “edtwt,” or Eating Disorder Twitter. “Thinspo,” “meanspo” and “fatspo” are titles for the threads or compilations these users create; each combines the suffix “spo,” short for inspiration, with the aesthetic used to encourage disordered habits. Plus-size individuals on TikTok have expressed their concerns about the fatspo trend. In order to properly address this issue, Twitter needs to prioritize the safety of its users and increase its content moderation staff.

Self-Harm Twitter, abbreviated as “shtwt,” is another subset of the platform that runs the risk of encouraging users to self-harm. Users often post pictures of their cuts or burns while commenters cheer them on, encouraging the deterioration of both the physical and mental state of depressed individuals. This poses a particular threat to youth, since the majority of mental health issues develop around the age of 14.

While some might argue that these threads provide a channel to discuss mental health, Twitter is simply not moderated enough to facilitate productive and healthy interactions. Moderators rarely take down these threads, and there are not enough moderators to begin with. A single search of the terms “meanspo,” “fatspo” or “shtwt” returns hundreds of posts, and even worse, children under 13, who are especially susceptible to being negatively influenced by this discourse, have access to this content. While every platform has the capacity to harm children in this way, Twitter’s leniency in both the content it permits and the age of its users makes it an especially harmful platform. The staff shortage has also left Twitter’s moderators unable to handle the reports they receive for content the site does not allow.

Twitter can barely respond to the influx of child exploitation material, let alone the smaller communities dedicated to encouraging self-harm. The site has yet to acknowledge its incompetence in ridding itself of dangerous and harmful material, all while failing to protect children by allowing users 13 and up to access this content with no age verification process. If Musk’s priority is truly protecting children on Twitter, he needs to raise the age limit for users and hire more moderators.

Stephany Lopez-Cortez is an Opinion Intern for the spring 2023 quarter. She can be reached at lopezcos@uci.edu.