Social media organizations must properly protect, compensate content moderators

Thanks to content moderators, the unseen heroes of social media, users can be reasonably confident they won’t stumble across something extremely graphic or triggering on their feeds. Everyone deserves to feel safe while scrolling and to know that the disturbing content others post is being monitored and removed relatively quickly.

The issue arises, however, when real human beings are forced to sift through that troubling flagged content. Rather than put moderators’ well-being at risk, social media organizations should replace them with technology or, at the very least, actively protect them from the job’s harmful effects.

Moderators working at companies like Facebook and Twitter are the people responsible for reviewing all of the flagged content on those sites to protect other users. This flagged content includes disturbing depictions of rapes, suicides, beheadings and other killings, according to The New York Times.

Viewing these images even once would leave anyone with some emotional trauma, but content moderators are forced to see these horrifying pictures hundreds of times each workday.

In order to cope with the psychological trauma they’ve accumulated at work, some moderators have turned to drugs and sex in the workplace, according to The Verge. These are clear signs of distress and should not be overlooked as readily as they currently are.

In addition, moderators are severely underpaid for the emotionally and psychologically taxing work they do. Some workers make as little as $28,800 per year, just barely over the federal poverty line for a family of four, according to The Verge. 

While employers do sometimes provide workplace counselors, occasional counseling is not enough to keep moderators from developing serious conditions from their jobs, such as post-traumatic stress disorder.

As time has passed, traumatized former employees have started breaking their silence and coming forward in an attempt to expose the horrors of content moderation. Former moderator Selena Scola has even gone so far as to file a lawsuit against Facebook, claiming that her PTSD developed because of the images she was exposed to while working for the company.

Having experienced it firsthand, Scola understands the difficulty of replacing the entire internet moderation system. She only asks that the current procedure be improved, “urging Facebook to establish a fund to create a testing and treatment program through which current and former content moderators—including moderators employed by a third party—can receive medical testing and monitoring including psychiatric treatment,” The New York Times reported.  

Apparently, it’s too much to ask for people to avoid posting horrifying and disturbing images on the internet. It isn’t too much, however, to demand that procedures be put in place to protect the people who are subjected to these depictions.

If we must have real people moderating the internet, we need to put their well-being first. Rather than allow content moderation to continue as it has since the beginning of social media, it is long past time we protect those who work day in and day out to protect us.
