An incredible investigative piece in Wired by Adrian Chen reports on the lives of contract content moderators, the people whose job it is to go through content posted to online platforms (such as Facebook, YouTube, and Whisper) and deal with whatever violates a platform’s policies or the law. And yes, we’re talking about the really bad stuff: not just run-of-the-mill pornography or lewd images, but examples of humanity at its worst, from torture and sexual assault (involving adults, children, and animals) to beheadings.
Just reading Chen’s piece is a traumatic experience in and of itself: knowing what material is out there, what unthinkable behavior real people are engaging in, and what relentless exposure to this content must do to the psyches of these grossly underpaid contract workers, whose lives are slowly being ruined, their well-being poisoned post by post and video by video. Simply reading the article will probably require some recovery time.
I can’t have a blog about tech, culture, and humanism without at least acknowledging what Chen has brought into daylight. I don’t think I have any novel observations at the outset, having just read it, still somewhat teetering on my heels. But here are some thoughts and questions that it raises for me:
First, the obvious: Are the major tech companies for whom this work is done really aware of what they put these moderators through? From the Bay Area liberal arts grads to the social-media-sweatshop moderators in the Philippines, hundreds of thousands of smart, sensitive human beings (and I think they must be smart and sensitive to have the kind of judgment and empathy this work requires) are having their minds eaten alive, losing their ability to trust, to love, to feel joy, with disorders that mirror, or outright are, post-traumatic stress disorder. Do Mark Zuckerberg or Larry Page or whoever it is that runs Whisper give a damn? (Given how little Twitter has done to deal with abuse and harassment of its users, I think it’s safe to presume for now that they probably don’t.)
Also, now that we know what these folks are exposed to, what can we as users of these services do about it? What will we do about it? (I fear the answer is probably similar to what we all did when we learned about the conditions in factories in China: more or less nothing.)
Here’s what affected me the most about all of this. The report was a reminder of the depths of human depravity. Now, it’s not news that there are horrible people doing horrible things to each other, and likely ever shall it be. But something about the way it’s described in this report amplifies it for me. If these hundreds of thousands of moderators are being overwhelmed, deluged with violence and death and evil in all manner of their cruelly novel variations, how many of our fellow humans are perpetrators? These moderators are only catching the portion of these people who either get caught in the act or purposefully broadcast their actions. What more must be taking place? I can barely stand to ask the question.
Bearing witness to a video of a man doing something I cannot bear to recount here to a young girl, one moderator points us to the insidiousness of all of this, emphasis mine:
The video was more than a half hour long. After watching just over a minute, Maria began to tremble with sadness and rage. Who would do something so cruel to another person? She examined the man on the screen. He was bald and appeared to be of Middle Eastern descent but was otherwise completely unremarkable. The face of evil was someone you might pass by in the mall without a second glance.
Chen writes of how these moderators no longer feel they can trust the people in their day-to-day lives. You can see why.
Finally, I’ll be thinking about the fact that it’s these devices and services, the ones I am so fascinated and often entranced by, that are the delivery vessels for this horror. It is tempting to think of the tech revolution as one of liberation and renaissance. But these tools are available to us all, to the best of us and the worst. What then? What now?