For the last few years, I’ve worked in different corners of what can generally be dubbed the content moderation industry: first as a contractor, then as an employee of well-known social media platforms, where day-to-day work is intimately linked to crucial social outcomes in different parts of the world. The decisions made in that industry can determine what counts as a real identity, the level of nudity deemed socially acceptable, and even which war images will make it into society’s collective memory. These decisions are often controversial and, for good reasons, should be openly debated and constantly questioned.


Long before the content moderation practices of Silicon Valley made mainstream headlines, many specialised organisations at the intersection of technology and human rights were challenging the social media industry on these practices, demanding more transparency and better accountability processes. Despite that, the debate remains framed in pervasive myths that dilute reality and further obfuscate the industry’s more questionable practices.

 

These myths are one reason to continue the push for greater transparency in content moderation, not just in terms of value-driven intentions but also, and mostly, in terms of processes, labour practices, and implementation. I may be naive, but I still believe there are people in the industry trying their best to support users and keep them safe. However, far too often, they are caught in the middle of a debate between industry leaders and the public that is somewhat disconnected from the reality of their work. With that in mind, I want to insist on five aspects of content moderation that should remain at the forefront of the mainstream discussion.


1. Tech companies make mistakes all the time

It’s interesting to see the extent to which people still believe in Silicon Valley corporations as precise, almost magical, machines. As consumers, we first grew to deify technology’s intrusion into our daily lives, only to vilify it when it failed to meet expectations. Of course, this is not entirely wrong. US-based companies are grounded in political and economic realities that unequivocally dictate their biases. However, very often, Hanlon’s razor is the more relevant lens for the way content censorship operates. Mistakes happen all the time, but only sometimes make the news.


Categorising these mistakes as intentional actions just blurs accountability. It’s important for the public to know the difference between the unintended consequence of an intended action and instances where the intended decision-making process broke down, for one reason or another.

 

2. So-called automation relies on largely unrecognised human labour

The Cleaners is an important documentary highlighting the reality of content moderation as a labour practice. While it may not show the whole picture, it does show the extent of the hidden human labour needed to produce ‘clean’ online platforms. The individuals doing this work, usually for outsourced third-party companies, are among the first to see reported content before it goes viral or makes news headlines. Their routine decisions can have an unpredictable, disproportionate impact. Perhaps this form of labour is an inevitable consequence of content-sharing platforms.


If that is the case, however, there should be a bigger focus and an open discussion on the conditions of this labour, and clearer legislation on how the interests of individual workers are represented within companies. This is not only for their own benefit but also for the general good: they are the ones with the most knowledge of the content moderation industry, and they should have more power to shape it.

3. Processes matter more than values

The trend in corporate speak for the digital world is to emphasise value-driven missions as the moral compass that frames decision-making. These values can be ‘privacy’, ‘authenticity’, ‘gender equality’, or anything else that sounds nice. However, the use of these words is also the quickest way to hide the reality of processes. The truth is that none of these words mean anything out of context. Journalists, activists, and policy-makers should question processes rather than outcomes. The nymwars highlighted how even something seemingly straightforward, like a name, can carry different implications in different societies. There should be a debate around the shape of every single value promoted by every single platform. This is especially true for safety-related issues such as privacy and bullying, which can take different shapes in different places.


4. And nobody knows that much about the whole process

This is perhaps the single most important problem in content moderation. Between the decision-makers and those responsible for implementing a decision sits a huge pyramid of designers, trainers, translators, programmers, and managers reporting to other managers. It would be hard to exaggerate how chaotic the whole operation is. This makes it very hard to assign responsibility, because knowledge is spread out across many in-house and outsourced teams, each with very specific expertise but limited access to the bigger picture.


The individuals with the most knowledge of content moderation are the reviewers themselves. They also tend to be the ones with the least decision-making power to fix what is wrong. Coupled with minimal documentation and constant changes in technology, tools, and policies, the result is that nobody can confidently describe the whole decision-making process after a report has been made. It also means that public-facing employees, those responsible for responding to journalists’ questions, are often those with the least technical understanding of the issues being discussed. The scale and complexity of each of the many issues are so mind-boggling that, most of the time, nobody knows much about anything outside their direct sphere of expertise. If anything, the fact that ownership of processes is so diluted should be the main reason tech corporations are not allowed to operate in secrecy.


5. The shape of data matters

The reason a lot of content moderation is still such a labour-intensive process is that technology is usually a lot dumber than we give it credit for. Safiya Noble’s latest book, Algorithms of Oppression: How Search Engines Reinforce Racism, demonstrates the extent to which data reflects, amplifies, and reinforces every human bias. Data is shaping the world and what we know of the world, and this is becoming a literacy problem at all levels. With increasing concern about misinformation online, and growing pressure on social media companies to address it, we should first ask whether we really want those companies to become a "truth police", and what this means for digital content literacy if we build up an even bigger expectation that the digital world will mirror offline reality.


 
