I was a content moderator for Facebook. I saw the real cost of outsourcing digital labour | Sonia Kgomo


Mark Zuckerberg might be done with factchecking, but he cannot escape the truth. The third richest man in the world announced that Meta will replace its independent factchecking with community notes. I went to the AI Action Summit in Paris this week to tell tech execs and policymakers why this is wrong.

Instead of scaling back programmes that make social media and artificial intelligence more trustworthy, companies need to invest in and respect the people who filter social media and who label the data that AI relies on. I know because I used to be one of them.

A mum of two young children, I was recruited from my native South Africa with the promise of joining Kenya's growing tech sector, working as a content moderator for a Facebook subcontractor, Sama. For two years, I spent up to 10 hours a day staring at child abuse, human mutilation, racist attacks and the darkest parts of the internet so you did not have to.

It was not just the type of content I had to watch that gave me insomnia, anxiety and migraines; it was the quantity too. At Sama we had something called AHT, or action handling time: the amount of time we were given to analyse and rate a piece of content. We were being timed, and the company measured our success in seconds. We were constantly under pressure to get it right.

You could not stop if you saw something traumatic. You could not stop for your mental health. You could not stop to go to the bathroom. You just could not stop. We were told the client, in our case Facebook, required us to keep going.

This was not the life I imagined when I moved to Nairobi. Isolated from my family, my only real community was my colleagues at Sama and other outsourcing companies. When we gathered, our conversations always circled back to the same thing: our work, and the way it was breaking us.

The more we talked, the more we realised something was happening that was bigger than our personal stories. Every content moderator, data annotator and AI worker we met had the same stories: impossible quotas, profound trauma and a disregard for our wellbeing.

It was not just a Sama problem. It was not just a Facebook problem. It was the way the entire tech industry operated – outsourcing the most brutal digital labour and profiting from our pain.

These issues are currently the subject of a class action lawsuit in Kenya brought by 185 former content moderators against Sama and Facebook’s owner, Meta, as reported by the Guardian. When approached for comment, Sama said that as of March 2023 it was no longer involved in content moderation and no longer employed content moderators. It added that the Kenyan court has requested that the parties not speak to the media about the current litigation.

I left Sama two years ago. Since then, the problems have got worse. I know this through helping data supply chain employees working for other outsourcing companies organise in Nairobi through African Tech Workers Rising. Workers are still traumatised, and the work is more intense. Content moderators are having to watch videos at 2x or 3x speed on several screens at once. The pay and conditions are no better, with wages as low as US$0.89 (70p) an hour for some data workers and US$2 an hour for content moderators.

Things cannot continue as they are, but Zuckerberg’s approach of weakening protections is the wrong course. This work needs to be professionalised. We need standards for workers such as content moderators that recognise the difficulties of the work and respect our rights. This means training and real health and safety protocols like in any other profession. This means ensuring a living wage and setting reasonable work quotas. This means creating a framework that respects our humanity and dignity. This means having a union.

Meta declined to comment on the specific claims while litigation is ongoing, but said it required outsourcing firms to offer counselling and healthcare, and pay above local industry standards – and said it provided technical solutions to limit exposure to graphic materials, such as blurring and an opt-out of the autoplay function, whereby videos or pictures are played in a nonstop stream.

We cannot wait for tech companies to fix this problem. African Tech Workers Rising is organising for better wages, mental health protections and professional standards in Kenya and beyond. We are doing this because AI is not magic. Behind every algorithm are thousands of hidden workers labelling, training and moderating data under precarious conditions. The labour powering AI remains invisible because many would rather focus on technological innovations than on the supply chains sustaining them.

If you believe in a safer, more ethical internet, stand with us: support our organising efforts, push policymakers to regulate big tech and demand that AI and social media companies respect all their workers. Change will not come from above – it will come from us. That is the truth.

  • Sonia Kgomo is an organiser with African Tech Workers Rising, a project supported by UNI Global Union and the Communications Workers Union of Kenya

