African tech workers press global social media giants for better conditions
African content moderators urge tech giants to provide adequate mental health care and fair pay in recognition of their grueling but vital role.
Mophat Okinyi hates remembering the job he used to do for ChatGPT.
For about $1.50 an hour, he read hundreds of descriptions of pedophilia and incest for the artificial intelligence platform every day. As a quality analyst, his job was to confirm that his subordinates had read and classified potentially harmful content correctly.
"Some of these things are very shocking," says Mr. Okinyi. "We shouldn't even talk about some of these texts."
The work left him so traumatized, he says, that he drifted apart from his family and eventually separated from his wife. "If you put so much dirty content in your mind, it changes you," he says.
Now, Mr. Okinyi and some 150 other content moderators working for tech powerhouses, including Facebook and TikTok, hope to form a union to improve their pay and working conditions.
"We're trying to make this job safe for those who will do it in future and those who are doing it right now," says Mr. Okinyi.
Their decision to unionize shines a spotlight on the way tech giants use human labor in Africa, where, through outsourcing, they hire hundreds of people to remove harmful content from their platforms. African content moderators hope to force tech corporations to provide adequate mental health care and fair pay for everyone who works for them, including non-traditional employees such as themselves.
"Unionization signals that gig work rights are labor rights, and workers deserve the protections provided by law in this field," says Nanjira Sambuli, a Nairobi-based tech and international affairs fellow at the Carnegie Endowment for International Peace.
Since last year, Facebook's parent company, Meta, has been facing lawsuits brought by content moderators in Kenya accusing it of union busting, wrongful terminations and insufficient psychological support, among other infringements. In one case, Meta claimed it was not the moderators' employer, and was therefore not liable. The court ruled Meta was the "true employer" and the "owner of the digital work of content moderation."
"That's the most significant labour rights decision about content moderation I have seen from any court anywhere," says Cori Crider, co-founder and director of Foxglove, a London-based non-profit that's providing legal advice to the moderators. "If Facebook is held the true employer of these workers, then the days of hiding behind outsourcing to avoid responsibility for your critical safety workers are over."
But the decision isn鈥檛 set in stone yet, as Meta awaits the outcome of an appeal.
Outsourcing responsibility
For years, tech platforms have faced intense criticism around the world for failing to filter divisive content. In Africa, Facebook came under fire last year for alleged inaction over hateful material that eventually incited violence during the war in northern Ethiopia. A study last month by Global Witness found extreme and hate-filled ads were approved by YouTube, Facebook and TikTok in South Africa, where xenophobic violence has flared up in recent years.
As a result, tech powerhouses have invested heavily in removing material including hate speech, misinformation and incitement to violence from their platforms. Many of the workers who undertake this vital but grueling task are hired through outsourcing companies and are based in countries like Kenya, India and the Philippines, which supply skilled labor at low cost.
In Nairobi, a regional tech hub, outsourcing companies bring talent from numerous African countries to moderate content in different African languages. They include Sama, a San Francisco-headquartered company, which has contracted workers for Facebook and ChatGPT. Majorel, headquartered in Luxembourg, hires labor for Facebook and TikTok.
Global tech companies believe that by outsourcing, they can escape responsibility, says Odanga Madung, a senior researcher at the Mozilla Foundation in Nairobi, whose work focuses on the impact of tech platforms in Africa.
"Irresponsibility has always been good business in the capitalist contexts," he says. "Taking care of people is expensive, more so if you're exposing them to graphic content on behalf of your users."
Accusations of exploitation of content moderators are not unique to Africa. Moderators in the US and Ireland have in the past sued Facebook for mental health issues related to their work. In Germany, a Berlin-based trade union called Verdi has recently been helping content moderators for TikTok and Facebook to unionize.
But the move by African workers to form a union is a novel approach outside the West. At the top of members' list are regular, professional mental health checkups and pay standardized with that of their peers across the world.
After graduating from university, Mr. Okinyi joined Sama in Nairobi in 2019 for his first job. He worked on various projects for different foreign tech companies, doing data labeling, product classification and other tasks. But it was his content moderation work for ChatGPT, starting in 2021, that affected him in unforeseen ways.
ChatGPT is a chatbot that takes in a user's question and, using a language model trained on text from the web, provides an answer. Critics say its reliance on mining the internet makes it vulnerable to toxic material.
For the six months that he worked on ChatGPT, Mr. Okinyi's work began early, ended late, and left him emotionally drained.
Every day at work, he read some 700 texts about child sexual abuse and flagged them according to their severity. Over eight-hour shifts of reading and labeling this material, he enabled ChatGPT to filter out harmful requests.
Although Sama provided counselors, Mr. Okinyi says, productivity demands at work meant he and other workers barely had time to see them.
OpenAI, ChatGPT鈥檚 developer, didn鈥檛 respond to a request for comment.
The situation was equally appalling for Facebook content moderators at Sama. Kauna Ibrahim, a Nigerian, spent four years watching hundreds of horrific videos every day at work, including sexual abuse and beheadings. For roughly $3 an hour, she assessed whether the videos were in violation of Facebook's policies.
During her first year of work, she began suffering panic attacks.
"Some of the images never leave you. You find yourself unable to sleep. Sometimes you dream of what you have seen," says Ms. Ibrahim, who was a graduate student in clinical psychology at the time. "But because you do it every day, you just survive."
Sama's therapists weren't qualified and didn't provide enough psychological support, Ms. Ibrahim says. So she resorted to seeking her own therapist.
Sama says it provides "qualified and licensed" professionals to provide therapy for its workers, and that it uses an "internationally-recognized" methodology to set wages for its workers, making its pay "internationally comparable and locally specific."
Meta declined to comment because of the ongoing lawsuits.
Ms. Ibrahim was among 260 workers whose contracts were terminated in March 2023 after Sama stopped doing work for Facebook. Sama's work for ChatGPT ended in March 2022, and Mr. Okinyi later moved to Majorel to do customer service work for a European e-commerce company.
On Labor Day this year, both Mr. Okinyi and Ms. Ibrahim sat alongside about 150 other content moderators for Facebook, TikTok and ChatGPT in a Nairobi hotel, and voted to unionize. Mr. Okinyi understands the fight may not be over, but he's willing to keep pushing with his colleagues to ensure their voices are heard.
"We want to be united because if we're united, we become strong," he says.