Sama, a company specializing in content moderation, has recently found itself in the middle of controversy due to a series of lawsuits filed against it and its former partner, Meta (formerly known as Facebook). The latest lawsuit, filed by 43 content moderators who claim to have been unlawfully terminated, has put both companies in a tight spot.
According to the moderators, Sama dismissed them without proper notice and without paying them the compensation they are entitled to under Kenyan law. They also allege that Meta’s new content moderation partner, Majorel, has blacklisted all of Sama’s previous employees, making it difficult for them to find new jobs.
The case is just the latest in a series of legal challenges facing Sama and Meta in Kenya. In 2022, Sama and Meta were sued by former content moderator Daniel Motaung, who accused the companies of forced labor and human trafficking, unfair labor practices, union busting, and failure to provide adequate mental health and psychosocial support. Motaung was allegedly laid off for organizing a 2019 strike and trying to unionize Sama’s employees.
Sama dropped its content review contract with Meta in response to the lawsuit and shifted its focus to labeling work (computer vision data annotation). However, the move did not end the controversy surrounding the company. Last December, Ethiopian petitioners filed a lawsuit against Meta, claiming that the social media giant had failed to employ adequate safety measures on Facebook, thereby fueling the conflict that led to the deaths of an estimated 500,000 Ethiopians during the Tigray War, including the father of one of the petitioners.
The closure of Sama’s content moderation arm earlier this year, which affected 200 employees, was also controversial. Sama said the closure was necessary to streamline operations, but reports indicated that some moderators were left without valid work permits. Sama encouraged affected staff to apply for other roles at its Kenya and Uganda offices.
The controversies surrounding Sama and Meta highlight the challenges of content moderation in Africa, where social media use is on the rise and hate speech, misinformation, and violence are prevalent. The lawsuits also raise questions about the responsibility of tech companies in ensuring the well-being of their content moderators, who are often exposed to traumatic content and work under difficult conditions.