Meta Ends Sama Contract; 1,108 Kenyan AI Trainers Laid Off

Meta ended its contract with Kenyan subcontractor Sama, a decision that Sama says will lead to 1,108 redundancies, according to BBC reporting. The action followed February reporting by Swedish outlets Svenska Dagbladet and Goteborgs-Posten that some Kenya-based workers had reviewed sensitive footage captured by Ray-Ban smart glasses; BBC and The Guardian report Meta had paused work with Sama while investigating. Meta told BBC News it had "decided to end our work with Sama because they don't meet our standards," while Sama issued a statement defending its operational and security practices. UK and Kenyan regulators have opened inquiries, and labour advocates have criticised the speed and scale of the layoffs, per The Guardian and BBC.
What happened
Meta ended its contract with Nairobi-based outsourcing firm Sama, and Sama said the termination would result in 1,108 Kenyan staff being made redundant, according to BBC reporting. The contract halt followed February reporting by Swedish newspapers Svenska Dagbladet and Goteborgs-Posten that some Sama workers had reviewed videos recorded by Ray-Ban smart glasses; BBC and IBTimes summarised the original investigative accounts as describing workers seeing highly private footage. Meta told BBC News it had "decided to end our work with Sama because they don't meet our standards," and Meta also told BBC it had paused work with Sama while looking into the claims. Sama issued a public statement saying it "has consistently met the operational, security and quality standards required across our client engagements, including with Meta," as reported by BBC and The Guardian.
Technical details
Per public reporting, the disputed material originated from wearable-device footage that Meta sometimes routes into human review workflows when users share recordings with Meta AI, a practice Meta acknowledged in contemporaneous coverage cited by BBC. News reports also state that Meta applies automated and human filtering measures, such as face-blurring, before contractor review, according to an IBTimes summary of Meta's comments. The accounts in the Swedish outlets described content reviewers being exposed to images and video showing intimate or private moments, including people changing clothes and sexual activity; those worker accounts were published by Svenska Dagbladet and Goteborgs-Posten and then amplified by international outlets, per BBC and The Guardian.
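The reported workflow — user-shared footage passing through automated redaction before reaching contractor reviewers — can be sketched in outline. This is an illustrative model only, not Meta's actual system; the `Clip` schema, the routing states, and the stubbed `pre_filter` step are all assumptions for the sake of the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    """A user-shared recording queued for review (hypothetical schema)."""
    clip_id: str
    user_opted_in: bool           # user explicitly shared the clip for AI review
    faces_blurred: bool = False
    flags: list = field(default_factory=list)

def pre_filter(clip: Clip) -> Clip:
    # Stand-in for automated redaction such as face-blurring; a real system
    # would run a vision model here. This sketch only marks the clip.
    clip.faces_blurred = True
    return clip

def route_for_review(clip: Clip) -> str:
    """Gate a clip before it can reach a human contractor."""
    if not clip.user_opted_in:
        return "discard"          # never send unshared footage downstream
    clip = pre_filter(clip)
    if not clip.faces_blurred:
        return "hold"             # redaction failed: keep out of contractor queues
    return "contractor_review"
```

The point of the gate ordering is that consent and redaction checks happen before any routing decision that exposes footage to outsourced reviewers; the Swedish reporting turned on material that, per the worker accounts, reached reviewers despite such controls.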
Industry context
Editorial analysis: Content pipelines that mix user-captured imagery, automated pre-processing, and downstream human review are a common pattern across companies that train and validate vision systems. Observers note that wearable devices create distinct privacy and consent challenges because recordings can capture bystanders and private settings, and those challenges complicate vendor relationships when sensitive material reaches outsourced reviewers.
Regulatory and labour response
Reporting states regulators have taken notice: the UK Information Commissioner's Office opened inquiries after the Swedish reporting, and Kenya's Office of the Data Protection Commissioner announced an investigation, per BBC. Labour and civil-society groups quoted by The Guardian and other outlets described the layoffs as sudden and harmful; The Guardian cited Oversight Lab calling the dismissals "devastating and shocking." Sama said it was supporting affected employees and described its compensation and wellness provisions in statements reported by The Hindu and The Guardian.
What to watch
Editorial analysis: Observers should track:
- findings or guidance from the UK ICO and Kenya's data-protection authority about whether sharing wearable-device footage with contractors met applicable privacy and consent standards
- any disclosures from Meta or vendors about changes to data-handling, pre-filtering, or contractor oversight policies
- legal or labour actions from affected workers or advocacy groups, which outlets including The Guardian report are being considered

These indicators will clarify whether this episode leads to sector-wide policy or contractual changes around wearable-device data and third-party reviewers.
Practical implications for practitioners
Editorial analysis: Teams building or contracting for vision-model data pipelines should treat wearable-sourced content as higher-risk for consent and privacy exposures and review contractual controls, automated pre-filtering effectiveness, and reviewer wellbeing supports. Industry reporting on this case highlights the operational and reputational risks that arise when sensitive user-captured footage enters outsourced human-review workflows.
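The review points named above (contractual controls, pre-filtering effectiveness, reviewer wellbeing) can be captured as a simple pre-engagement checklist. The control names below are illustrative assumptions, not a standard or a known vendor questionnaire.

```python
# Hedged sketch: required controls for an outsourced-review engagement,
# reflecting the risk areas discussed above. Names are illustrative.
REQUIRED_CONTROLS = {
    "explicit_user_consent_recorded",
    "automated_prefilter_verified",
    "contractor_access_logged",
    "reviewer_wellbeing_program",
}

def missing_controls(vendor_controls: set) -> set:
    """Return the required controls a vendor engagement has not documented."""
    return REQUIRED_CONTROLS - set(vendor_controls)
```

A non-empty result would flag an engagement for further diligence before wearable-sourced content is routed to that vendor.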
Scoring Rationale
The story has notable implications for AI training supply chains, contractor management, and privacy compliance, all areas relevant to practitioners who build or procure vision-data workflows. It is not a frontier technical development, so its impact is significant but not transformational.