Global Content Moderators Unite to Demand Better Conditions from Big Tech

Contract content moderators who review material for Big Tech platforms such as Meta, TikTok, and Google have formed the Global Trade Union Alliance of Content Moderators (GTUACM) to address poor working conditions. Facing trauma from harmful content, low wages, and job insecurity, these workers are seeking stable employment, mental health support, and union representation. The alliance spans multiple countries and aims to hold tech companies accountable while advocating for safer, more sustainable work environments.

Published April 30, 2025 at 09:10 AM EDT in Cybersecurity

Content moderators employed by major technology companies such as Meta, TikTok, and Google have united to form a global trade union alliance aimed at improving their working conditions. Known as the Global Trade Union Alliance of Content Moderators (GTUACM), this coalition was announced in Nairobi, Kenya, and represents contract workers who face significant challenges in their roles.

Content moderators are responsible for reviewing and flagging harmful online material, including violent videos, hate speech, and child abuse imagery. Despite the critical nature of their work, many moderators suffer from severe mental health issues such as depression, PTSD, and suicidal ideation due to prolonged exposure to traumatic content without sufficient support.

In addition to mental health challenges, these workers often face precarious employment contracts, unrealistic performance targets, and a lack of union representation. Many fear retaliation for speaking out about these issues, which deepens their vulnerability.

The GTUACM aims to provide a unified global platform for content moderators to collectively bargain with Big Tech companies. It also seeks to coordinate campaigns and conduct research focused on occupational health and safety within the industry. Currently, unions from countries including Ghana, Kenya, Turkey, Poland, Colombia, Portugal, Morocco, Tunisia, and the Philippines are involved, with others expected to join soon.

The alliance highlights the outsourcing practices of tech giants, which often distance themselves from direct responsibility for the welfare of contract moderators. Lawsuits have been filed against companies like Meta and TikTok by former moderators in Africa and elsewhere, citing psychological harm and retaliation for unionizing efforts.

Leaders within the union movement emphasize the need for living wages, stable employment contracts, humane working conditions, and genuine worker representation. They call on investors and Big Tech companies to prioritize the health and dignity of content moderators who play a vital role in maintaining safe online environments.

This global movement underscores the broader significance of ethical labor practices in the digital economy, especially as content moderation becomes increasingly central to platform governance and user safety. The alliance’s efforts represent a critical step toward sustainable and humane working conditions in the tech industry.
