Content moderation plays a pivotal role in maintaining safe and respectful online environments, but it comes at a significant human cost. An estimated 100,000 people work as commercial content moderators worldwide, and these professionals often face immense psychological and emotional strain as they sift through disturbing and harmful user-generated content (UGC).
Their job, which centers on enforcing social media platforms' terms of service and community guidelines, involves reviewing and categorizing vast volumes of content. While much of this material is benign, the sheer quantity of non-compliant content, including graphic text, images, audio, and video, can take a severe toll on their well-being.
Curious about the struggles dating app moderators face on the job? Click through the gallery to learn more.