Content Moderators: Guardians of the Online Galaxy
State of the Market Report

20 Aug 2021
by Rajesh Ranjan, Manu Aggarwal, Abhijnan Dasgupta, Aakash Jaiswal

While online platforms have contributed to a more connected world, they have also given rise to growing concerns over user safety. Major factors contributing to these concerns include an exponential increase in globally created User Generated Content (UGC), growing UGC variety (short videos, memes, GIFs, live audio streaming, etc.), the propagation of egregious content and fake news through online mediums, and a rise in malicious activities such as account takeovers.

Such activities, if left unchecked, may have significant repercussions for enterprises, including user attrition, loss of credibility and reputation, and a subsequent loss in revenue. To ensure a safe online environment for users, enterprises need to formulate strict Content Moderation (CoMo) and Trust and Safety (T&S) policies. A combination of Artificial Intelligence (AI) / Machine Learning (ML)-enabled technology and human moderators performs moderation at different levels.

In this research, we examine the latest trends defining the T&S sector, especially CoMo, and how these trends translate into enterprise focus areas (e.g., regulatory challenges and the well-being of content moderators). We also provide an overview of the T&S Business Process Services (BPS) market and the service provider landscape.

Scope

All industries and geographies

Contents

This report studies different aspects of the T&S sector:

  • Introduction to T&S – the growing significance of CoMo, types of CoMo, different industries with T&S applications, stakeholders in the T&S ecosystem, pitfalls of inadequate T&S measures
  • Key trends in the content moderation space
  • T&S focus areas for enterprises
  • T&S BPS market overview
  • Service provider landscape

Membership(s)

Trust and Safety

Sourcing and Vendor Management


Page Count: 56