While online platforms have contributed to a more connected world, they have also given rise to growing concerns over user safety. Major factors contributing to these concerns include an exponential increase in globally created User Generated Content (UGC), increasing variety in UGC formats (short videos, memes, GIFs, live audio streaming, etc.), the propagation of egregious content and fake news through online channels, and a rise in malicious activities such as account takeovers.
Such activities, if left unchecked, may have significant repercussions for enterprises, including user attrition, loss of credibility and reputation, and subsequent loss of revenue. To ensure a safe online environment for users, enterprises need to formulate strict Content Moderation (CoMo) and Trust and Safety (T&S) policies. A combination of Artificial Intelligence (AI) / Machine Learning (ML) enabled technology and human moderators performs this moderation at different levels.
In this research, we examine the latest trends defining the T&S sector, especially CoMo, and how these trends translate into enterprise focus areas (e.g., regulatory challenges and the well-being of content moderators). We also provide an overview of the Business Process Services (BPS) market and the service provider landscape.
All industries and geographies
This report studies different aspects of the T&S sector: