November 12, 2020

Building a Foundation to Explore Online Content Moderation

Greg Waters
Analyst, Sector Lead – Technology & Communications, SASB

Just under a year ago, the Sustainability Accounting Standards Board (SASB) initiated a research project to explore content moderation on internet platforms. Recognizing that these platforms, particularly social media companies, have connected billions of individuals around the world and transformed the way people communicate, the project has explored high-profile challenges these platforms face related to harmful content, user freedom of expression, and worker safety. For much of the year, SASB staff has engaged with companies, investors, and subject matter experts to develop a clearer picture of the social issues related to harmful online content and the business activities to which they might apply.

The purpose of this work has been to distill the complex and ambiguous issue of content moderation into a defined set of issues, business impacts, and management practices. Such a taxonomy could then serve as the basis for SASB to evaluate the need for standard-setting in relevant industries. Much of this work is now summarized and publicly available in our Content Moderation Taxonomy document.

With the release of this document, SASB hopes to draw market attention to areas of ongoing research, solicit further input on the body of evidence cited, and provide insight into SASB’s research process.

Below, we answer a few common questions to help market participants understand both the role this taxonomy plays in SASB’s research and standard-setting process and the issue of content moderation itself.

Does the taxonomy change SASB Standards?

No. The taxonomy is a document that establishes SASB staff’s preliminary views on the social implications of harmful online content and will be used as a framework for further research and standard-setting activities across the technology sector. It does not represent the view of SASB’s Standards Board on these issues, nor does it evaluate the financial impacts of content moderation themes on relevant companies.

However, while the taxonomy itself does not change SASB Standards, it helped lead to the Standards Board’s recent decision to initiate standard-setting in the Internet Media & Services industry on the topics of harmful online content and user freedom of expression. Interested parties are encouraged to subscribe to updates and engage with the project lead through the project page on SASB’s website.

What is content moderation?

The taxonomy defines content moderation as the processes and procedures used to detect and potentially take action on a range of illegal or unwanted content. Such actions may include removal of user-generated content, making content more difficult to access, and banning users.

Although content moderation is most commonly associated with social media platforms, a variety of companies operating at other levels of internet infrastructure make decisions akin to content moderation. For example, e-commerce platforms may monitor goods sold by third-party sellers, and providers of website-building tools may decline to do business with the creators of objectionable websites.

Viewed through a broader lens that also includes the systems determining what users see on a given platform (such as algorithms that prioritize content or decide which advertisements users are shown), these activities fall under the more encompassing label of “content governance.”

What topics does the taxonomy address?

The taxonomy organizes the social externalities tied to online content into four issues:

  • Harmful Content describes a variety of content created and disseminated on the internet that has negative social implications, including child sexual abuse material (CSAM), terrorist and violent extremist content (TVEC), disinformation, hate speech, and some forms of misinformation. Businesses provide services that can facilitate the creation and spread of harmful content in a variety of ways.
  • Freedom of Expression explores how a platform’s decisions to restrict or remove content can affect users. This is an area fraught with nuance and risk for platforms, many of which must determine what forms of content they are comfortable hosting or amplifying and how much responsibility they should take for that content.
  • Privacy covers the implications for user privacy when companies search for harmful or problematic content. It also examines the role that the collection of user data plays in determining which content people see on social media platforms.
  • Employee Health & Safety explores how workers tasked with evaluating and removing harmful content can suffer adverse mental health effects, including post-traumatic stress disorder.

What types of businesses does the taxonomy include?

The taxonomy identifies a variety of businesses exposed to these themes, including:

  • Social media platforms
  • Messaging services
  • Gaming platforms and publishers
  • Cloud services and other internet infrastructure
  • Business process outsourcing firms
  • Internet service providers
  • E-commerce platforms

These businesses fall under the following industries, as defined by SASB’s Sustainable Industry Classification System® (SICS®): Internet Media & Services, Software & IT Services, Telecommunication Services, and E-Commerce.

Why does SASB currently have two active projects on this issue?

The research project “Content Moderation on Internet Platforms” was established in December 2019 and, like other SASB research projects, set out to determine whether standard-setting is warranted. The standard-setting project “Content Governance in the Internet Media & Services Industry” is further along in our process; its direct objective is to evaluate adding relevant issues to the industry standard.

What is next for SASB’s research in this area?

The content moderation research project will remain open as staff continues to seek stakeholder input and evaluate additional areas that may warrant standard-setting in the future, including worker health and safety.

However, staff’s priority in the coming months will be the content governance standard-setting project. Staff anticipates beginning targeted consultations with technology sector analysts, internet platforms large and small, and subject matter experts in early 2021 to support the development of an exposure draft standard. The project will be discussed further at Standards Board meetings in 2021, and we encourage interested parties to listen in, subscribe to project alerts, or contact the project lead to learn more or provide input.

Greg Waters is a SASB Analyst and Sector Lead for Technology & Communications.