Page Last Updated: March 12, 2020
Content Moderation on Internet Platforms
The purpose of this research is to evaluate technology companies’ management of user-generated content, advertisements, and other third-party content, and to assess the implications for social capital.
Technology companies are grappling with how to moderate user-generated content, political ads, and other third-party content hosted on their platforms. The related issues, including terrorist content, disinformation, and hate speech, are not captured in SASB standards, despite being potentially significant value drivers and of increasing interest to investors.
In this research project, SASB staff will:
- Define what SASB will consider “content moderation” issues (e.g., removal of illegal or objectionable user-generated content, review of third-party ads or products, concerns regarding the mental health of content reviewers) and the industries/business activities to which they apply
- Categorize content moderation issues through the lens of SASB’s General Issue Categories (customer welfare, product safety, employee health & safety, etc.)
- Establish an initial, evidence-based view on whether and how content moderation issues financially affect relevant companies
- Provide a recommendation to the Standards Board on whether to proceed to standard-setting
Research Project – Added to Research Program
The project lead is currently consulting with investors, issuers, and subject matter experts and conducting other primary research.
Board Meeting Outcomes
- The Board discussed a new research project on Content Moderation on Internet Platforms.
For a full list of past and upcoming Standards Board meetings, as well as associated materials, please visit our Board Meetings Calendar & Archive.
The staff has prepared this summary for informational purposes only. Any Standards Board decisions are tentative and do not change current accounting. Official positions of SASB are determined only after extensive due process and deliberations.