Thank You For Your Transparency Report, Here’s Everything That’s Missing




Published date: October 13, 2020

Author: Svea Windwehr and Jillian C. York

Subject tag: Algorithmic systems | Data Access

Transparency must provide context: “In order to give users agency vis-à-vis automated tools, companies should explain what kind of technology and inputs are used at which point(s) of content moderation processes. Is such technology used to automatically flag suspicious content? Or is it also used to judge and categorize flagged content? When users report content takedowns, to which extent are they dealing with automated chat bots, and when are complaints reviewed by humans? Users should also be able to understand the relationship between human and automated review—are humans just ‘in the loop’, or do they exercise real oversight and control over automated systems? … Another important pillar of meaningful transparency are the policies that form the basis for content takedowns. Social media companies often develop these policies without much external input, and adjust them constantly. … Transparency reports should describe and explain how human and machine-based moderators are trained to recognize infringing content.”
[This entry was sourced, with minor edits, from the Carnegie Endowment’s Partnership for Countering Influence Operations and its baseline datasets initiative.]