December 1, 2020
Subject tags: Advertising | Algorithmic systems | Data Access | Government transparency | Privacy and data protection
Companies should disclose comprehensive and systematic data and other information that enables users, as well as researchers, policymakers, investors, civil society, and other third parties, to understand clearly how platforms and services restrict or shape speech and how they assess, mitigate, and provide redress for risks to users. In particular, they should:

Publish transparency reports on the enforcement of their rules: Such reports should be released regularly and include comprehensive data on the volume and nature of content that is restricted, blocked, or removed; what kinds of restrictions the company puts in place; and why. Companies should report this data in disaggregated form, providing the same high level of detail for every country in which they operate.

Regularly report on demands from governments and other third parties: Companies should publish comprehensive information about their processes for reviewing censorship demands, demands for user data, and network shutdowns (in the case of telecommunications companies). For each of these categories, they should release detailed data on the demands they receive, including the number, nature, and legal basis of demands made, as well as the agency or entity making them. They should disclose any legal reasons preventing them from being fully transparent in these areas. They should also commit to notifying users when their data has been requested, or to providing legal justification when they are unable to do so.
[This entry was sourced with minor edits from the Carnegie Endowment’s Partnership for Countering Influence Operations and its baseline datasets initiative. You can find more information here: https://ceip.knack.com/pcio-baseline-datasets]