Transparency Requirements for Digital Social Media Platforms: Recommendations for Policy Makers and Industry

Type: Submission/proposal/advocacy/recommendation

Website/link: https://www.ivir.nl/publicaties/download/Transparency_MacCarthy_Feb_2020.pdf

Published date: February 12, 2020

Author: Mark MacCarthy

Subject tags: Algorithmic systems | Data Access | Privacy and data protection

Recommends a tiered system for transparency: disclosures about content moderation programs and enforcement procedures, together with transparency reports, are aimed at the general public; disclosures about prioritization, personalization, and recommendation algorithms are provided to vetted researchers and regulators; and vetted researchers are also given access to anonymized data for conducting audits of content moderation programs, while personal data and commercially sensitive data are available only to regulators. Anonymized data in the archive would be protected by privacy techniques such as k-anonymity and differential privacy and by security measures guarding commercial secrets. All users of the archive would be under a contractual obligation to avoid any attempt to reidentify the individuals involved, and violating the prohibition on reidentification would be grounds for suspension of access to the complaint archive.
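
The report names k-anonymity and differential privacy as protections for the research archive but does not prescribe an implementation. Below is a minimal Python sketch of how a platform might apply both before releasing moderation statistics to vetted researchers; the record fields, parameter values, and function names (is_k_anonymous, noisy_count) are illustrative assumptions, not taken from the report.

    """Illustrative sketches of the two privacy techniques named in the
    report. Fields and parameters are hypothetical, not from the report."""
    from collections import Counter

    import numpy as np


    def is_k_anonymous(records, quasi_identifiers, k):
        """Return True if every combination of quasi-identifier values is
        shared by at least k records, so no individual stands out."""
        groups = Counter(
            tuple(record[attr] for attr in quasi_identifiers)
            for record in records
        )
        return all(count >= k for count in groups.values())


    def noisy_count(true_count, epsilon, sensitivity=1.0):
        """Release a count under epsilon-differential privacy by adding
        Laplace noise scaled to the query's sensitivity."""
        return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)


    # Hypothetical release of content-moderation statistics to vetted researchers.
    records = [
        {"country": "NL", "age_band": "18-25", "action": "removed"},
        {"country": "NL", "age_band": "18-25", "action": "demoted"},
        {"country": "NL", "age_band": "18-25", "action": "removed"},
        {"country": "US", "age_band": "26-40", "action": "removed"},
        {"country": "US", "age_band": "26-40", "action": "delayed"},
        {"country": "US", "age_band": "26-40", "action": "removed"},
    ]
    print(is_k_anonymous(records, ("country", "age_band"), k=3))  # True
    print(noisy_count(sum(r["action"] == "removed" for r in records), epsilon=0.5))

A smaller epsilon adds more noise, trading accuracy in the released aggregate statistics for stronger privacy; calibrating that tradeoff for regular and ongoing audits would fall to the regulators and vetted researchers the report describes.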

1. Continued and improved public disclosure of the operation of platform content moderation programs, including:
   a. Content rules in terms of service or community standards;
   b. Enforcement techniques such as deleting, demoting or delaying content;
   c. Procedures for the public to complain about possible rules violations;
   d. Procedures for platforms to explain their decisions to affected parties; and
   e. Procedures for individual appeals in connection with enforcement actions.
2. Continued and enhanced reports to government agencies and to the public with aggregate statistics accurately reflecting the operation of the content moderation programs.
3. Technical terms of reference of algorithms used in content moderation, prioritization and recommendation.
4. Greatly improved access to platform data for qualified independent researchers and regulators. Access to information must be in a form and quantity to permit regular and ongoing audits of these platform operations to verify that they are operating as described and intended, and should include data relevant to:
   a. the operation of content moderation programs;
   b. sponsorship of political advertisement; and
   c. content-ordering techniques, including recommendation and prioritization algorithms.
[This entry was sourced with minor edits from the Carnegie Endowment’s Partnership for Countering Influence Operations and its baseline datasets initiative. You can find more information here: https://ceip.knack.com/pcio-baseline-datasets]