Report of the Facebook Data Transparency Advisory Group

Research report

Website/link: https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf


Published date: April 1, 2019

Author: Ben Bradford, Florian Grisel, Tracey Meares, Emily Owens

Subject tag: Advertising | Data Access | Privacy and data protection | Terrorism and violent extremism

Advisory group commissioned by Facebook to evaluate the company’s Community Standards Enforcement Report (CSER). Its recommendations include:

- Prioritize releasing accuracy rates for both human and automated decisions.
- Release reversal rates from appeals separately. Rates of reversal on appeal should be made public, but they should not stand in as the sole public metric of accuracy.
- Share statistics on human reviewers’ inter-rater reliability, which differs from the measures of accuracy, calculated via review of human decisions, referenced above.
- Check reviewers’ judgments not only against an internal ‘correct’ interpretation of the Standards, but also against a survey of users’ interpretations of the Standards.
- Report prevalence two ways: (1) the number of violating posts as a proportion of the total number of posts; (2) the number of views of violating posts as a proportion of all views. In V1 and V2 of the CSER, Facebook reported only the second metric. (See the sketch after this list.)
- Explore ways of relating prevalence metrics to real-world harm, e.g., is an increase in the prevalence of hate speech posts correlated with an increase in ethnic violence in the region, or an increase in removals of hate speech posts correlated with a decrease in ethnic violence?
- Explore ways of accounting for the seriousness of a violation in the prevalence metrics. Global terrorism content, for example, may or may not include graphic violent imagery.
- Report prevalence measures in sub-populations, e.g., specific geographic regions or languages.
- Break out actioned-content measures by type of action taken (e.g., content taken down, content covered with a warning, account disabled).
- Report actioned content as a proportion of total estimated violating content.
- Explore ways of accounting for changes in the Standards and changes in technology when reporting metrics in the CSER.
- Explore ways to enhance bottom-up (as opposed to top-down) governance; these models are described in more depth in Part IV.A of the report.
- The group identifies a number of specific ways Facebook could build elements of procedural justice (participation and voice, fairness, conveying trustworthy motives, treating people with respect and dignity) into its process for Community Standards enforcement.
- For the sake of transparency, the group recommends that Facebook explore ways of releasing anonymized and aggregated versions of the data upon which the CSER metrics are based, which would allow external researchers to verify Facebook’s representations.
- The group identifies a number of specific ways Facebook could modify the formatting, presentation, and text of the CSER documents to make them more accessible and intelligible to readers.

(Pp. 7-9 of the report.)
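The two recommended prevalence definitions are easy to conflate, so the minimal sketch below illustrates them on a toy sample of posts. The `Post` class, its fields, and the numbers are hypothetical illustrations for this entry only; they are not Facebook's sampling methodology or the report's notation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    views: int        # number of times the post was viewed
    violating: bool   # label from review of a sampled post

def post_prevalence(posts):
    """Metric (1): violating posts as a proportion of all posts."""
    return sum(p.violating for p in posts) / len(posts)

def view_prevalence(posts):
    """Metric (2): views of violating posts as a proportion of all views
    (the only metric reported in V1 and V2 of the CSER)."""
    total_views = sum(p.views for p in posts)
    violating_views = sum(p.views for p in posts if p.violating)
    return violating_views / total_views

sample = [Post(views=1200, violating=False),
          Post(views=40, violating=True),
          Post(views=300, violating=False)]
print(post_prevalence(sample))  # 0.333... -> 1 of 3 posts violates
print(view_prevalence(sample))  # ~0.026  -> ~2.6% of views are of violating content
```

The toy numbers show why the report wants both figures: a violation that few people see can make up a large share of posts but a small share of views, and vice versa.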
[This entry was sourced with minor edits from the Carnegie Endowment’s Partnership for Countering Influence Operations and its baseline datasets initiative. You can find more information here: https://ceip.knack.com/pcio-baseline-datasets]