June 21, 2023
Thomas Struett, Adam Zable, and Susan Ariel Aaronson, Ph.D.
Subject tags: Algorithmic systems | Algorithmic transparency | Data access | Data governance | Data mapping | Data policy
We live in an era of data dichotomy. On one hand, generative AI systems such as ChatGPT have made large datasets ever more valuable and visible.1 AI developers rely on these large datasets to "train" AI systems about the world and to shape how these systems respond to user prompts and questions. On the other hand, AI designers, developers, and deployers know that AI models are only as good as the data used to train them. Yet they put little effort into ensuring that those datasets are complete, consistent, verifiable, and usable.
Moreover, without effective rules and frameworks for governance, data, and thus AI, will never fully meet its potential to help researchers and policymakers solve problems.
This report summarizes our third iteration of findings for the
Global Data Governance Mapping Project, which began in 2020