Large-scale automated systems have tremendous influence over our day-to-day lives — from facial recognition technologies to targeted advertising to hiring algorithms to automated scoring in education. Elements of this control can seem downright dystopian, reminding us just how intertwined algorithmic systems are with almost every element of society. These decisions, made on our behalf by often unexplainable or opaque computing techniques, raise concerns about our ability to maintain autonomy and agency, and call into question how we experience the world.
At the core of the algorithmic bias critique is whether those who have the power to create, enable, and disseminate such systems do so equitably. Research suggests this is far from the case. Take, for example, research on an algorithm widely used in US hospitals to allocate health care to patients, which found that it had been systematically discriminating against Black people because it relied on total health-care costs accrued in one year as a signal of health need. This merely scratches the surface of case studies found across various verticals and domains. As such, we have seen a growing number of interventions — from technical solutions to regulatory propositions — that attempt to mitigate the potential harmful effects of algorithmic bias. For a longer analysis of this topic, we invite you to read our new white paper.
To better understand the impact of algorithmic bias and to elevate research like the study cited above, Aspen Digital has launched Data Stewardship for Good. The initiative is part of the work of the Global Inclusive Growth Partnership (GIGP), a collaboration between the Aspen Institute and the Mastercard Center for Inclusive Growth. Applying a values-based lens focused on fairness, accountability, and transparency, our team’s North Star throughout this process has been the question, “How can data and algorithmic operators recast our own models to forecast a different future, one that centers on the needs of the most vulnerable people?” The result is that we are beginning to articulate a Digital Bill of Rights in support of low-income, historically excluded communities that may be impacted by such automated systems. This thinking is detailed below.
Digital Bill of Rights In Support of Impacted Communities
The entities and institutions that gather and sell personal data hold the primary obligations to keep that data secure, use it fairly, and address any inequities. The list below details digital rights in support of impacted communities that organizations should recognize and act on in their commitment to data stewardship for good.
The Right To
- Clear, Transparent, and Accessible Information.
- Offerings Designed to Be Inclusive and Avoid Inappropriate Bias.
- Not Be Subjected to Automated Algorithmic Decision-Making When It Relates to Life-Changing Effects on Consumers’ Financial Status, Employment, Health and/or Education.
- Easily Correct Inaccurate and/or Incomplete Information Used by Automated Decision-Making Systems When Creating User Profiles.
- Privacy, with Minimal Data Collection Limited Only to Information Necessary to Provide Goods or Services Sought.
- Know When and How Personal Data Is Being Gathered and Used.
- Influence Algorithmic Impact Assessments and Audits.
We are not the first to highlight digital inequities in large-scale systems; research and advocacy on this issue spans decades. Our process has included an extensive literature review, in-depth interviews with scholars and experts across disciplines, and convenings of key cross-sector stakeholders to discuss current findings and potential solutions. One such solution is the Algorithmic Bias Playbook, geared toward C-suite leaders, technical teams, and public policy-makers. In our own effort, the Digital Bill of Rights reflects the concerns and feedback we have received, prioritizing and centering the needs of the most vulnerable people in our communities.
We recognize that this list is neither comprehensive nor complete, so we ask for further input as we refine the work and continue dialogues on digital rights. We also recognize that an individual-rights framework comes with its own limitations, and we urge those interested in rethinking this model to contact us. Again, our mission is to elevate the voices and ideas that have long pursued inclusion, equity, and justice, particularly in the digital realm.
To submit comments and/or suggestions on the Digital Bill of Rights, please fill out this form.
For more information and/or questions on Data Stewardship for Good, feel welcome to contact Kristine Gloria, Aspen Digital’s Director of Artificial Intelligence, at email@example.com.