Aspen Institute’s Commission on Information Disorder Makes Recommendations to Address the Mis- and Disinformation Crisis

November 15, 2021

Topics covered include systemic issues that create fertile ground for disinformation, accountability mechanisms for “superspreaders” of lies, and modifications to Section 230 of the Communications Decency Act

 

Contact:
Carner Derron
Marketing & Communications Manager
Aspen Digital
Carner.Derron@aspeninstitute.org

 

Washington, D.C., November 15, 2021 – The Aspen Institute’s Commission on Information Disorder is making 15 recommendations to help government, private industry, and civil society advance solutions to America’s urgent mis- and disinformation crisis and reduce its greatest harms. Among many other critical challenges, the ambitious report covers legislative and executive action on transparency, disclosure, and platform immunity; the collapse of local journalism; community-led methods for resisting imbalances of power further propagated by bad actors; and accountability mechanisms for “superspreaders” of lies.

Published in the Commission’s Final Report, launched today, the recommendations together aim to increase transparency and understanding, build trust, and reduce harms. A summary of each is provided at the end of this press release, along with a list of the commissioners.

Co-chaired by award-winning journalist Katie Couric, cybersecurity expert Chris Krebs, and civil rights leader Rashad Robinson, the Commission is composed of experts from a diverse set of backgrounds, qualifications, and lived experiences, who grappled with the trade-offs and competing priorities necessary to fully diagnose the problems and identify solutions. The co-chairs serve as authors of the report, supported by contributions from the other commissioners. This Aspen Digital initiative was funded by Craig Newmark Philanthropies.

“The biggest lie of all, which fuels those who spread disinformation, is that this crisis of trust and truth is uncontainable,” said Katie Couric. “Leadership from every sector of society and a clear framework for action is essential. Providing a potential blueprint for change was the primary goal of these recommendations.”

“There are incentive structures that manufacture information disorder to further power and financial objectives,” said Chris Krebs. “If we’re to address these incentives, we must start with transparency into the actors and their business models, and put in place clear rules that govern decisions and behaviors.”

“Lies are dangerous. Every day, tech and media platforms help increase the number of people in the world who believe that racially-targeted attacks are solutions rather than problems,” said Rashad Robinson, President of Color Of Change. “Whether it’s related to health care, voter suppression, or public policy, bad actors freely target people of color and many others with intentionally harmful mis- and disinformation, at dangerous scale. I have been honored to work with our Commission co-chairs, and experts across many disciplines, to develop recommendations and to call for the type of leadership that can solve these urgent problems.”

The commissioners welcome partnerships to advance this effort. For instance, the Aspen Institute’s Tech Policy Hub, with support from Craig Newmark Philanthropies and ex/ante, is today launching a $75,000 Information Disorder Prize Competition to encourage implementation of the Commission’s recommendations.

On Monday, November 15, at 1 p.m. ET, co-chairs Katie Couric, Chris Krebs, and Rashad Robinson will participate in a live, virtual panel discussion of the final recommendations, and answer questions from attendees. Media and the public are invited to register to attend here.

Read the Final Report detailing the recommendations on the Aspen Institute’s website. Those seeking to learn more about the Commission on Information Disorder are invited to visit AspenInfoCommission.org.

RECOMMENDATIONS

What follows is a high-level overview of the final recommendations of the Aspen Institute’s Commission on Information Disorder.

Recommendations to increase transparency

Public interest research

  1. Implement protections for researchers and journalists who violate platform terms of service by responsibly conducting research on public data of civic interest. 
  2. Require platforms to disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.

High reach content disclosure
Create a legal requirement for all social media platforms to regularly publish the content, source accounts, and reach and impression data for posts that they organically deliver to large audiences.

Content moderation platform disclosure
Require social media platforms to disclose information about their content moderation policies and practices, and produce a time-limited archive of moderated content in a standardized format, available to authorized researchers.

Ad transparency
Require social media companies to regularly disclose, in a standardized format, key information about every digital ad and paid post that runs on their platforms.

Recommendations to build trust

Truth and transformation
Endorse efforts that focus on exposing how historical and current imbalances of power, access, and equity are manufactured and propagated further with mis- and disinformation — and on promoting community-led solutions to forging social bonds.

Healthy digital discourse
Develop and scale communication tools, networks, and platforms that are designed to bridge divides, build empathy, and strengthen trust among communities.

Workforce diversity
Increase investment and transparency to further diversity at social media platform companies and news media as a means to mitigate misinformation arising from uninformed and disconnected centers of power.

Local media investment
Promote substantial, long-term investment in local journalism that informs and empowers citizens, especially in underserved and marginalized communities.

Accountability norms
Promote new norms that create personal and professional consequences within communities and networks for individuals who willfully violate the public trust and use their privilege to harm the public.

Election information security
Improve U.S. election security and restore voter confidence with improved education, transparency, and resiliency.

Recommendations to reduce harms

Comprehensive federal approach
Establish a comprehensive strategic approach to countering disinformation and the spread of misinformation, including a centralized national response strategy, clearly defined roles and responsibilities across the Executive Branch, and identified gaps in authorities and capabilities.

Public Restoration Fund
Create an independent organization with a mandate to develop systemic misinformation countermeasures through education, research, and investment in local institutions.

Civic empowerment
Invest and innovate in online education and platform product features to increase users’ awareness of and resilience to online misinformation.

Superspreader accountability
Hold superspreaders of mis- and disinformation to account with clear, transparent, and consistently applied policies that enable quicker, more decisive actions and penalties, commensurate with their impacts, regardless of location, political views, or role in society.

Amendments to Section 230 of the Communications Decency Act of 1996

  1. Withdraw platform immunity for content that is promoted through paid advertising and post promotion. 
  2. Remove immunity as it relates to the implementation of product features, recommendation engines, and design.

COMMISSIONERS

Below is a list of the members of the Aspen Institute’s Commission on Information Disorder.

  • Marla Blow, President & Chief Operating Officer, Skoll Foundation
  • Katie Couric (co-chair), Journalist and Founder, Katie Couric Media
  • Dr. Aaron D. Ford, Nevada Attorney General
  • Yasmin Green, Director, Research and Development, Jigsaw
  • Prince Harry, The Duke of Sussex, Co-Founder, Archewell
  • Will Hurd, Former Congressman from Texas
  • Jameel Jaffer, Executive Director, Knight First Amendment Institute, Columbia University
  • Chris Krebs, Founding Director, Cybersecurity and Infrastructure Security Agency; Senior Newmark Fellow in Cybersecurity Policy, Aspen Digital; Co-Founder, Krebs Stamos Group
  • Dr. Herb Lin, Senior Research Scholar & Research Fellow, Stanford University
  • Kathryn Murdoch, Co-Founder & President, Quadrivium Foundation
  • Dr. Safiya Umoja Noble, Co-Founder & Co-Director, UCLA Center for Critical Internet Inquiry
  • Rashad Robinson, President, Color Of Change
  • Prof. Deb Roy, Professor, MIT; Director, MIT Center for Constructive Communication; Co-Founder & Chair, Cortico
  • Alex Stamos, Founder, Stanford Internet Observatory
  • Dr. Kate Starbird, Associate Professor, Department of Human Centered Design and Engineering, University of Washington
  • Amanda Zamora, Co-Founder & Publisher, The 19th

###

 

Aspen Digital empowers policy-makers, civic organizations, companies, and the public to be responsible stewards of technology and media in the service of an informed, just, and equitable world. A program of the Aspen Institute, we shine a light on urgent global issues across cybersecurity, the information ecosystem, emerging technology, the industry talent pipeline, tech and communications policy, and innovation. We then turn ideas into action and develop human solutions to these digital challenges.

The Aspen Institute is a global nonprofit organization committed to realizing a free, just, and equitable society. Founded in 1949, the Institute drives change through dialogue, leadership, and action to help solve the most important challenges facing the United States and the world. Headquartered in Washington, DC, the Institute has a campus in Aspen, Colorado, and an international network of partners. For more information, visit www.aspeninstitute.org.
