Experts Weigh in on Scope of Commission on Information Disorder

April 20, 2021  • Aspen Digital

In an effort to frame the scope and priorities of the Commission on Information Disorder, Aspen Digital surveyed experts across the broader mis- and disinformation field for their suggestions. We’ve collated more than 100 responses to our survey, which asked experts how the government should respond to disinformation, what role tech platforms should play in setting regulation policy, how disinformation plays out beyond the United States, and more. Respondents work in fields including academia, news media, technology, and philanthropy.

Aspen Digital is now sharing the anonymized results with the Commission and the public. 

Trust is viewed as a major hurdle by experts from several fields. A decline in trust in American institutions is both a contributing factor in and a victim of information disorder. One participant from the media industry wrote, “Americans no longer trust media organisations to provide any balance—broadcasters demonstrate bias, while the advent of clickbait journalism has destroyed the revenue models for well-researched print journalism. This creates space for disinformation to thrive.” 

In 2019, a Pew Research study showed that the public’s trust in the federal government, the news media, and their personal circles has decreased. Prior to the pandemic, a majority of respondents said they trusted medical doctors, researchers, and scientists, but trust in scientists and medical researchers declined. In an essay published in 2020, Richard Edelman, CEO of the communications firm Edelman, said “every institution must play its part in restoring facts to their rightful place at the center of public discourse as the essential step to emerging from information bankruptcy.”

An expert with a background in library science listed their biggest issue as “the decline of education in America, which has led to a decline in critical thought.” Education and media literacy were mentioned nearly a dozen times by respondents, underlining the variety of concerns the Commission will need to consider as it builds out its scope of focus.

While “government” was the field experts mentioned most often when asked which is best positioned to address information disorder, most responses focused on the challenges government authorities, such as the Federal Communications Commission, face and on the lack of effective regulatory structures in place. One respondent called for “a multifaceted approach led by government and election officials,” while a journalism expert believes the reintroduction of the Fairness Doctrine could incentivize news and social media to “check their facts.”

Informed opinions regarding the pivotal role social media platforms and Big Tech play in both the spread of and response to online disinformation are complicated across fields of expertise. “An organization that is too big to fail is too big to exist. Their scale represents a failure of governance,” shared a nonprofit respondent when asked about the tech platforms’ role in policy setting and the viability of self-governance. “The question implies that a corporate entity could have influence in policy that determines the quality of a basic human right, to live in a healthy democracy that provides fair and representative suffrage,” the comment continued. “The possibility that a seat at the table for these companies could have preferment over the basic rights of the people is absurd and offensive.”

Subramaniam Vincent, the director of the Journalism and Media Ethics program at the Markkula Center for Applied Ethics, wrote for Georgetown University’s Berkley Center for Religion, Peace and World Affairs following the January 6 Capitol attack: “by their very design, social media platforms have for a long time offered equal opportunity to sellers of ‘Big Lies,’ conspiracy theories, and political disinformation.” Social media can present opportunities to connect people as well as divide them; important questions remain about whether platforms can regulate themselves when their spaces become disinformation hubs.

Many experts who said platforms “can self-govern” nonetheless expressed skepticism in their answers that self-governance would be effective, citing either a need for regulatory oversight or a government-led incentive to drive effective policies. One media expert said of tech regulatory policy: “…given past performance — and their ubiquity — there is a clear need for some amount of government oversight and accountability.” The Commission will consider questions regarding transparency, responsibility, and accountability that have arisen from platform-led and independent research.

Information disorder pollutes our information ecosystem and poses a significant and growing threat to everything from public health to community cohesion to democracy itself. The risks to groups that already face discrimination are particularly acute. Framing the scope of the Commission’s work, to be completed within its first 60 days, is its first task and will be one of its largest challenges. The insights and recommendations provided by these experts have been critical in setting the stage.