Information Disorder: A Crisis That Exacerbates All Other Crises

The biggest lie of all, which this crisis thrives on, and which the beneficiaries of mis- and disinformation feed on, is that the crisis itself is uncontainable. One of the corollaries of that mythology is that, in order to fight bad information, all we need is more (and better distributed) good information. In reality, merely elevating truthful content is not nearly enough to change our current course. There is an incentive system in place that manufactures information disorder; we will not address the problem if we do not take on that system, nor will we improve if we fail to address the larger societal issues that continue to divide us.

If we want to reduce information disorder, there are structural changes that we can and must make to our information ecosystem, and there are rules that we can and must implement to better govern the decisions and behavior of information platforms and propagators.

This report is the culmination of an in-depth investigation aimed at better defining the causes and challenges of information disorder, and offering a viable framework for action. We wish to express our profound appreciation for the expertise, insight, and enthusiastic participation of every commissioner, expert, academic, activist, and practitioner who supported our work, and to Craig Newmark Philanthropies, which funded this effort. The Aspen Institute’s Commission on Information Disorder invited voices from across our society to help build upon our understanding of the issues and our approach to recommendations. This included numerous examples of research, original ideas, draft legislation, and critical analysis from academics, policymakers, and activists—all leveraging deep, real-world experience while striving to meet the scale of the challenge.

Each recommendation that follows represents a discrete, actionable idea. Though not all of the recommendations are mutually dependent, they should be considered together—they reinforce and build off one another. For instance, recommendations calling for access and disclosure support those that impose greater accountability for bad actors and, conversely, create a check on overreach.

Our recommendations cover multiple areas: technology, society, government, and media. It is also important to note that, with imperfect information, we make imperfect decisions. Due to the opacity of tech and media platforms—how they operate and how they optimize their products—we do not have sufficient understanding of all the coordinated levers that could reduce societal harms while still allowing for innovation and for both individual and community benefit.

This crisis demands urgent attention and a dedicated response from all parts of society. Every type and level of leader must think seriously about this crisis and their role in it. Each can and should enter this conversation, genuinely listening to the problems and taking real ownership of solutions. Our Commission has aimed to model that process and demonstrate the utility of its outcomes.

We hope that the decision-makers who are ready to take on that challenge will use this framework for action to help reduce information disorder and lessen its destructive role in our world. The Commission hopes its work will spark a new level of leadership and the immediate action that leadership makes possible.

….

Summary of the Recommendations
Recommendations to Increase Transparency
Public interest research (p.32)
1. Implement protections for researchers and journalists who violate platform terms of service by responsibly conducting research on public data of civic interest.
2. Require platforms to disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.

High reach content disclosure (p.35) Create a legal requirement for all social media platforms to regularly publish the content, source accounts, reach, and impression data for posts that they organically deliver to large audiences.

Content moderation platform disclosure (p.37) Require social media platforms to disclose information about their content moderation policies and practices, and produce a time-limited archive of moderated content in a standardized format, available to authorized researchers.

Ad transparency (p.40) Require social media companies to regularly disclose, in a standardized format, key information about every digital ad and paid post that runs on their platforms.

Recommendations to Build Trust
Truth and transformation (p.43) Endorse efforts that focus on exposing how historical and current imbalances of power, access, and equity are manufactured and propagated further with mis- and disinformation—and on promoting community-led solutions for forging social bonds.

Healthy digital discourse (p.46) Develop and scale communication tools, networks, and platforms that are designed to bridge divides, build empathy, and strengthen trust among communities.

Workforce diversity (p.49) Increase investment and transparency to further diversity at social media platform companies and news media as a means to mitigate misinformation arising from uninformed and disconnected centers of power.

Local media investment (p.51) Promote substantial, long-term investment in local journalism that informs and empowers citizens, especially in underserved and marginalized communities.

Accountability norms (p.54) Promote new norms that create personal and professional consequences within communities and networks for individuals who willfully violate the public trust and use their privilege to harm the public.

Recommendations to Reduce Harms
Election information security (p.58) Improve U.S. election security and restore voter confidence through better education, transparency, and resiliency.

Comprehensive federal approach (p.62) Establish a comprehensive strategic approach to countering disinformation and the spread of misinformation, including a centralized national response strategy, clearly defined roles and responsibilities across the Executive Branch, and identified gaps in authorities and capabilities.

Public Restoration Fund (p.64) Create an independent organization with a mandate to develop systemic misinformation countermeasures through education, research, and investment in local institutions.

Civic empowerment (p.66) Invest and innovate in online education and platform product features to increase users’ awareness of and resilience to online misinformation.

Superspreader accountability (p.69) Hold superspreaders of mis- and disinformation to account with clear, transparent, and consistently applied policies that enable quicker, more decisive actions and penalties, commensurate with their impacts—regardless of location, political views, or role in society.

Amendments to Section 230 of the Communications Decency Act of 1996 (p.72)
1. Withdraw platform immunity for content that is promoted through paid advertising and post promotion.
2. Remove immunity as it relates to the implementation of product features, recommendation engines, and design.