Recommendations to the Aspen Institute Commission on Information Disorder

On August 19, the Brennan Center submitted key recommendations to combat disinformation to the Aspen Institute Commission on Information Disorder.

Published: August 20, 2021

The Brennan Center for Justice at NYU School of Law is a nonpartisan law and policy institute that works to reform, revitalize, and defend our country’s systems of democracy and justice. The Brennan Center is dedicated to protecting the rule of law and the values of constitutional democracy. We focus on voting rights, campaign finance reform, ending mass incarceration, and preserving our liberties while also maintaining our national security. We have longstanding expertise in crafting policies to make elections free, fair, and accessible, and have been studying deceptive election influence operations for several years. This includes disinformation about how to vote and other election processes intended to suppress votes or change election outcomes; messages from political actors, some foreign, trying to influence elections while hiding their identities; and the successes and failures of media platforms in addressing these challenges. We also pay close attention to how laws and policies impact marginalized communities and dissent.

We appreciate the opportunity to participate in the important work of the Aspen Institute’s Commission on Information Disorder. Here, we discuss key recommendations as relevant to each of the Commission’s priorities.

1. Reducing Harms

Some of the most egregious forms of disinformation should be addressed at the source. For instance, certain ways that bad actors use disinformation to interfere with the right to vote or threaten and intimidate election workers should not be allowed at all, and reforms can reduce incentives for some spreaders of disinformation.

Lies about voting that are designed to trick people out of using the franchise, like “Text # to vote,” threaten the right to vote and disproportionately target communities of color. The federal Deceptive Practices and Voter Intimidation Prevention Act would ban these lies and dedicate law enforcement resources to protecting voters. (See Ian Vandewalker, Digital Disinformation and Vote Suppression, Brennan Center for Justice, 2020.) The Deceptive Practices Act is included in the omnibus democracy reform bill, the For the People Act (H.R. 1 / S. 1), passed by the U.S. House of Representatives this year.

Much of the disinformation surrounding the 2020 election was generated by political actors using “Stop the Steal” and similar election-fraud narratives. Trump and others sent fundraising solicitations implying that donations would go to legal challenges to the election results, when in reality the money would largely be dedicated to future political campaigns. Campaign finance laws should restrict these types of appeals, requiring, for instance, that funds raised ostensibly for legal fees to challenge election results be used only for legal challenges. (See Election Officials Under Attack, Brennan Center for Justice & Bipartisan Policy Center, 2021, 14.)

Party-appointed monitors of ballot counts were a significant source of false information about the process in 2020, some of it spread inadvertently. States should pass laws requiring these monitors to be trained in election processes, so they know what they are observing, and held accountable for their behavior, so they do not disrupt vote counts and other election work. (Id. at 13–14.)

Social media platforms have vast power to help slow the spread of disinformation but have not done enough. Disinformation about elections is particularly harmful: it infringes on voting rights, undermines faith in elections, and threatens the foundations of our democracy. Thanks largely to lies spread by figures with large followings, election officials are increasingly being threatened for doing their jobs; one in three fear for their safety. (Id. at 6.) The worst spreaders of dangerous election disinformation are readily identifiable. If social media companies are not willing to take down those accounts, one intermediate sanction would be to delay publication of all posts by accounts that have violated clearly articulated rules and been warned by the platform. (Id. at 12.) This “holding area” strategy would disrupt the virality of certain claims and give platform fact-checkers the chance to block some lies before they are published.

In addition, platforms should moderate content from public figures in accord with their power to influence people, meaning more strictly than the average user, not less, as Facebook and others currently do. (See Ángel Díaz and Laura Hecht-Felella, Double Standards in Social Media Content Moderation, Brennan Center for Justice, 2021, 24–25.) And social media platforms should track disinformation and actively send corrective information to users who interacted with false messages about how to vote and other election processes. (Election Officials Under Attack, 12.)

2. Increasing Transparency and Understanding

Researchers, civil society, and journalists need access to data from social media platforms and other internet companies about disinformation and the ways it spreads, to inform policy design as well as individual consumer choices. We have proposed that Congress establish a federal commission representing a broad set of stakeholders to investigate how to best facilitate the disclosure of platform data to enable independent research and auditing, protect privacy, and establish whistleblower protections. (Díaz and Hecht-Felella, Double Standards, 22–23.)

Another mechanism of transparency that would help the industry and government address disinformation is robust participation in an information sharing and analysis center or organization (ISAC or ISAO). (Election Officials Under Attack, 13.) Such a collaborative entity could collect and analyze aggregate data about emerging threats for immediate action and also study trends over a longer time frame to inform policy, while protecting the privacy of individual data.

Most immediately, the platforms should improve transparency around their content moderation policies by publishing clear rules in one place, and they should share data about how those policies are enforced. (Díaz and Hecht-Felella, Double Standards, 21.)

3. Building Trust

Bad actors have had enormous success disseminating election disinformation, while many truth-tellers struggle to find an audience. Social media platforms should amplify authoritative voices on election matters as much as possible. We have proposed and are working to implement an innovative solution: an authoritative government directory of election officials nationwide that platforms can use to push out accurate messages from knowledgeable officials. (Election Officials Under Attack, 11–12.) Officials listed in the directory, for example, could receive free advertising credits or be featured in labels on false posts and in voter information products.

Of course, all levels of government should be informing the public about voting and election processes to improve resilience against disinformation. We have proposed several avenues, including disinformation navigators to assist resource-starved local officials, rumor control pages to debunk and pre-bunk trending myths, and civics and media literacy programs, among others. (Election Officials Under Attack, 13–15; Vandewalker, Digital Disinformation.)

Finally, traditional measures of election transparency are important. Democracy functions best when voters know who is trying to influence them, so they can evaluate how much to trust the messages they encounter. The public should know whether attempts to influence elections are coming from foreign agents, partisan operatives, or special interests protecting their bottom line. To that end, secret spending by “dark money” groups should be eliminated, foreign election spending should be restricted, and other forms of political activity by foreign actors should be fully transparent. (Limiting Foreign Meddling in U.S. Campaigns, Brennan Center for Justice, 2019.)
