
Recommendations to the Aspen Institute Commission on Information Disorder

On August 19, 2021, the Brennan Center submitted key recommendations for combating disinformation to the Aspen Institute Commission on Information Disorder.

Published: August 20, 2021


The Brennan Center for Justice at NYU School of Law is a nonpartisan law and policy institute that works to reform, revitalize, and defend our country’s systems of democracy and justice. The Brennan Center is dedicated to protecting the rule of law and the values of constitutional democracy. We focus on voting rights, campaign finance reform, ending mass incarceration, and preserving our liberties while also maintaining our national security. We have longstanding expertise in crafting policies to make elections free, fair, and accessible, and have been studying deceptive election influence operations for several years. This work includes disinformation about how to vote and other election processes intended to suppress votes or change election outcomes; messages from political actors, some foreign, trying to influence elections while hiding their identities; and the successes and failures of media platforms in addressing these challenges. We also pay close attention to how laws and policies affect marginalized communities and dissent.

We appreciate the opportunity to participate in the important work of the Aspen Institute’s Commission on Information Disorder. Here, we discuss key recommendations as relevant to each of the Commission’s priorities.

1. Reducing Harms

Some of the most egregious forms of disinformation should be addressed at the source. For instance, bad actors’ use of disinformation to interfere with the right to vote or to threaten and intimidate election workers should not be allowed at all, and reforms can reduce the incentives for some spreaders of disinformation.

Lies about voting that are designed to trick people out of exercising the franchise, like “Text # to vote,” threaten the right to vote and disproportionately target communities of color. The federal Deceptive Practices and Voter Intimidation Prevention Act would ban these lies and dedicate law enforcement resources to protecting voters.[1] The Deceptive Practices Act is included in the omnibus democracy reform bill, the For the People Act (H.R.1 / S.1), passed by the U.S. House of Representatives this year.

Much of the disinformation surrounding the 2020 election was generated by political actors using “Stop the Steal” and similar election-fraud narratives. Trump and others sent fundraising solicitations implying that donations would go to legal challenges to the election results, when in reality the money would largely be dedicated to future political campaigns. Campaign finance laws should restrict these types of appeals by requiring, for instance, that funds raised ostensibly for legal fees to challenge results be used only for legal challenges.[2]

Party-appointed monitors of ballot counts were a significant source of false information about the process in 2020, some of it spread inadvertently. States should pass laws requiring these monitors to be trained on election processes, so they understand what they are observing, and held accountable for their behavior, so that they do not disrupt vote counts and other election work.[3]

Social media platforms have vast power to help slow the spread of disinformation but have not done enough. Disinformation about elections is particularly harmful, infringing on voting rights, undermining faith in elections, and threatening the foundations of our democracy. Thanks largely to lies spread by figures with large followings, election officials are increasingly being threatened for doing their jobs. One in three fear for their safety.[4] The worst spreaders of dangerous election disinformation are readily identifiable. If social media companies are not willing to take down those accounts, one intermediate sanction would be to delay publication of all posts by accounts that have violated clearly articulated rules and been warned by the platform.[5] This “holding area” strategy would disrupt the virality of certain claims and give platform fact-checkers the chance to block some lies before they are published.

In addition, platforms should moderate content from public figures in accord with their power to influence people: more strictly than the average user, not less, as Facebook and others currently do.[6] And social media platforms should track disinformation and actively send corrective information to users who interacted with false messages about how to vote and other election processes.[7]

End Notes

1. Ian Vandewalker, Digital Disinformation and Vote Suppression, Brennan Center for Justice, 2020, https://www.brennancenter.org/our-work/research-reports/digital-disinformation-and-vote-suppression.
2. Election Officials Under Attack, Brennan Center for Justice & Bipartisan Policy Center, 2021, 14, https://www.brennancenter.org/our-work/policy-solutions/election-officials-under-attack.
3. Id. at 13–14.
4. Id. at 6.
5. Id. at 12.
6. Ángel Díaz and Laura Hecht-Felella, Double Standards in Social Media Content Moderation, Brennan Center for Justice, 2021, 24–25, https://www.brennancenter.org/our-work/research-reports/double-standards-social-media-content-moderation.
7. Election Officials Under Attack, 12.

2. Increasing Transparency and Understanding

Researchers, civil society, and journalists need access to data from social media platforms and other internet companies about disinformation and the ways it spreads, to inform policy design as well as individual consumer choices. We have proposed that Congress establish a federal commission representing a broad set of stakeholders to investigate how best to facilitate the disclosure of platform data to enable independent research and auditing, protect privacy, and establish whistleblower protections.[1]

Another transparency mechanism that would help industry and government address disinformation is robust participation in an information sharing and analysis center or organization (ISAC or ISAO).[2] Such a collaborative entity could collect and analyze aggregate data about emerging threats for immediate action and also study trends over a longer time frame to inform policy, while protecting the privacy of individual data.

Most immediately, the platforms should improve transparency around their content moderation policies by publishing clear rules in one place, and they should share data about how those policies are enforced.[3]

End Notes

1. Díaz and Hecht-Felella, Double Standards, 22–23.
2. Election Officials Under Attack, 13.
3. Díaz and Hecht-Felella, Double Standards, 21.

3. Building Trust

Bad actors have had enormous success disseminating election disinformation, while many truth-tellers struggle to find an audience. Social media platforms should amplify authoritative voices on election matters as much as possible. We have proposed, and are working to implement, an innovative solution: an authoritative government directory of election officials nationwide that platforms can use to push out accurate messages from knowledgeable officials.[1] Officials listed in the directory, for example, could get free advertising credits or be featured in labels on false posts and in voter information products.

Of course, all levels of government should inform the public about voting and election processes to improve resilience against disinformation. We have proposed several avenues, including disinformation navigators to assist resource-starved local officials, rumor control pages to debunk and pre-bunk trending myths, and civics and media literacy programs.[2]

Finally, traditional measures of election transparency are important. Democracy functions best when voters know who is trying to influence them so they can evaluate how much they trust the messages they encounter. The public should know whether attempts to influence elections are coming from foreign agents, partisan operatives, or special interests protecting their bottom line. To that end, secret spending by “dark money” groups should be eliminated, foreign election spending should be restricted, and other forms of political activity by foreign actors should be fully transparent.[3]

End Notes

1. Election Officials Under Attack, 11–12.
2. Election Officials Under Attack, 13–15; Vandewalker, Digital Disinformation.
3. Limiting Foreign Meddling in U.S. Campaigns, Brennan Center for Justice, 2019, https://www.brennancenter.org/our-work/analysis-opinion/limiting-foreign-meddling-us-campaigns.