Report

Digital Disinformation and Vote Suppression

Summary: Election officials, internet companies, the federal government, and the public must act to defend the 2020 elections against digital disinformation attacks designed to suppress the vote.

Published: September 2, 2020

Executive Summary and Recommendations

U.S. elections face extreme pressure in 2020. The Covid-19 crisis has created new challenges for election officials and pushed them to make last-minute changes to the voting process, typically with resources that were already stretched thin. Pandemic-related voting changes have become an election issue themselves, with political actors sowing confusion for the benefit of their party. Bad actors have circulated lies to trick certain groups out of voting — and thanks to social media, these deceptive practices can instantly reach huge numbers of people. Experts warn that foreign powers have learned from Russia’s 2016 election interference efforts and will try to covertly influence the American electorate this year.

State and local election officials play a crucial role in defending U.S. elections against these threats and in protecting American voters from disenfranchisement due to disinformation. Internet companies and members of the public can also take action against deceptive practices, voter intimidation, and other forms of digital vote suppression. In all cases, accurate information from trusted official sources provides the best antidote to disinformation about voting.

Summary Recommendations

Election officials should:

  1. Develop plans and procedures to publicize corrective information. Make written plans to push out correct information without repeating falsehoods. Establish channels of communication with the public and with key actors like community groups, candidates, and the media.
  2. Publicize official sources of accurate information to build public trust. Disseminate information on well-publicized sources like websites, emails, advertising, and social media accounts that are active and verified by the platform.
  3. Protect official sources from hacking and manipulation. Secure official websites and social media accounts from being used to trick voters by implementing cybersecurity best practices like tight access controls, multifactor authentication, and anti-phishing procedures.
  4. Monitor for disinformation. Actively watch for falsehoods about elections, set up ways for the public to report instances of digital disinformation, work with internet companies, and participate in information-sharing networks.
  5. Build relationships with communities and media. Perform early public outreach to communities, including in appropriate languages, to facilitate communication before an incident occurs. Build relationships with local and ethnic media.

Internet companies should:

  1. Proactively provide information about how to vote.
  2. Maintain clear channels for reporting disinformation.
  3. Take down false information about voting but preserve the data.
  4. Protect official accounts and websites.
  5. Push corrective information to specific users affected by disinformation.

The federal government should:

  1. Enact the Deceptive Practices and Voter Intimidation Prevention Act.
  2. Share intelligence about incidents of disinformation and help disseminate correct information.

Introduction

The Covid-19 pandemic has created unprecedented challenges for election administration. Many state and local officials have had to completely change their plans in order to address new obstacles like stay-at-home orders, exponentially higher demand for absentee ballots, and the potential shortage of poll workers, who tend to be older and thus are at higher risk for severe illness from the virus. Elections that have already taken place in 2020 have seen massive changes, including drastic increases in voting by mail or absentee ballot. Due to the pandemic, some officials have made significant last-minute changes to the voting process. In some cases, rules have changed back and forth even in the final hours before Election Day, such as when courts have intervened to block changes. [footnote 1: Jim Rutenberg and Nick Corasaniti, “How a Supreme Court Decision Curtailed the Right to Vote in Wisconsin,” New York Times, April 13, 2020, https://www.nytimes.com/2020/04/13/us/wisconsin-election-voting-rights.html; and Allan Smith, “Ohio Primary Called Off at Last Minute Because of Health Emergency,” NBC News, March 16, 2020, https://www.nbcnews.com/politics/2020-election/ohio-governor-calls-state-postpone-tuesday-s-primary-elections-n1160816.]

This dynamic, even chaotic, environment has enormous potential to create confusion among voters. Key voting information — including election dates, polling locations, and mail-in voting rules — is suddenly subject to change. Voters may not learn about such changes in time to comply, or they may receive conflicting information and not know which sources to believe.

These factors leave voters more vulnerable to bad actors who use deceptive practices to spread false information in an attempt to trick people out of voting. In the United States, there is a long history of using such practices to keep certain voters away from the polls. And in recent years, the internet and social media platforms have increased the threat of vote suppression. For example, a deceptive tweet can reach millions of readers in a matter of minutes. [footnote 2: More Americans get news on social media than from print newspapers. In 2018, one in five adults said they often get news on social media. A. W. Geiger, “Key Findings About the Online News Landscape in America,” Pew Research Center, September 11, 2019, https://www.pewresearch.org/fact-tank/2019/09/11/key-findings-about-the-online-news-landscape-in-america/.] Because of stay-at-home orders and quarantines due to Covid-19, voters are all the more dependent on online information. And even before the pandemic, national security experts warned repeatedly that foreign powers plan to attack American elections. [footnote 3: See, e.g., Ken Dilanian, “U.S. Intel Agencies: Russia and China Plotting to Interfere in 2020 Election,” NBC News, January 29, 2019, https://www.nbcnews.com/politics/national-security/u-s-intel-agencies-russia-china-plotting-interfere-2020-election-n963896 (“U.S. intelligence agencies assess that Russia and China will seek to interfere in the 2020 presidential election . . . .”). There is already evidence of the warning’s accuracy. Shane Harris and Ellen Nakashima, “With a Mix of Covert Disinformation and Blatant Propaganda, Foreign Adversaries Bear Down on Final Phase of Presidential Campaign,” Washington Post, August 21, 2020, https://www.washingtonpost.com/national-security/with-a-mix-of-covert-disinformation-and-blatant-propaganda-foreign-adversaries-bear-down-on-final-phase-of-presidential-campaign/2020/08/20/57997b7a-dbf1-11ea-8051-d5f887d73381_story.html.]

As a result, the risk of voter disenfranchisement due to disinformation — lies spread for a political purpose — is perhaps higher in 2020 than ever before.

The United States must build systems now to defend the electorate from disinformation about voting procedures. The country cannot afford to wait until incidents unfold. The officials who run American elections are aware of the threats and have been working to safeguard their systems and keep voters informed. Many agencies need more resources, however, and every government must decide where to focus the resources it has.

This report builds on election officials’ efforts so far, along with recommendations from security and communications experts, to lay out the crucial steps for shielding voters against deceptive voter suppression and for preparing fast responses to disinformation attacks.

Deceptive Practices

There is a multitude of stories about attempts to trick certain people out of voting. These deceptive practices have often involved the use of flyers, mailers, and robocalls. [footnote 4: See, e.g., Wendy Weiser and Vishal Agraharkar, Ballot Security and Voter Suppression: What It Is and What the Law Says, Brennan Center for Justice, 2012, 9, https://www.brennancenter.org/sites/default/files/2019-08/Report_Ballot_Security_Voter_Suppression.pdf.] For example, during Texas’s Super Tuesday primary in March 2020, robocalls falsely told voters that they could vote “tomorrow.” [footnote 5: Alia Slisco, “Robocalls Spreading Super Tuesday Misinformation Throughout Texas,” Newsweek, March 3, 2020, https://www.newsweek.com/robocalls-spreading-super-tuesday-misinformation-throughout-texas-1490368.] Similarly, in 2004, flyers distributed in Franklin County, Ohio, falsely told voters that Republicans should vote on Tuesday and Democrats on Wednesday due to high levels of voter registration. [footnote 6: Wendy Weiser and Adam Gitlin, Dangers of “Ballot Security” Operations: Preventing Intimidation, Discrimination, and Disruption, Brennan Center for Justice, 2016, 6, https://www.brennancenter.org/our-work/research-reports/dangers-ballot-security-operations-preventing-intimidation-discrimination.] A related tactic is to intimidate voters with false reports of law enforcement presence, immigration enforcement actions, or election monitoring by armed individuals. [footnote 7: “In 2008 in Philadelphia, flyers posted near Drexel University incorrectly warned that police officers would be at polling places looking for individuals with outstanding arrest warrants or parking tickets.” Weiser and Gitlin, Dangers of “Ballot Security” Operations, 6.]

These voter suppression tactics frequently target historically disenfranchised communities, including communities of color, low-income communities, and immigrant communities. [footnote 8: Common Cause and the Lawyers’ Committee for Civil Rights Under Law, Deceptive Election Practices and Voter Intimidation: The Need for Voter Protection, 2012, 4, https://lawyerscommittee.org/wp-content/uploads/2015/07/DeceptivePracticesReportJuly2012FINALpdf.pdf.] For example, during Alabama’s U.S. Senate special election in 2017, residents of Jefferson County — where the largest city, Birmingham, is predominantly African American — received text messages with false information about polling site changes. [footnote 9: Sean Morales-Doyle and Sidni Frederick, “Intentionally Deceiving Voters Should Be a Crime,” The Hill, August 8, 2018, https://thehill.com/opinion/civil-rights/400941-intentionally-deceiving-voters-should-be-a-crime.] And on Election Day in 2010, Maryland gubernatorial candidate Bob Ehrlich’s campaign manager targeted African American households with robocalls claiming that Governor Martin O’Malley had already been reelected, implying that his supporters could stay home instead of voting. [footnote 10: Weiser and Agraharkar, Ballot Security and Voter Suppression, 9; and Luke Broadwater, “Prosecutors: GOP ‘Robocall’ Plan to Suppress Black Votes Hatched on Hectic Election Day,” Baltimore Sun, November 29, 2011, https://www.baltimoresun.com/politics/bs-md-shurick-trial-20111129-story.html.]

Deceptive election practices are most commonly deployed in the final days before an election, presumably because they are most effective when there is no time for rebuttal before voting begins. As a result, the scale and scope of voter suppression tactics for the 2020 election remain unknown, although recent history suggests disinformation will be a significant problem.

New Dangers Online

While dirty tricks in elections are an old phenomenon, in the 21st century deceptive practices have become more dangerous than ever before. The continued growth of the internet and social media platforms has made it easier and more affordable to reach huge numbers of people instantaneously and anonymously. Traditionally, deceptive practices involved narrow targeting by geography, such as with flyers on telephone poles in certain neighborhoods. Now, however, bad actors can use sophisticated microtargeting to surgically focus on certain demographics, and they can direct disinformation either toward disrupting a specific local election or toward a national audience.

Amid rising polarization and mistrust of institutions in recent years, the Cold War–era concept of “disinformation” — the intentional spread of false information — has regained currency in American politics. [footnote 11: Caroline Jack, Lexicon of Lies: Terms for Problematic Information, Data & Society Research Institute, 2017, 3, https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf. Disinformation is often distinguished from misinformation, which is false information that is spread without bad intent. Although this report focuses on disinformation, the key remedy — disseminating correct information — is the same regardless of the intent of those who spread false information.]

In recent U.S. elections, bad actors, both foreign and domestic, have attempted to stop certain people from voting by spreading false information online. Leading up to the 2016 U.S. presidential election, operatives of the Internet Research Agency, a Russian company tied to President Vladimir Putin, engaged in voter suppression by posing as Americans and posting messages and ads on social media. They used deceptive practices like directing people to vote by text, which is not a valid option anywhere in the United States. Operatives also targeted African Americans with messages recommending election boycotts or votes for third-party candidates. [footnote 12: Renee DiResta et al., The Tactics & Tropes of the Internet Research Agency, New Knowledge, 2018, 8, 85, https://disinformationreport.blob.core.windows.net/disinformation-report/NewKnowledge-Disinformation-Report-Whitepaper.pdf; Kurt Wagner, “These Are Some of the Tweets and Facebook Ads Russia Used to Try and Influence the 2016 Presidential Election,” Vox, October 31, 2017, https://www.vox.com/2017/10/31/16587174/fake-ads-news-propaganda-congress-facebook-twitter-google-tech-hearing (including examples recommending that Clinton voters vote by text); and Joseph Bernstein, “Inside 4chan’s Election Day Mayhem and Misinformation Playbook,” Buzzfeed, November 7, 2016, https://www.buzzfeednews.com/article/josephbernstein/inside-4chans-election-day-mayhem-and-misinformation-playboo#.hsq0K9jOjM (same).]

In 2018, many social media accounts posted false voting information, including instructions to vote by text and claims that voters of one party were required to vote the day after Election Day. [footnote 13: Young Mie Kim, “Voter Suppression Has Gone Digital,” Brennan Center for Justice, November 20, 2018, https://www.brennancenter.org/our-work/analysis-opinion/voter-suppression-has-gone-digital.] A candidate paid for a Facebook ad that falsely suggested that Kansans would need documentary proof of citizenship in order to register to vote. [footnote 14: Kim, “Voter Suppression Has Gone Digital.”] Other messages conveyed threats about people bringing guns to the polls or law enforcement rounding people up at polling places. [footnote 15: Kim, “Voter Suppression Has Gone Digital.”] Just days before the 2018 election, U.S. Immigration and Customs Enforcement had to publicly address rumors that had spread via flyers and social media by announcing that it would not, in fact, conduct enforcement operations at polling places. [footnote 16: Center for the Advancement of Public Integrity, Prosecuting Vote Suppression by Misinformation, 2019, 1, https://web.law.columbia.edu/public-integrity/Prosecuting-Vote-Suppression-By-Misinformation.]

Digital disinformation about voting has spread online during the 2020 presidential primaries, even as many voters have relied more than ever on the internet for election information due to public health directives to stay home amid Covid-19. On Super Tuesday, there were instances of disinformation across several states — for example, one tweet targeting supporters of Kentucky gubernatorial candidate Matt Bevin read, “Bevin supporters do not forget to vote on Wednesday, November 6th!” [footnote 17: Election Protection monitored social media for voting disinformation and found many messages directing certain voters to vote on Wednesday. For a preserved screenshot of the tweet quoted in the text, see https://monosnap.com/file/fgCcA3nsv4rdk1IkUIC8xB60TQWcR0.] On the day of an August congressional primary in Florida, some voters received texts linking to a YouTube video falsely presented as a candidate’s announcement that he had dropped out. [footnote 18: Alex Marquardt and Paul P. Murphy, “Fake Texts and YouTube Video Spread Disinformation about Republican Primary Candidate on Election Day,” CNN, August 18, 2020, https://www.cnn.com/2020/08/18/politics/byron-donalds-fake-texts-florida-republican-primary/index.html.]

Bad actors are spreading lies about Covid-19 to try to stop people from voting. For example, on Super Tuesday, some circulated messages on Twitter like this one: “warning that everyone over age 60 that #coronavirus has been reported at ALL polling locations for #SuperTuesday.” [footnote 19: Adam Rawnsley, “Sick: Trolls Exploit Coronavirus Fears for Election Fun,” Daily Beast, March 3, 2020, https://www.thedailybeast.com/trolls-exploit-coronavirus-fears-for-super-tuesday-fun.]

Foreign states have attempted to influence the 2020 elections as well, including through attacks intended to discourage voters from supporting specific front-runner candidates. [footnote 20: Young Mie Kim, “New Evidence Shows How Russia’s Election Interference Has Gotten More Brazen,” Brennan Center for Justice, March 5, 2020, https://www.brennancenter.org/our-work/analysis-opinion/new-evidence-shows-how-russias-election-interference-has-gotten-more.] They have also circulated stories pushing the narrative that American elections are rigged. [footnote 21: Samantha Lai, “Russia’s Narratives about U.S. Election Integrity in 2020,” Foreign Policy Research Institute, May 25, 2020, https://www.fpri.org/fie/russia-election-integrity-2020/; Samantha Lai, “Iran’s Narratives about U.S. Election Integrity in 2020,” Foreign Policy Research Institute, June 19, 2020, https://www.fpri.org/fie/iran-election-integrity-in-2020/.] In late 2019, the FBI and Department of Homeland Security (DHS) warned state election officials that Russia could try to interfere in the election in 2020 by discouraging voter turnout. [footnote 22: Kevin Collier, “Russia Likely to Focus on Voter Suppression in 2020, Feds Warn States,” CNN, October 13, 2019, https://www.cnn.com/2019/10/03/politics/russia-voter-suppression-warning/index.html.]

Legal Remedies

People who attempt to suppress the vote by spreading false information may be violating several federal and state laws. [footnote 23: See, e.g., 42 U.S.C. § 1985(3) (providing cause of action if “two or more persons . . . conspire . . . for the purpose of depriving, either directly or indirectly, any person or class of persons of the equal protection of the laws, or of equal privileges and immunities under the laws”).] Many states prohibit various forms of voter intimidation and election interference and impose criminal or civil penalties. [footnote 24: See generally Weiser and Gitlin, Dangers of “Ballot Security” Operations, 1–4. For instance, Virginia imposes criminal penalties for knowingly communicating false election information to a registered voter: “It shall be unlawful for any person to communicate to a registered voter, by any means, false information, knowing the same to be false, intended to impede the voter in the exercise of his right to vote.” Va. Code Ann. § 24.2-1005.1(A). Wisconsin similarly prohibits “false representation[s] pertaining to a candidate or referendum.” Wis. Stat. § 12.05.] States have criminally prosecuted operatives involved in spreading disinformation about voting. For example, the campaign manager and a political consultant for 2010 Maryland gubernatorial candidate Bob Ehrlich — both of whom were involved in robocalls designed to suppress Black votes — were convicted of offenses including fraud and failing to identify the source of the calls. [footnote 25: Gubernatorial candidate Bob Ehrlich’s campaign manager was convicted of election fraud and failing to identify the source of the calls. John Wagner, “Ex-Ehrlich Campaign Manager Schurick Convicted in Robocall Case,” Washington Post, December 6, 2011, https://www.washingtonpost.com/local/dc-politics/ex-ehrlich-campaign-manager-schurick-convicted-in-robocall-case/2011/12/06/gIQA6rNsaO_story.html. A consultant for the campaign was convicted of failing to identify the source of the calls. Luke Broadwater, “Julius Henson Sentenced to Jail in ‘Robocall’ Case,” Baltimore Sun, June 13, 2012, https://www.baltimoresun.com/news/bs-xpm-2012-06-13-bs-md-henson-sentencing-20120613-story.html.]

These existing laws are important but insufficient. To be sure, people who spread disinformation should be held accountable, and enforcement should serve as a deterrent to future misconduct. But litigation and similar measures happen after the election, and therefore after any damage to the franchise has already been done. Additionally, existing laws against deceptive practices differ in breadth, and enforcement can be irregular. [footnote 26: Common Cause and the Lawyers’ Committee for Civil Rights Under Law, Deceptive Election Practices, 2–3.] Below, this paper recommends federal legislation that would expressly prohibit deceptive practices and provide for clear sanctions and corrective action. [footnote 27: Deceptive Practices and Voter Intimidation Prevention Act of 2019, H.R. 3281, 116th Cong. (2019), included in the For the People Act of 2019, H.R. 1, 116th Cong. (2019).]

Regardless of whether legal reforms are enacted, there are concrete actions that state and local election officials, internet companies, members of Congress, and ordinary citizens can take today that will help protect voters from the effects of deceptive practices, voter intimidation, and other forms of digital vote suppression.


Recommendations for Election Officials

State and local election officials are the most important actors in addressing the use of disinformation to suppress votes. They must establish themselves as trusted sources of accurate information for voters in their communities. The most urgent response to disinformation aimed at suppressing the vote is to disseminate correct information, as discussed immediately below. But officials cannot wait for incidents to occur; this paper describes crucial preparations that should be made long in advance.

1. Develop plans and procedures to publicize corrective information

The most important way to respond to disinformation is to correct it with the truth without helping to spread the lie, a task that requires long-term planning and infrastructure building. It is too late to formulate a plan on Election Day, or even during early voting periods. Instead, agencies should formulate written procedures that cover what to do and who should do it when deceptive practices are discovered. [footnote 1: Election officials should “[d]raft, review, and approve a communications plan prior to negative developments.” Edgardo Cortés et al., Preparing for Cyberattacks and Technical Problems during the Pandemic: A Guide for Election Officials, Brennan Center for Justice, 2020, 22, https://www.brennancenter.org/our-work/research-reports/preparing-cyberattacks-and-technical-problems-during-pandemic-guide.] Relevant staff members should receive checklists and training that includes simulated incidents or tabletop exercises. [footnote 2: Advice on and examples of checklists can be found in related election infrastructure resilience guidance. Brennan Center for Justice, “Preparing for Cyberattacks and Technical Problems During the Pandemic: A Checklist for Election Officials,” 2020, https://www.brennancenter.org/sites/default/files/2020-06/2020_06_PreparingforAttack_Checklist.pdf; and Belfer Center for Science and International Affairs, Election Cyber Incident Communications Plan Template, 2018, https://www.belfercenter.org/sites/default/files/files/publication/CommunicationsTemplate.pdf.]

The DHS Cybersecurity and Infrastructure Security Agency (CISA) offers standardized incident response plans, trainings, and simulated disinformation scenarios to assist state and local agencies. [footnote 3: U.S. Department of Homeland Security (hereinafter DHS), Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 2020, 12, https://www.cisa.gov/sites/default/files/publications/ESI%20Strategic%20Plan_FINAL%202.7.20%20508.pdf.] The Belfer Center for Science and International Affairs has outlined best practices for communicating with the public to counter false information. [footnote 4: Belfer Center, Election Cyber Incident Communications Plan Template, 11.]

Establish channels of communication with key actors

The goal of contingency plans is to prepare officials to correct false information with the truth as effectively as possible, without helping to spread lies. When officials find instances of disinformation about voting, they should proactively distribute accurate information to all the appropriate actors and channels, including:

  • election officials’ websites, social media accounts, and other channels
  • individuals staffing phone lines and reception areas at election offices
  • information-sharing networks like the voter assistance coalition Election Protection
  • media outlets that serve affected communities, including those in languages other than English
  • community groups and faith leaders
  • parties and candidates running in the relevant election

Officials should establish these channels of communication long before Election Day. Relationships must be built and maintained over time, not when an emergency is already underway. Establishing lines of communication early allows election officials to include points of contact in their plans. That way, when disinformation is detected, officials can simply go down the list of contacts and share corrections with the relevant individuals and groups. In addition, ongoing communication helps inform political actors of whom to contact to report incidents. Community groups, for example, may be the first to discover disinformation about voting, and information will be shared most efficiently if they know the right election official to report it to.

Election agencies should consider naming an individual official or a small team to lead information collection, which includes processing reports that are made to an official email address, phone number, or social media account. [footnote 5: “Every organization, political campaign, activists and those using these platforms for outreach should have a focused disinformation arm.” Stop Online Violence Against Women, A Threat to an American Democracy: Digital Voter Suppression, 2020, 21, http://stoponlinevaw.com/wp-content/uploads/2020/02/7.pdf.] The same official or team should also monitor social media more broadly for false content or receive direct reports from staff or vendors who are monitoring digital content. [footnote 6: Belfer Center for Science and International Affairs, The State and Local Election Cybersecurity Playbook, 2018, 18, https://www.belfercenter.org/sites/default/files/files/publication/StateLocalPlaybook%201.1.pdf.] And a single authority should be in charge of sharing information about incidents with appropriate networks and informing affected communities of the correct information. Without clear accountability, issues and incidents may fall through the cracks.

Do not repeat falsehoods

Messages from election officials correcting disinformation should not repeat falsehoods. Research shows that repeating falsehoods to debunk them can backfire and make people more likely to remember the false information. [footnote 7: “One of the most frequently used correction strategies, the myth-versus-fact format, can backfire because of repetition of the myth, leaving people all the more convinced that their erroneous beliefs are correct.” Norbert Schwarz et al., Making the Truth Stick & the Myths Fade: Lessons from Cognitive Psychology, Behavioral Science & Policy Association, 2016, 86, https://behavioralpolicy.org/wp-content/uploads/2017/05/BSP_vol1is1_Schwarz.pdf.] If officials consider it absolutely necessary to include the original disinformation, they should structure the messages to present accurate and easy-to-understand information first, warn that the disinformation is false before mentioning it, and repeat the facts. [footnote 8: Schwarz et al., Making the Truth Stick, 92–93. People are more likely to remember the first and last things they hear, as well as information that is repeated.]

As an example, during the Texas presidential primary on Super Tuesday, robocalls told voters to vote “tomorrow.” [footnote 9: Slisco, “Robocalls Spreading Super Tuesday Misinformation.”] The official Twitter account of the Texas secretary of state alerted the public that false information was being circulated and provided correct information without repeating the falsehood. [footnote 10: Texas Secretary of State (@TXsecofstate), “Our office has received reports of robocalls stating misinformation about today’s primary election. To be clear, all eligible voters should vote today,” Twitter, March 3, 2020, 4:30 p.m., https://twitter.com/TXsecofstate/status/1234954361941479424?s=20.]

2. Publicize official sources of accurate information to build public trust

Election officials must work to build resilience to disinformation long before Election Day. Accurate information from a trusted source provides the most effective shield against deceptive vote suppression. Accordingly, election agencies should build public trust in specific sources of information before disinformation attacks occur. [footnote 11: Belfer Center, The State and Local Election Cybersecurity Playbook, 18.] They should proactively inform the public of key information like dates for voting and encourage voters to look up polling places and registration status well in advance.

State and local agencies should designate and publicize specific sources of information — at a minimum, a phone number and website — as authoritative. They should provide an official email address and maintain social media accounts on platforms popular in local communities. [footnote 12: University of Pittsburgh Institute for Cyber Law, Policy, and Security, The Blue Ribbon Commission on Pennsylvania’s Election Security: Study and Recommendations, 2019, 52, https://www.cyber.pitt.edu/sites/default/files/FINAL%20FULL%20PittCyber_PAs_Election_Security_Report.pdf.] When using traditional forms of outreach like mailers and advertising, agencies should direct voters to official digital sources.

Wherever possible, election agencies and officials should ensure their social media accounts are designated as official by the associated platforms — for example, as a verified or “blue check” account on Twitter. [footnote 13: “Election officials should . . . secure ‘verified’ status for their official accounts on social media platforms.” Ad Hoc Committee for 2020 Election Fairness and Legitimacy, Fair Elections During a Crisis: Urgent Recommendations in Law, Media, Politics, and Tech to Advance the Legitimacy of, and the Public’s Confidence in, the November 2020 U.S. Elections, 2020, 20, https://www.law.uci.edu/faculty/full-time/hasen/2020ElectionReport.pdf.] The procedures and requirements for obtaining verified accounts vary depending on the platform. [footnote 14: For example, Facebook and Twitter both have verified account programs. “How do I request a verified badge on Facebook?” Facebook Help Center, accessed August 14, 2020, https://www.facebook.com/help/1288173394636262; and “Verified Account FAQs,” Twitter Help Center, accessed August 14, 2020, https://help.twitter.com/en/managing-your-account/twitter-verified-accounts.] Some local election officials have had difficulty getting verified due to issues such as low follower counts. Social media companies ought to accommodate local governments, but if they do not, involvement by state officials could help. For example, Arizona Secretary of State Katie Hobbs has worked with social media companies to get county administrators’ accounts verified. [footnote 15: Matt Vasilogambros, “How Your Local Election Clerk Is Fighting Global Disinformation,” Pew Stateline, July 20, 2020, https://www.pewtrusts.org/en/research-and-analysis/blogs/stateline/2020/07/20/how-your-local-election-clerk-is-fighting-global-disinformation.]

Election agencies and officials should not allow their social media accounts to lie dormant, but rather should work actively to gain followers, including journalists and community groups. They should maintain uniform icons, logos, and other branding across all media — including websites and social media accounts, which should link to each other. [footnote 16: Jesse Littlewood, vice president for campaigns, Common Cause, personal communication with author, July 24, 2020.]

The National Association of Secretaries of State (NASS) initiative #TrustedInfo2020 promotes election officials as reliable sources of voting information. [footnote 17: “#TrustedInfo2020,” National Association of Secretaries of State website, accessed August 17, 2020, https://www.nass.org/initiatives/trustedinfo-2020.] Election officials should participate by using the hashtag to help direct audiences to their websites and social media accounts. [footnote 18: According to Kathy Boockvar, secretary of the Commonwealth of Pennsylvania, “We should continue to focus on initiatives like #TrustedInfo2020 that promote trusted election sources because there’s so much misinformation out there that has the potential to cause disruption or to undermine the confidence of our voters.” Tim Lau, “Last Chance to Secure the 2020 Elections,” Brennan Center for Justice, July 27, 2020, https://www.brennancenter.org/our-work/research-reports/last-chance-secure-2020-elections.]

Officials should also find ways to distribute information to relevant audiences. In California, for example, officials collected millions of email addresses during the voter registration process, then sent an email to that list directing voters to the official state elections guide. [footnote 19: Brian Fung, “States Launch ‘Trusted Information’ Efforts against Fake News on Social Media,” CNN, March 3, 2020, https://edition.cnn.com/2020/03/02/politics/state-efforts-against-social-media-misinformation/index.html. Facebook allows government officials to send “local alerts” within various geographic boundaries to users who follow the officials’ page. “Local Alerts Overview,” Facebook Business Help Center, accessed August 17, 2020, https://www.facebook.com/business/help/1064049677089136?id=1549080658590154.] Illinois officials bought YouTube ads pointing viewers to county election websites. [footnote 20: Vasilogambros, “How Your Local Election Clerk Is Fighting Global Disinformation.”] Officials should make information accessible to members of the community who speak languages other than English and to people with disabilities. [footnote 21: Wendy Weiser and Max Feldman, How to Protect the 2020 Vote from the Coronavirus, Brennan Center for Justice, 2020, 10, https://www.brennancenter.org/our-work/policy-solutions/how-protect-2020-vote-coronavirus.] Finally, officials should frequently review and update digital materials to help ensure their accuracy.

3. Protect official sources from hacking and manipulation

The potential for bad actors to hijack official sources of election information — either through spoofing or hacking — represents perhaps the gravest threat to the ability of election officials to disseminate accurate information and maintain public trust. Accordingly, election agencies should invest in security measures to protect those information sources from hacking and manipulation.

Spoofing protections

“Spoofing” occurs when bad actors set up look-alike websites with similar or disguised URLs to divert audiences from official sources. [footnote 22: Belfer Center, The State and Local Election Cybersecurity Playbook, 41; Alex Hern, “Unicode Trick Lets Hackers Hide Phishing URLs,” Guardian, April 19, 2017, https://www.theguardian.com/technology/2017/apr/19/phishing-url-trick-hackers; and Michael Archambault, “The Danger of Spoofed Websites: Learn to Tell the Difference,” PSafe dfndr blog, May 7, 2018, https://www.psafe.com/en/blog/the-danger-of-spoofed-websites-learn-to-tell-the-difference/.] In 2019, a government imposter sent a phishing scam to thousands of email addresses, using the web domain “uspsdelivery-service.com” to mimic the U.S. Postal Service website. [footnote 23: Bryan Campbell et al., “TA2101 Plays Government Imposter to Distribute Malware to German, Italian, and US Organizations,” Proofpoint blog, November 14, 2019, https://www.proofpoint.com/us/threat-insight/post/ta2101-plays-government-imposter-distribute-malware-german-italian-and-us. The official website of the Postal Service is usps.com.] Officials should regularly check for attempts to mimic their websites and consider using services that offer custom monitoring.
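
To make the monitoring idea concrete, here is a minimal sketch, in Python, of screening newly observed domains for look-alikes of an official election domain. The official domain and the candidate list are hypothetical; a real service would pull candidates from sources like new domain registrations or certificate-transparency logs, and similarity scoring is only a first filter ahead of human review.

    # Minimal look-alike domain screen (illustrative only).
    # OFFICIAL and the observed domains below are hypothetical.
    from difflib import SequenceMatcher

    OFFICIAL = "votemaricopa.gov"  # hypothetical official election domain

    def similarity(a: str, b: str) -> float:
        """String similarity in [0, 1]; 1.0 means identical."""
        return SequenceMatcher(None, a, b).ratio()

    def flag_lookalikes(candidates, threshold=0.75):
        """Return domains that closely resemble the official one but differ."""
        return [d for d in candidates
                if d != OFFICIAL and similarity(d, OFFICIAL) >= threshold]

    observed = [
        "votemaricopa.com",      # same name, different top-level domain
        "vote-maricopa.gov.co",  # extra hyphen plus a foreign suffix
        "weather-example.org",   # unrelated; should not be flagged
    ]
    for domain in flag_lookalikes(observed):
        print("possible spoof, review manually:", domain)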

Election agencies should use a .gov domain for their websites rather than other options such as .com, which are easier to spoof. [footnote 24: “Election officials should obtain a .gov domain for an authenticated internet presence.” Ad Hoc Committee for 2020 Election Fairness and Legitimacy, Fair Elections During a Crisis, 20.] A survey by computer security company McAfee found that fewer than 2 in 10 counties in anticipated 2020 battleground states are doing so. [footnote 25: McAfee, “Website Security Shortcomings Could Render U.S. Elections Susceptible to Digital Disinformation,” accessed August 21, 2020, https://www.mcafee.com/enterprise/en-us/assets/faqs/faq-election-2020.pdf.] Using a .gov domain offers protections like two-step verification and monitoring from federal agencies like DHS, the General Services Administration, and the National Institute of Standards and Technology. [footnote 26: DHS, Cybersecurity and Infrastructure Security Agency, Leveraging the .gov Top-Level Domain, accessed August 21, 2020, https://www.cisa.gov/sites/default/files/publications/cisa-leveraging-the-gov-top-level-domain.pdf.]

Additionally, agencies should encrypt their official websites with Secure Sockets Layer (SSL) technology, which is signaled to users with an “HTTPS” prefix in the URL and, in most browsers, a lock icon. [footnote 27: McAfee, “Website Security Shortcomings.”] They should use search engine optimization to improve their rankings in search engine results. Officials should regularly monitor for search engine manipulation by copycat websites and report spoofing attempts to the relevant search engine company. [footnote 28: Google, “Report Phishing Page,” accessed August 17, 2020, https://safebrowsing.google.com/safebrowsing/report_phish/?hl=en.]
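
The sketch below, again illustrative and using hypothetical hostnames, checks the two recommendations above: that a site lives on a .gov domain and that it serves HTTPS with a certificate that validates the same way a browser would before showing the lock icon. Only Python’s standard library is used.

    # Check .gov usage and TLS certificate validity (illustrative only).
    import socket
    import ssl

    def check_site(hostname: str) -> None:
        if not hostname.endswith(".gov"):
            print(f"{hostname}: WARNING, not a .gov domain (easier to spoof)")
        context = ssl.create_default_context()  # verifies chain and hostname
        with socket.create_connection((hostname, 443), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
                print(f"{hostname}: TLS certificate valid, expires {cert['notAfter']}")

    # Hypothetical hostnames; substitute the agency's actual sites.
    for host in ["elections.example.gov", "elections-example.com"]:
        try:
            check_site(host)
        except (ssl.SSLError, OSError) as err:
            print(f"{host}: connection or certificate problem: {err}")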

Bad actors may set up fake social media accounts to masquerade as election officials. [footnote 29: Belfer Center, The State and Local Election Cybersecurity Playbook, 46.] During the 2016 election cycle, for example, operatives of Russia fooled some Twitter users into thinking that their @TEN_GOP account was a mouthpiece of Tennessee Republicans. [footnote 30: Special Counsel Robert S. Mueller III, Report on the Investigation into Russian Interference in the 2016 Election: Volume I of II, U.S. Department of Justice, 2019, 22, https://www.justice.gov/storage/report.pdf.] Officials should regularly monitor platforms for these copycat accounts and report them to the relevant social media companies. [footnote 31: Distinct from vote suppression harms, impersonation violates the terms of service of many platforms. Facebook, “Impersonation and Hacked Accounts,” accessed August 21, 2020, https://www.facebook.com/help/532542166925473; and Twitter, “Report Impersonation Accounts,” accessed August 21, 2020, https://help.twitter.com/en/safety-and-security/report-twitter-impersonation.]

Hacking protections

Bad actors can hack websites, email accounts, and social media accounts to take control of official channels. For example, in 2019, hackers took over Twitter CEO Jack Dorsey’s account and tweeted racist messages from his Twitter handle. [footnote 32: Kevin Collier and Ahiza Garcia, “Jack Dorsey’s Twitter Account Was Hacked — and He’s the CEO of Twitter,” CNN, August 30, 2019, https://www.cnn.com/2019/08/30/tech/jack-dorsey-twitter-hacked/index.html.] Late in the day of a primary election in Knox County, Tennessee, hackers shut down the local election commission’s website for an hour, disrupting the publication of early results. [footnote 33: Miles Parks, “Not Just Ballots: Tennessee Hack Shows Election Websites Are Vulnerable, Too,” NPR, May 17, 2018, https://www.npr.org/2018/05/17/611869599/not-just-ballots-tennessee-hack-shows-election-websites-are-vulnerable-too.] And in a July 2020 scam, hackers took control of several prominent Twitter accounts, including those of politicians, and urged their followers to send them Bitcoin. [footnote 34: Sheera Frankel et al., “A Brazen Online Attack Targets V.I.P. Twitter Users in a Bitcoin Scam,” New York Times, July 15, 2020, https://www.nytimes.com/2020/07/15/technology/twitter-hack-bill-gates-elon-musk.html. Florida prosecutors have charged a 17-year-old hacker with masterminding the attack. Nathaniel Popper et al., “From Minecraft Tricks to Twitter Hack: A Florida Teen’s Troubled Online Path,” New York Times, August 2, 2020, https://www.nytimes.com/2020/08/02/technology/florida-teenager-twitter-hack.html.] Overall, the risk for cyberattacks — including ransomware attacks on election infrastructure — has increased in 2020 as Americans have shifted to working from home due to the Covid-19 pandemic. [footnote 35: David E. Sanger and Nicole Perlroth, “Russian Criminal Group Finds New Target: Americans Working at Home,” New York Times, June 25, 2020, https://www.nytimes.com/2020/06/25/us/politics/russia-ransomware-coronavirus-work-home.html.]

Officials should implement cybersecurity best practices, such as the Brennan Center’s detailed recommendations for election infrastructure and staff training. [footnote 36: Cortés et al., Preparing for Cyberattacks, 4–5.] Agencies should maintain tight controls on who has the ability to make changes to official websites. When employees leave, their credentials should be revoked immediately. Log-in credentials for website edits should require strong passwords that are changed regularly and multifactor authentication, and automatic logout should occur after a period of inactivity. [footnote 37: There is published guidance on multifactor authentication. DHS, Cybersecurity and Infrastructure Security Agency, “Multi-Factor Authentication,” 2019, https://www.cisa.gov/sites/default/files/publications/cisa-multi-factor-authentication.pdf; see also Cory Missimore, “The Multiple Options for Multi-Factor Authentication,” ISACA Now blog, July 26, 2018, https://www.isaca.org/resources/news-and-trends/isaca-now-blog/2018/the-multiple-options-for-multi-factor-authentication (explaining different forms of multifactor authentication).] Agencies should install software updates immediately and consider the use of a web application firewall. [footnote 38: Center for Internet Security, “MS-ISAC Security Primers — SQLi,” accessed August 21, 2020, https://www.cisecurity.org/white-papers/sqli/.]

Election agencies should catalog social media accounts — including those that belong to the agency itself and those of individual officials — along with a list of individuals with log-in credentials for each account. [footnote 39: Belfer Center, The State and Local Election Cybersecurity Playbook, 18.] Officials should protect their accounts with strong passwords and multifactor authentication, and they should change passwords regularly, especially as an election approaches. Agencies should consider a comprehensive social media policy that directs staff to exercise caution when posting election information, even on personal accounts, to prevent them from sharing inaccurate information. [footnote 40: Center for Internet Security, “Social Media: The Pros, Cons, and the Security Policy,” March 2020, https://www.cisecurity.org/newsletter/social-media-the-pros-cons-and-the-security-policy/.] Staff should receive regular training on how to avoid phishing scams. They should never use personal email accounts for official business. [footnote 41: Area 1, Phishing Election Administrators, 2020, https://cdn.area1security.com/reports/Area-1-Security-PhishingForElectionAdministrators.pdf.]

Officials should monitor agency websites and social media accounts at least once a day for hacking and prepare to recover hacked sites and accounts. They should identify points of contact with social media companies for troubleshooting problems like losing control of an account. [footnote 42: Belfer Center, The State and Local Election Cybersecurity Playbook, 18.] Some platforms offer enhanced protection for official accounts. [footnote 43: Facebook, for example, offers enhanced protection for official accounts through its “Facebook Protect” service. Facebook, “What is Facebook Protect?” accessed August 21, 2020, https://www.facebook.com/gpa/facebook-protect#tab-0-what-is-facebook-protect-.]
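
A lightweight way to approximate daily monitoring is to fingerprint official pages and alert when content changes unexpectedly. The sketch below, with a hypothetical URL, hashes a page and compares it with a stored baseline; pages with legitimately dynamic content would need a more targeted comparison than this whole-page check.

    # Daily page-integrity check (illustrative; URL is hypothetical).
    import hashlib
    import urllib.request

    OFFICIAL_URL = "https://elections.example.gov/voting-info"

    def page_fingerprint(url: str) -> str:
        """SHA-256 hash of the raw page body."""
        with urllib.request.urlopen(url, timeout=15) as resp:
            return hashlib.sha256(resp.read()).hexdigest()

    def changed_since(baseline_hash: str) -> bool:
        """True if the page no longer matches the stored baseline."""
        current = page_fingerprint(OFFICIAL_URL)
        if current != baseline_hash:
            print("ALERT: page content changed; review for defacement")
            return True
        return False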

Election agencies should back up websites and other resources that will be needed by voters confused by disinformation, like polling place lookup tools and voter registration checks. [footnote 44: Cortés et al., Preparing for Cyberattacks, 7. Google Project Shield is a free service that protects election websites from distributed denial-of-service (DDoS) attacks. Google Project Shield, “Protecting Free Expression from Digital Attacks,” accessed August 21, 2020, https://projectshield.withgoogle.com/landing.] They should have contingency plans for how they will keep voters informed if online tools are disabled. It is also useful to stress test web-based tools and databases ahead of time to learn how much traffic they can handle. [footnote 45: “Officials should ensure that state and county election websites undergo periodic independent load and vulnerability testing, as these websites will get heavier usage while the public practices social distancing” due to Covid-19. Cortés et al., Preparing for Cyberattacks, 21.]
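
As a rough illustration of what a basic load check might look like, the following sketch fires concurrent requests at a hypothetical lookup tool and reports latency percentiles. It is not a substitute for the independent load and vulnerability testing cited above, and it should only ever be pointed at infrastructure the agency itself owns and has approved for testing.

    # Crude concurrent load check (illustrative; URL is hypothetical).
    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "https://elections.example.gov/polling-place-lookup"

    def timed_request(_):
        """Fetch the page once and return elapsed seconds."""
        start = time.monotonic()
        with urlopen(URL, timeout=30) as resp:
            resp.read()
        return time.monotonic() - start

    with ThreadPoolExecutor(max_workers=20) as pool:
        latencies = sorted(pool.map(timed_request, range(200)))
    print(f"median {latencies[len(latencies) // 2]:.2f}s, "
          f"p95 {latencies[int(len(latencies) * 0.95)]:.2f}s")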

The Election Infrastructure Information Sharing and Analysis Center (EI-ISAC), a nonprofit membership organization for election officials, offers guidance on how to secure websites against hacking, as well as a service that checks domains for outdated software. [footnote 46: Center for Internet Security, “Cybersecurity Spotlight — Website Defacements,” accessed August 21, 2020, https://www.cisecurity.org/spotlight/cybersecurity-spotlight-website-defacements/. For more information about the services available to local governments that join the Center for Internet Security’s Election Infrastructure Information Sharing and Analysis Center (EI-ISAC), see Center for Internet Security, “EI-ISAC Services,” accessed August 21, 2020, https://www.cisecurity.org/ei-isac/ei-isac-services/.] CISA offers cybersecurity assessments that can help secure official web portals and protect against phishing attacks that may endanger website credentials. [footnote 47: DHS, Cybersecurity and Infrastructure Security Agency, “Cyber Resource Hub,” last modified July 24, 2020, https://www.cisa.gov/cybersecurity-assessments; and DHS, Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 13.]

4. Monitor for disinformation

Officials must actively monitor for disinformation that could suppress votes in their jurisdiction. This requires dedicating staff time to searching for it and creating channels for others to report incidents. Officials should also work with social media companies, notifying them of activity that may violate their terms of use, and participate in broader information-sharing networks.

Set up an ongoing monitoring operation

Election officials in every state — ideally at both the state and local levels — should monitor social media for false information about how to vote. [footnote 48: See, e.g., Wisconsin Elections Commission, Election Security Report, 2019, 49, https://elections.wi.gov/sites/electionsuat.wi.gov/files/2019-09/2019%20Elections%20Security%20Planning%20Report.pdf (“WEC staff has worked with officials at major social media companies to quickly communicate any attempts to misinform voters and to have the offending posts removed.”); and University of Pittsburgh, The Blue Ribbon Commission on Pennsylvania’s Election Security, 52 (“Relevant officials need to be ready to contact social media companies to alert them to [disinformation and] have a reliable and widely known set of social media accounts to rebut disinformation . . . .”).] They should conduct monitoring efforts in consultation with legal counsel. [footnote 49: Judd Choate, Colorado director of elections, personal communication with author, July 19, 2020.] Monitoring should focus on the relevant jurisdiction, of course, as well as communities typically targeted by vote suppression. Search terms should include concepts around voting logistics, including dates and locations for voting and details about voting by mail. Disinformation is also likely to mention candidates or parties standing in local elections.
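
As a minimal illustration of how such search terms can be screened automatically, the sketch below flags posts that match common deceptive-practice patterns for human review. The patterns and sample posts are hypothetical, and a match is only a lead for a reviewer, not a verdict; real monitoring would be tuned to the jurisdiction and fed by whatever collection tool is in use.

    # Keyword screen for a disinformation review queue (illustrative only).
    import re

    WATCH_TERMS = [
        r"vote\s+(on\s+)?wednesday",                    # wrong-day messaging
        r"vote\s+by\s+text",                            # never a valid option
        r"polling\s+(place|location).*(closed|moved)",  # bogus site changes
        r"\bICE\b.*polling",                            # intimidation themes
    ]
    PATTERNS = [re.compile(t, re.IGNORECASE) for t in WATCH_TERMS]

    def needs_review(post: str) -> bool:
        """Flag a post for a human reviewer."""
        return any(p.search(post) for p in PATTERNS)

    sample_posts = [
        "Reminder: you can vote by text this year!",
        "Great turnout at the county fair today.",
    ]
    for post in sample_posts:
        if needs_review(post):
            print("review:", post)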

There are services that can assist with monitoring efforts. The MITRE Corporation, for example, offers a tool designed to quickly analyze reports of disinformation and help election officials report incidents up their agency hierarchy. [footnote 50: Molly Manchenton, “SQUINT Sharpens Officials’ Perspective to Combat Election Distortion,” MITRE, February 2020, https://www.mitre.org/publications/project-stories/squint-sharpens-officials-perspective-to-combat-election-distortion.] Commercial social media monitoring or “listening” services — some available for free — can track mentions of elections or voting procedures in a jurisdiction.

Set up channels for officials to receive reports of disinformation

Election officials should provide and publicize a clear line of communication for voters, journalists, internet companies, and officials in other jurisdictions to report deceptive practices. At a minimum, there should be an email address and phone number, but official social media accounts are also useful. For example, California’s secretary of state launched Vote Sure, an initiative to increase voter awareness of false and misleading information, and has directed the public to report disinformation to a dedicated email address. [footnote 51: California Secretary of State, “VoteSure Voting Resources” (“Report Misinformation” link), accessed August 21, 2020, https://www.sos.ca.gov/elections/vote-sure/.]

Report content that violates platform policies and check for takedowns

Election officials should report instances of disinformation to the relevant internet company. While the platforms’ terms of use differ, most major social media companies have some form of prohibition against deception about voting. Election officials should establish contacts at key platforms in advance of the election. They should report disinformation to their specific point of contact — not through a “flagging” option, which is often an inadequate mechanism to address voter deception and leaves no way to monitor platform responses. On Facebook, for example, users can report content as “voter interference,” but that only triggers monitoring for trends in the aggregate; it does not lead to a manual review for takedowns. [footnote 52: Laura W. Murphy and Megan Cacace, “Facebook’s Civil Rights Audit: Final Report,” Facebook, 2020, 32–33, https://about.fb.com/wp-content/uploads/2020/07/Civil-Rights-Audit-Final-Report.pdf.] Officials should look online to check whether false posts, shares, or retweets have actually been removed, and if the disinformation is still live, they should follow up with their contacts.

Participate in information-sharing networks

Officials should participate in information-sharing networks with other election officials, federal government agencies, internet companies, and community groups. For example, the national nonpartisan coalition Election Protection, which operates the voter help line 1-866-OUR-VOTE, provides voter information and maintains contact with election officials across the country. Election Protection coalition members can alert affected communities of disinformation and push corrective information out to individuals.

Agencies participating in the EI-ISAC have access to a platform for sharing threat information in real time. [footnote 53: DHS, Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 14.] At DHS, CISA plans to operate a switchboard during the election that will transmit reports of disinformation from election officials to internet companies and law enforcement. [footnote 54: DHS, Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 22.]

Where appropriate, election agencies should report incidents to federal and local law enforcement.

5. Build relationships with communities and media

Election officials should do extensive public outreach to build trust with all the communities they serve, and those efforts should incorporate all commonly spoken languages in those communities. Cultivating ongoing relationships with communities will make it easier to communicate during an emergency.

Election agencies should consider designating a spokesperson to provide accurate information on how to vote, to promote that information online and on social media, and to make local media appearances. The designated official will benefit from media training. [footnote 55: There are lessons to be learned in the Belfer Center's guide for official communications plans around cyber incidents, which provides samples for designating official roles, setting out processes, and crafting checklists. Belfer Center, Election Cyber Incident Communications Plan Template, 18.] This designated official should also build relationships with community groups. [footnote 56: See, e.g., Wisconsin Elections Commission, Election Security Report, 49.]

Agencies can build public trust and familiarity by working to increase the number of followers on official social media accounts, a process that could include advertising and engaging in conversation with community leaders, journalists, and other relevant figures. They should also consider recruiting high-profile surrogates, such as influencers with significant followings on particular social media channels, to amplify accurate information.

Local and ethnic media that serve frequently targeted communities are key partners in disseminating correct information in response to deceptive practices. Officials should build relationships with outlets and reporters to help establish themselves as trusted sources. [footnote 57: Belfer Center, Election Cyber Incident Communications Plan Template, 18.] Media partners should receive instruction in advance on how to avoid repeating falsehoods when reporting incidents. [footnote 58: The American Press Institute's Trusted Elections Network is a resource for journalists covering the spread of false information around elections. American Press Institute, "Trusted Elections Network," accessed August 21, 2020, https://www.americanpressinstitute.org/trusted-elections-network/. It includes advice about how journalists should cover misinformation, including by going to official sources. American Press Institute, "Trusted Elections Network Resource Guide," accessed August 21, 2020, https://www.americanpressinstitute.org/trusted-elections-network/resource-guide/.]


Recommendations for Internet Companies

Many of the biggest social media platforms have banned disinformation about voting procedures and will take down posts with incorrect information when they find them. Facebook, for example, bans the misrepresentation of voting times, locations, methods, and eligibility, along with threats of violence related to voting or registering to vote. [footnote 1: Guy Rosen et al., "Helping to Protect the 2020 US Elections," Facebook, October 21, 2019, https://about.fb.com/news/2019/10/update-on-election-integrity-efforts/. Despite Facebook's policy of allowing politicians to make false claims in ads, the ban on voter suppression appears to apply to everyone ("We remove this type of content regardless of who it's coming from . . . .").] Other popular social networks like YouTube, Twitter, and Pinterest have similar policies, and Google bans ads that contain false information about voting. [footnote 2: Pinterest waited until January 2020 to ban vote suppression. Cat Zakrzewski, "The Technology 202: Pinterest Bans Misinformation About Voting and the Census," Washington Post, January 29, 2020, https://www.washingtonpost.com/news/powerpost/paloma/the-technology-202/2020/01/29/the-technology-202-pinterest-bans-misinformation-about-voting-and-the-census/5e307ba288e0fa6ea99d60fc/; and Google, "Misrepresentation," accessed August 21, 2020, https://support.google.com/adspolicy/answer/6020955?hl=en&ref_topic=1626336 (prohibiting "claims that are demonstrably false and could significantly undermine participation or trust in an electoral or democratic process," such as "information about public voting procedures").]

However, it is unclear how reliable these disinformation bans are. Enforcement can be uneven, and companies frequently change their policies. This section outlines the essential actions that all internet companies, including social networks, ad sellers, and search engines, should take to combat disinformation about voting.

1. Proactively provide information about how to vote

Social media platforms, search engines, and other web and mobile sites should point all users to accurate information about voting, including how to register and vote, relevant deadlines, and how to contact the appropriate election officials. [footnote 3: John Borthwick, "Ten Things Technology Platforms Can Do to Safeguard the 2020 Election," Medium blog post, January 7, 2020, https://render.betaworks.com/ten-things-technology-platforms-can-do-to-safeguard-the-2020-u-s-election-b0f73bcccb8.] They should make this information available long before the election but increase its visibility as voting or registration deadlines approach.

Internet companies should direct users to reliable sources like official election sites or the "Can I Vote" resource page prepared by NASS. [footnote 4: National Association of Secretaries of State, "Can I Vote," accessed August 21, 2020, https://www.nass.org/can-I-vote.] They should promote posts appearing on the official accounts of election agencies. Ranking algorithms should prioritize official posts to make them more prominent in users' feeds, in search results, and in lists of trending topics.
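
As one illustration of what such prioritization could look like, the sketch below applies a fixed boost to posts from verified election-official accounts before ranking a feed. The Post fields, the boost factor, and the scoring rule are assumptions made for illustration, not any platform's actual ranking model.

```python
# Minimal sketch: boost verified election-official posts in a ranked feed.
# The base_score field and the boost multiplier are illustrative assumptions.
from dataclasses import dataclass

OFFICIAL_BOOST = 1.5  # multiplier applied to verified official accounts

@dataclass
class Post:
    author: str
    is_verified_official: bool
    base_score: float  # relevance score from the ordinary ranking model

def ranked_feed(posts: list[Post]) -> list[Post]:
    def score(post: Post) -> float:
        return post.base_score * (OFFICIAL_BOOST if post.is_verified_official else 1.0)
    return sorted(posts, key=score, reverse=True)

feed = ranked_feed([
    Post("random_user", False, 0.9),
    Post("county_elections", True, 0.7),  # boosted to 1.05, outranking 0.9
])
print([p.author for p in feed])  # ['county_elections', 'random_user']
```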

2. Maintain clear channels for reporting disinformation

Companies should make it easy for users, officials, and community groups to report false information about voting. Social media platforms should offer users a clear and accessible option to tag posts as voter suppression, and those reports should go promptly to a human reviewer, who can weigh the context of a post and judge whether it poses a genuine risk of voter suppression. Algorithmic review frequently cannot distinguish between a message pushing disinformation and a message quoting a lie in order to debunk it. For example, a post by a get-out-the-vote group warning community members that bad actors are circulating false "vote Wednesday" messages should not be treated the same as an attempt to trick people out of voting on Election Day. Human review is more likely to take contextual factors like these into account.
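
A minimal sketch of this kind of triage appears below: reports tagged with voting-related categories skip automated handling and go straight to a human review queue. The category names and queue structures are assumptions for illustration, not any platform's real reporting pipeline.

```python
# Minimal sketch: route voting-related reports straight to human review.
# Categories and queues are illustrative assumptions.
from collections import deque

HUMAN_REVIEW_CATEGORIES = {"voter_suppression", "voter_intimidation"}

human_queue: deque = deque()
automated_queue: deque = deque()

def triage(report: dict) -> str:
    """Send voting-related reports to humans, who can weigh context
    (e.g., a post quoting a falsehood in order to debunk it)."""
    if report["category"] in HUMAN_REVIEW_CATEGORIES:
        human_queue.append(report)
        return "human_review"
    automated_queue.append(report)
    return "automated"

print(triage({"category": "voter_suppression", "post_id": "abc123"}))  # human_review
print(triage({"category": "spam", "post_id": "def456"}))               # automated
```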

Internet companies should build relationships with state election officials to facilitate information sharing in both directions.

3. Take down false information about voting but preserve the data

Social media companies should ban disinformation that attempts to suppress the vote, and they should do so with clear standards that are transparent to users. They should remove such disinformation promptly but also provide an accessible appeals option as part of the review process. Users whose posts are removed should be able to quickly contact a company employee to make the case that the removal was a mistake.

Companies should monitor their platforms for repeat offenders and impose more severe consequences on them. YouTube, for example, bans vote suppression and has a schedule of escalating consequences for violations of its community guidelines. If a user gets three strikes within 90 days, YouTube will permanently delete that person's account. [footnote 5: Google, "YouTube Help: Spam, Deceptive Practices & Scams Policies," accessed August 21, 2020, https://support.google.com/youtube/answer/2801973; and Google, "YouTube Help: Community Guidelines Strike Basics," accessed August 21, 2020, https://support.google.com/youtube/answer/2802032.]
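
The logic of such a rolling-window policy is simple to state precisely. The sketch below models the publicly described three-strikes rule; the data structures are illustrative, not YouTube's implementation.

```python
# Minimal sketch of a rolling-window strike policy: three strikes within
# 90 days ends the account. Structures are illustrative assumptions.
from datetime import date, timedelta

WINDOW = timedelta(days=90)
MAX_STRIKES = 3

def should_terminate(strike_dates: list[date], today: date) -> bool:
    """Count only strikes that fall within the trailing 90-day window."""
    recent = [d for d in strike_dates if today - d <= WINDOW]
    return len(recent) >= MAX_STRIKES

strikes = [date(2020, 6, 1), date(2020, 7, 15), date(2020, 8, 20)]
print(should_terminate(strikes, date(2020, 8, 21)))  # True: 3 strikes in window
```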

Even when a social media company removes posts and accounts, it should preserve the data for analysis by researchers, which can help identify patterns in vote suppression activity and better prevent it in the future. These data may provide valuable information on, for example, deceptive foreign influence campaigns, such as the interference by Russian operatives in the 2016 election. [footnote 6: Mueller, Report on the Investigation into Russian Interference, 4.] Companies should retain these data in a manner that is accessible to and searchable by researchers, consistent with users' privacy interests.
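
One privacy-conscious way to square preservation with users' interests is to keep the post and removal metadata while pseudonymizing the author. The record layout and hashing scheme below are assumptions sketching one possible design, not an industry standard.

```python
# Minimal sketch: preserve a removed post for researchers while limiting
# exposure of personal data. Fields and hashing are illustrative assumptions.
import hashlib
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class PreservedTakedown:
    post_text: str
    author_id_hash: str  # pseudonymized, so researchers can still link activity
    removal_reason: str
    removed_at: str

def preserve(post_text: str, author_id: str, reason: str) -> PreservedTakedown:
    # In practice, a keyed (salted) hash would better resist re-identification.
    return PreservedTakedown(
        post_text=post_text,
        author_id_hash=hashlib.sha256(author_id.encode()).hexdigest(),
        removal_reason=reason,
        removed_at=datetime.now(timezone.utc).isoformat(),
    )

record = preserve("Vote by text message!", "user-42", "voting_misinformation")
print(asdict(record))
```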

4. Protect official accounts and websites

Social media platforms should provide special protection for election officials' accounts against hacking and spoofing. Facebook, for example, offers enhanced protection for official accounts through its "Facebook Protect" service. [footnote 7: Facebook, "What is Facebook Protect?" Facebook also offers verified badges. Facebook, "How do I request a verified badge on Facebook?" accessed August 21, 2020, https://www.facebook.com/help/1288173394636262. As of this writing, Twitter's verified badge application process is on hold. Twitter, "About Verified Accounts," accessed August 21, 2020, https://help.twitter.com/en/managing-your-account/about-twitter-verified-accounts.] Social media companies should offer verified status for official accounts, like Twitter's "blue check," through a process that accommodates the realities of local election officials' operations. For instance, local election officials are less likely to have a large social media audience, so a high follower count should not be a prerequisite for obtaining verified status. Instead, social media platforms should allow state officials to certify accounts that belong to local officials.
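
The certification chain suggested here is straightforward to model. The sketch below assumes a platform-side registry in which accounts already verified as state election officials can attest to local officials' accounts; the names and structures are illustrative assumptions, not an existing platform feature.

```python
# Minimal sketch: let verified state officials attest to local officials'
# accounts so the platform can grant verified status without follower
# thresholds. The registry and rules are illustrative assumptions.
verified_state_officials = {"ca_secretary_of_state"}
verified_accounts = set(verified_state_officials)

def certify_local_account(attester: str, local_account: str) -> bool:
    """Grant verified status only if the attester is a verified state official."""
    if attester in verified_state_officials:
        verified_accounts.add(local_account)
        return True
    return False

print(certify_local_account("ca_secretary_of_state", "kern_county_elections"))  # True
print(certify_local_account("random_user", "fake_elections_office"))            # False
```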

Search engine companies should proactively watch their platforms for attempts to direct users to spoofed websites.
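
One simple signal for such monitoring is string similarity to known official domains. The sketch below flags lookalike domains with an edit-similarity ratio; the domain list and the 0.8 threshold are illustrative assumptions, and a production system would also consider homoglyphs, registration dates, and certificate data.

```python
# Minimal sketch: flag lookalike domains that may spoof official election
# sites. The official-domain list and threshold are illustrative assumptions.
from difflib import SequenceMatcher

OFFICIAL_DOMAINS = ["sos.ca.gov", "vote.org", "nass.org"]  # example list

def looks_spoofed(domain: str, threshold: float = 0.8) -> bool:
    for official in OFFICIAL_DOMAINS:
        ratio = SequenceMatcher(None, domain, official).ratio()
        if threshold <= ratio < 1.0:  # similar but not identical
            return True
    return False

print(looks_spoofed("sos.ca.qov"))  # True: one-character typosquat
print(looks_spoofed("sos.ca.gov"))  # False: exact official domain
```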

5. Push corrective information to specific users affected by disinformation

When social media companies find false information about voting on their platforms, they should identify which users received the disinformation and notify those individuals with messages that include accurate information and directions for how to contact election officials. In 2017, Facebook said it would be "challenging" to notify users who saw deceptive political ads from a Russian company with ties to Putin. [footnote 8: Tony Romm and Kurt Wagner, "Here's How to Check if You Interacted with Russian Propaganda on Facebook During the 2016 Election," Vox, December 22, 2017, https://www.vox.com/2017/12/22/16811558/facebook-russia-trolls-how-to-find-propaganda-2016-election-trump-clinton.] But the industry has had nearly three years since then to solve any technical challenges. Indeed, in 2020, Facebook contacted users who interacted with false information about Covid-19, proving the feasibility of the practice. [footnote 9: "New alert messages will appear in the Newsfeeds of people who have liked, reacted to or commented on known [Covid-19] misinformation, connecting people to WHO myth-busting." Chloe Colliver and Jennie King, The First 100 Days: Coronavirus and Crisis Management on Social Media Platforms, Institute for Strategic Dialogue, 2020, 35, https://www.isdglobal.org/isd-publications/the-first-100-days/; and Billy Perrigo, "Facebook Is Notifying Users Who Have Shared Coronavirus Misinformation. Could It Do the Same for Politics?" Time, April 16, 2020, https://time.com/5822372/facebook-coronavirus-misinformation/.] Social media companies should not operate a service that feeds disinformation to voters without ensuring they can provide those users with a corrective remedy.
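
Mechanically, the corrective step requires only an exposure log and a messaging channel. The sketch below assumes a simple mapping from post IDs to exposed users; the log format, message text, and send hook are all illustrative assumptions about one possible design.

```python
# Minimal sketch: once a post is confirmed as voting disinformation, find
# the users who saw it and send each a corrective notice.
EXPOSURE_LOG = {
    # post_id -> users who viewed, shared, or reacted to the post
    "post-789": ["user-1", "user-2", "user-3"],
}

CORRECTION = (
    "A post you saw contained false information about voting. "
    "For accurate instructions, contact your election officials: "
    "https://www.nass.org/can-I-vote"
)

def notify_exposed_users(post_id: str, send) -> int:
    """Send the corrective message to every user exposed to post_id."""
    users = EXPOSURE_LOG.get(post_id, [])
    for user_id in users:
        send(user_id, CORRECTION)
    return len(users)

count = notify_exposed_users("post-789", send=lambda u, m: print(u, "->", m))
print(f"notified {count} users")
```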


Recommendations for Federal Action

Congress

Congress should clarify and strengthen prohibitions against voter suppression through disinformation. [footnote 1: Wendy Weiser and Alicia Bannon, eds., Democracy: An Election Agenda for Candidates, Activists, and Legislators, Brennan Center for Justice, 2018, 12–13, https://www.brennancenter.org/our-work/policy-solutions/democracy-election-agenda-candidates-activists-and-legislators.] It can do this by passing the Deceptive Practices and Voter Intimidation Prevention Act. [footnote 2: Deceptive Practices and Voter Intimidation Prevention Act of 2019, H.R. 3281, 116th Cong. (2019), included in the For the People Act of 2019, H.R. 1, 116th Cong. (2019).] Although federal law already bans intentional efforts to deprive others of their right to vote, existing laws have not been strong enough or specific enough to deter misconduct. And existing law does not give any federal authority the mandate to investigate deceptive practices and provide voters with corrected information. [footnote 3: Wendy Weiser et al., The Case for H.R. 1, Brennan Center for Justice, 2020, 6, https://www.brennancenter.org/our-work/policy-solutions/case-hr1.]

The Deceptive Practices and Voter Intimidation Prevention Act would clearly prohibit attempts to block people from voting or registering to vote, including by making false or misleading statements. It would impose criminal penalties on offenders and enable citizens to go to court to stop voter deception. And it would require the U.S. attorney general to affirmatively correct disinformation if election officials fail to do so. These improvements would give federal law enforcement agencies and the public more tools to stop bad actors from attacking the right to vote. The House of Representatives passed a version of this legislation in 2019 as part of H.R. 1, also known as the For the People Act, an omnibus democracy reform bill. [footnote 4: For the People Act, H.R. 1, 116th Cong. (2019).]

Federal agencies

Federal agencies like DHS and the FBI are sharing information with local government officials about disinformation threats. The Department of Justice, whose Voting Section enforces federal laws around voting practices, should participate in sharing information and, as in past elections, prosecute individuals and entities that violate the law by spreading disinformation about how to vote. [footnote 5: For example, prosecutors working with Special Counsel Robert Mueller charged companies and individuals involved in the Russian Internet Research Agency's attempts to interfere with the 2016 election through disinformation with conspiracy to defraud the United States. Mueller, Report on the Investigation into Russian Interference, 9.] Generally, federal agencies that discover disinformation that might block people from voting should take the following steps:

  • alert local election officials, media, and community groups
  • take measures to ensure that correct information is publicized in a manner that reaches the disinformation targets
  • refer the matter to the appropriate agency to investigate deceptive practices for prosecution after the election


Conclusion

There can be no doubt that disinformation designed to suppress the vote will continue to be a threat in the 2020 elections and beyond. State and local election officials are our most important defense. They must plan and prepare now by protecting their ability to communicate corrective information and defending their infrastructure against attacks. Internet companies, too, must make their services as safe as possible for voters and keep them from being easy tools for bad actors to hoodwink the electorate. And state and local officials need federal support to protect access to voting in the face of digital deception.

At the same time, all internet users are responsible for helping to stop the spread of digital disinformation about voting. Users who see a message that seems suspicious or outlandish, or that fits too neatly into a partisan political narrative, should investigate before sharing. It is useful to consider the source: What is the speaker's political agenda, and what other content have they shared? Could the message be satire or a joke? Headlines and tweets are short and often misleading, so it is necessary to read further, looking for specific facts and checking dates to tell whether the story is accurate and current. Finally, one of the most important tools for checking disinformation is "lateral reading," in which the reader sets the message aside and searches for the same information elsewhere, or for other sources that have already determined whether it is a hoax.

Users who see something suspicious, incredible, or shocking online about voting should check with state and local election officials for the authoritative answer. NASS provides contact information for voting officials on its "Can I Vote" resource page. [footnote 1: National Association of Secretaries of State, "Can I Vote."] If the information is false, officials will want to know about it. Members of the public can also share alerts about disinformation in their communities with a call to Election Protection at 1-866-OUR-VOTE. Election Protection volunteers will record incidents, look for patterns, and help pass information to internet companies and officials. Once they have confirmed the correct information, social media users can help by sharing it in their own networks without repeating the falsehood.

Amid a global pandemic, foreign interference, partisan dirty tricks by domestic political operatives, and digital tools that let bad actors disseminate messages widely and instantly, the threat from disinformation is greater than ever before. But with awareness of the threat and time to prepare, the government, internet companies, and the public can work together to protect everyone's ability to vote in 2020.
