Analysis

New Evidence Shows How Russia’s Election Interference Has Gotten More Brazen

The Kremlin-linked operation behind 2016 election meddling is using similar tactics for 2020, plus some new ones.

March 5, 2020
Image credit: Yuichiro Chino

Intelligence officials have reportedly found that Russia is interfering in the 2020 elections to try to support President Trump’s reelection, while also meddling in the Democratic primaries to help Sen. Bernie Sanders’ campaign. The reports have not revealed details about what actions Russia is taking or their scope, but my analysis of social media activity exposes some examples.

I found that social media accounts linked to the Internet Research Agency (IRA), the Kremlin-linked company behind an influence campaign that targeted the 2016 elections, have indeed already begun their digital campaign to interfere in the 2020 presidential election. And their tactics are getting even more brazen, as a sample of new posts shows.

In September 2019, just a few months ahead of the Democratic primaries, I noticed some posts on Instagram that appeared to use strategies and tactics very similar to those of the IRA that I observed in my research on Russian interference in the 2016 elections on social media. A few weeks later, Facebook announced that it had taken down about 75,000 posts from IRA-linked accounts: one account on Facebook and 50 on Instagram.

My team at Project DATA (Digital Ad Tracking & Analysis) happened to capture some of these posts on Instagram before Facebook removed them. We identified 32 accounts that exhibited the attributes of the IRA, and 31 of them were later confirmed as IRA-linked accounts by Graphika, a social media analysis firm commissioned by Facebook to examine the accounts.

Some strategies and tactics for election interference were the same as before. Russia’s trolls pretended to be American individuals and organizations, including political groups and candidates. They tried to sow division by targeting both the left and right with posts to foment outrage, fear, and hostility. Much of their activity seemed designed to discourage certain people from voting. And they focused on swing states.

But the IRA’s approach is evolving. Its trolls have gotten better at impersonating candidates and parties, more closely mimicking the logos of official campaigns. They have moved away from creating their own fake advocacy groups to mimicking and appropriating the names of actual American groups. And they’ve increased their use of seemingly nonpolitical content and commercial accounts, hiding their attempts to build networks of influence.

Continuing the same strategies and tactics

Overall, the IRA appears to still employ many of the same strategies and tactics as in 2016. Posing as domestic actors, the IRA targeted both sides of the ideological spectrum with wedge issues. Especially noticeable were same-side candidate attacks (i.e., an “in-kind” attack on a candidate aimed at that candidate’s likely voters), a type of voter suppression strategy designed to break the coalition of one side or the other.

Fraudulent identity

The IRA used generic names or mimicked the names of existing domestic political, grassroots, and community groups, as well as the candidates themselves.

Targeting both sides

We made inferences about targeting based on account names, followings, and followers because, unlike in our studies of the 2016 elections (e.g., Kim et al., 2018), we were unable to collect information about the users who were exposed to the posts this time.
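To illustrate what this kind of inference might look like in practice, here is a minimal sketch that guesses an account’s likely audience from cue words in its own handle and the handles it follows. The account names, cue lists, and decision rule are invented for illustration; this is not the actual Project DATA methodology.

# A minimal, hypothetical sketch (not the Project DATA pipeline): infer an
# account's likely target audience from cue words in its own handle and the
# handles it follows. Cue lists and example handles are invented.

LEFT_CUES = {"resist", "progressive", "feminist", "blacklivesmatter"}
RIGHT_CUES = {"patriot", "maga", "2ndamendment", "bluelivesmatter", "veteran"}

def infer_target_audience(account_name, followed_handles):
    """Guess whether an account targets left- or right-leaning users."""
    text = " ".join([account_name, *followed_handles]).lower()
    left_hits = sum(cue in text for cue in LEFT_CUES)
    right_hits = sum(cue in text for cue in RIGHT_CUES)
    if left_hits > right_hits:
        return "likely targets left-leaning users"
    if right_hits > left_hits:
        return "likely targets right-leaning users"
    return "unclear"

print(infer_target_audience("proud.patriot.usa", ["veterans_first", "gun_rights_now"]))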

The IRA targets both sides of the ideological spectrum to sow division. This strategy is unique to Russian election interference campaigns, making it different from conventional persuasion-oriented propaganda or other foreign countries’ election interference strategies.

The divide between the police and the Black community, for instance, has been a running theme of the IRA’s influence campaigns, as clearly exhibited in IRA activities between 2014 and 2017 through posts pitting “Blue Lives Matter” against “Black Lives Matter.” Furthermore, the IRA exaggerated sharp divisions within the African American community.

In the context of the 2020 elections, I found both endorsement and attack messages for major candidates, parties, and politicians, including the president. Compared with the posts highlighting existing divides around social identities or issues, election-related endorsement and attack posts are more direct, honed, and straightforward.

[Image: social media post. Source: Instagram]

Note on images: All images are from Instagram (September 2019). The posts and identified accounts were later taken down by Facebook for their links to the Internet Research Agency. The identities of non-IRA parties, including domestic political groups’ logos, the faces of ordinary citizens, and comments by non-IRA users, are redacted.

[Image: anti-Trump social media post. Source: Instagram]
[Image: anti-Warren social media post. Source: Instagram]

Wedge issues

The majority of the IRA’s influence campaigns are indeed issue- or interest-based. The IRA is well-versed enough in the history and culture of our politics to exploit sharp political divisions already existing in our society. Targeting those who are likely to be interested in a particular issue but dissatisfied with the current party platforms or policies, the IRA campaigns often create an “us vs. them” discourse, feeding fear to activate or demobilize those who consider an issue personally important.

My analysis of the IRA campaigns between 2014 and 2017 found that race, American nationalism/patriotism, immigration, gun control, and LGBT issues were the top five issues most frequently discussed in the IRA’s campaigns.

Similarly, the issues frequently mentioned in the IRA’s posts in 2019 include racial identity/conflicts, anti-immigration (especially anti-Muslim) sentiment, nationalism/patriotism, sectarianism, and gun rights.

[Image: veterans-themed social media post. Source: Instagram]
[Image: anti-Muslim social media post. Source: Instagram]
[Image: social media post. Source: Instagram]

Targets of those issue- or interest-based posts include veterans, working-class whites in rural areas, and nonwhites, especially African Americans.

One notable trend is the increase in the discussion of feminism at both ends of the spectrum.

[Image: social media post. Source: Instagram]
[Image: social media post. Source: Instagram]

It is also notable that one of the accounts, stop.trump2020, was fully devoted to anti-Trump messaging, similar to the IRA’s organization of the post-2016 election rally “Not My President.”

Geographic targeting

Geographically, these accounts specifically targeted battleground states, including Michigan, Wisconsin, Florida, Ohio, and Arizona.

Voter suppression

Drawing upon the literature (e.g., Wang 2012), I define voter suppression as a strategy to break the coalition of the opposition. Accordingly, I identify four types of voter suppression messages: election boycott, deception (lying about the time, location, or manner of voting), third-party candidate promotion (e.g., the promotion of Jill Stein targeting likely Hillary Clinton voters), and same-side candidate attack (i.e., an in-kind candidate attack, such as an attack on Clinton targeting likely Clinton voters).

Among the posts we captured in September 2019, I did not yet notice any messages that promoted election boycotts or deception, perhaps because those types of voter suppression campaigns usually occur right before elections, so it was simply too early to observe them.

However, I found other types of voter suppression tactics, such as “third-candidate” promotion (e.g., promotion of Rep. Tulsi Gabbard) and same-side candidate attacks, both targeting likely Democratic supporters.

In particular, the same-side candidate attack tactic that centers on major candidates in the Democratic primaries is very common. For example, trolls targeted liberal feminists with attacks on former Vice President Joe Biden portraying him as engaging in inappropriate touching.

[Image: social media post. Source: Instagram]

In another example, the IRA targeted African Americans with heavy attacks on Sen. Kamala Harris.

[Image: social media post. Source: Instagram]

Do the IRA-linked groups prefer President Trump and Bernie Sanders?

Recent reports have indicated that Russia is interfering in the 2020 elections in support of President Trump’s reelection and Sanders’s bid for the Democratic presidential nomination.

In 2016, my analysis showed that while the IRA’s voter suppression campaigns on social media clearly targeted likely Clinton voters, especially nonwhite voters, no single voter suppression message targeted likely Trump voters.

As to whether Russia-linked groups are trying to aid Trump or Sanders over other candidates in 2020, this analysis alone unfortunately cannot yet provide a definitive answer.

Note that, unlike my previous studies, which examined either the full corpus of digital campaigns exposed to a representative sample of the U.S. voting-age population (87 million ads exposed to nearly 17,000 individuals) or the entire body of IRA posts on social media over three years, this analysis is limited to an anecdotal data collection at an earlier stage of the 2020 elections. While I found that, as in 2016, the promotion of Trump’s agenda was prevalent across these accounts, I also found an anti-Trump account and anti-Trump messages targeting liberal voters.

In a similar vein, while I found a fake Sanders campaign account promoting Sanders, it is still premature to conclude that Russia is helping him at this point. For example, United Muslims of America, one of the IRA groups active in the 2016 election, appeared to promote a pro-Clinton agenda early on, but it later turned into one of the most anti-Clinton groups. More systematic analysis examining the full scope of IRA activities around the 2020 elections is required.

However, it is very clear that as of September 2019, the IRA-linked groups had already begun a systematic campaign operation to influence the 2020 elections on Facebook and Instagram. This includes targeting liberal voters with attack messages on major candidates such as Biden and Sen. Elizabeth Warren.

The evolution of Russian tactics

Despite tech platforms’ implementation of transparency measures, it appears that the IRA tactics aimed at the 2020 elections have become even more brazen than those from 2016.

Fraudulent identity

In 2016, the IRA created “shell groups” mimicking grassroots advocacy groups and, in some cases, impersonating candidates. Those fakes were relatively easy to detect, as an examination often revealed that the shell groups existed solely on Facebook Pages or external websites. They often used their own invented logos, landing page addresses, and the like, even when they tried to mimic existing domestic actors.

IRA attempts to influence the 2020 election appear to have improved their mimicry, however. In the newer IRA posts, I saw fake accounts pretending to be a Democratic candidate or an official campaign committee, with only subtle changes in the names or landing page addresses that are harder to notice. For example, the IRA mimicked the official account of the Bernie Sanders campaign, “bernie2020,” by using similar names like “bernie.2020__”.
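As a rough illustration of how such lookalike handles might be flagged, the sketch below compares a handle against a list of official campaign accounts after stripping the dots, underscores, and dashes that impostors tend to add. The handle list and similarity threshold are assumptions for illustration, not part of this analysis or any platform’s actual detection system.

# A minimal sketch (illustrative assumptions only): flag handles that closely
# resemble known official campaign accounts once separators are stripped.
from difflib import SequenceMatcher
import re

OFFICIAL_HANDLES = ["bernie2020", "teamwarren", "joebiden"]  # assumed examples

def normalize(handle):
    """Lowercase and drop the dots, underscores, and dashes impostors add."""
    return re.sub(r"[._\-]", "", handle.lower())

def lookalike_officials(handle, threshold=0.85):
    """Return official handles this handle resembles without matching exactly."""
    norm = normalize(handle)
    hits = []
    for official in OFFICIAL_HANDLES:
        if handle.lower() == official:
            continue  # the real account, not an impostor
        if SequenceMatcher(None, norm, normalize(official)).ratio() >= threshold:
            hits.append(official)
    return hits

print(lookalike_officials("bernie.2020__"))  # -> ['bernie2020']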

Use of domestic nonprofits’ identities

Among the IRA posts I reviewed that touched on the 2020 election, many used the identities of legitimate, relatively popular nonprofits, political action committees (PACs), or grassroots organizations, even in their original posts (not reposts from those groups).

Because I did not historically cross-compare all of the IRA’s posts with those of domestic groups, at this point it is unclear whether the IRA is simply stealing names, logos, and materials already used by legitimate organizations, or whether unwitting collaboration occurred between those legitimate organizations and the IRA’s shell groups.

However, it is worth noting that in 2016, even when the tech platforms were not imposing transparency measures, the IRA never used the exact logo or name of an existing, legitimate, and active domestic group, even as it mimicked those groups’ activities.

Tech platforms’ policies against political campaigns by foreign actors (e.g., Facebook Pages need to verify their U.S.-based identity) might have pushed Russian operations to adapt and evolve over time. By using domestic political groups’ identities and materials, foreign actors could more easily bypass tech platforms’ enforcement. The lack of severe punishment for the 2016 election interference also might have reinforced such illicit behavior.

This tactic works favorably overall for IRA election interference strategies that exploit existing sharp political divides in our society, as it boosts the credibility of messages and helps amplify them among the members and supporters of the domestic groups. However, it also poses a great many challenges to investigators as well as tech platforms, because it makes “foreign” election interference and coordination extremely difficult to detect.

Use of nonpolitical, commercial, domestic accounts and materials

Similarly, I noticed an apparent increase in the use of nonpolitical content and seemingly apolitical or commercial materials. While this tactic was utilized in 2016, especially to build audiences and support bases at an earlier stage of the IRA’s influence campaigns, it was not as common as it is now. This tactic conceals the political nature and large scope of influence campaigns and their coordinated networks, disguising the campaigns’ true purpose.

What should we do?

The potential adaptation and evolution of foreign election interference tactics pose even more challenges to those who care to protect our citizens and election systems. Ahead of the 2020 elections, digital political advertising has become a hot potato. Twitter withdrew from selling political advertising altogether, and Google decided to limit candidate campaigns’ microtargeting options. Facebook has announced no major change in political ad policies, although it has made its transparency tools easier to use.

These recent developments have made some at both ends of the political spectrum unhappy. Yet no laws have been enacted to promote election integrity on digital platforms. Three years after Russia’s interference in the 2016 elections came to light, we are still debating what we should do to prevent malicious actors’ disinformation campaigns from targeting our elections, especially sweeping, systematic foreign election interference campaigns on digital platforms used by ordinary people in their daily lives.

A comprehensive digital campaign policy framework must be considered to ensure the integrity of election campaigns. Unfortunately, just ahead of the elections, no such regulatory framework exists.

Enhance transparency and monitoring mechanisms

Tech platforms must do a better job at transparency, including identity verification and labeling. This would help reduce foreign actors’ production of “shell groups.”

It’s still too hard to distinguish foreign from domestic actors, identify false identities, and detect potential coordination between various groups, especially given the tactics discovered in this analysis, such as the use of domestic groups’ logos and identities in original posts. Platforms like Facebook aggressively protect intellectual property, for example with technology that automatically blocks posts that contain copyrighted material like songs. The platforms should consider applying similar tools to the political arena to block fake accounts from stealing the logos and brand identity of candidates and groups.
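A perceptual-hash comparison is one way such logo matching could work in principle. The sketch below is my own illustration, analogous in spirit to the fingerprinting used for copyrighted audio; the file names and distance threshold are assumptions, and it relies on the Pillow imaging library rather than any platform’s actual tooling.

# A minimal sketch of perceptual-hash matching for logos. File names and the
# distance threshold are assumptions for illustration only.
from PIL import Image

def average_hash(path, size=8):
    """Downscale to grayscale and encode each pixel as above/below the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical files: a verified campaign logo and a suspect account's avatar.
official = average_hash("official_campaign_logo.png")
suspect = average_hash("suspect_account_avatar.png")
if hamming_distance(official, suspect) <= 5:
    print("Profile image closely matches an official logo; flag account for review")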

Enhance the Foreign Agents Registration Act (FARA)

The law requires agents representing the interests of foreign powers in a political or “quasi-political” capacity to disclose their relationship with the foreign power. While FARA focuses on lobbying activities, it should acknowledge the changes in the nature of foreign influence in the digital era. Stronger regulation under FARA, including enhanced monitoring mechanisms, could help.

Enact rules for digital political campaigns

While the Federal Election Commission provides the rules and policies for political committees, its current policies do not adequately address digital political campaigns in general. For instance, a clear and consistent definition of political advertising must be provided across tech platforms to make transparency measures more effective. Likewise, comprehensive, cross-platform archives of political campaigns, including issue ads and target information, would help audiences, law enforcement, and researchers understand what’s happening in digital political advertising.

Without safeguards like these, Russia and other foreign governments will continue their efforts to manipulate American elections and undermine our democracy.

Young Mie Kim is a professor at the University of Wisconsin-Madison and a Brennan Center for Justice affiliated scholar.