
Rating the Privacy Protections of State Covid-19 Tracking Apps

When apps use Bluetooth or GPS to notify users about potential exposures to coronavirus, the data that may be collected along the way is a concern.

Some 24 states are devel­op­ing or have released smart­phone apps that can inform people that they may have been exposed to someone with Covid-19. This tech­no­logy could be a valu­able supple­ment to tradi­tional contact tracers, who reach out person­ally to people who have been in contact with a confirmed Covid-19 case to notify them of the expos­ure and to coun­sel them on next steps and resources.

It remains to be seen how much these apps will contrib­ute to combat­ing the pandemic in the United States — the results in other coun­tries have been mixed so far. If states do adopt and promote these tools, it is imper­at­ive they prior­it­ize user privacy and provide maximum trans­par­ency. Not only will this protect users, but it will encour­age adop­tion, which public health experts agree is crit­ical to their effect­ive­ness. Indeed, while none of the apps in the United States has been widely embraced by the public to date, states like Utah and North Dakota with less privacy protect­ive models have faced addi­tional diffi­culties obtain­ing public support and buy-in, likely due in part to privacy concerns.

App Features

Covid-19 apps gener­ally perform one or more of the follow­ing func­tions: expos­ure noti­fic­a­tion, symp­tom track­ing, or loca­tion logging.

Expos­ure noti­fic­a­tion alerts users if they have been within a certain distance of someone confirmed posit­ive for Covid-19, usually within the preced­ing 14 days.

Symp­tom track­ing allows users to peri­od­ic­ally submit inform­a­tion about their health status to public health author­it­ies.

Loca­tion diar­ies are designed to record several weeks of a user’s loca­tion inform­a­tion, typic­ally obtained from their cell phone’s GPS. Depend­ing on the app, if a user later tests posit­ive for Covid-19, their loca­tion diary may be shared with contact tracers directly or be employed by the user during inter­views with contact tracers to remind the user of his or her past move­ments.

Some apps contain addi­tional features, such as convey­ing inform­a­tion from public health author­it­ies about trans­mis­sion rates or test­ing sites.

The privacy protec­tions of these apps vary widely, as does the level of trans­par­ency provided by differ­ent states. Apps built on the Bluetooth Expos­ure Noti­fic­a­tion plat­form created by Google and Apple often include more robust privacy protec­tions, in part because of the system’s inher­ent design. Apps that rely on other plat­forms are signi­fic­antly less protect­ive, often featur­ing GPS loca­tion track­ing and fewer restric­tions on data use and reten­tion.

Certain state apps, such as those in Arizona and Alabama, have clear privacy policies avail­able online, through state public health agen­cies or the app’s own website. For others — such as Wash­ing­ton State, which will be pilot­ing the new Expos­ure Noti­fic­a­tion Express model released by Google and Apple — it is diffi­cult to find any reli­able inform­a­tion. And even apps with clear privacy policies frequently do not disclose what happens to data that users choose to share with state public health agen­cies — for example, whether the inform­a­tion might be shared with other states or with law enforce­ment or immig­ra­tion author­it­ies.

Google/Apple Expos­ure Noti­fic­a­tion Systems

The Google/Apple Expos­ure Noti­fic­a­tion system, which encom­passes both an original version released in May and a new “Express” model launched in Septem­ber, is the most widely used app plat­form. It features the strongest privacy protec­tions for users given its sole reli­ance on Bluetooth track­ing, limited personal data collec­tion, and limited data reten­tion and shar­ing policies. Some of these protec­tions, such as user anonym­ity and the preven­tion of GPS loca­tion track­ing, are built directly into the Google/Apple API and cannot be changed, ensur­ing that these safe­guards are preserved across the states using the plat­form.

The major­ity of state apps employ the original Google/Apple Bluetooth Expos­ure Noti­fic­a­tion (GAEN) plat­form. This decent­ral­ized system works by allow­ing nearby iOS and Android devices to exchange random­ized, anonym­ous Bluetooth iden­ti­fi­ers, which are stored locally on indi­vidual devices. If a user tests posit­ive for Covid-19, their local public health agency will provide them with a code they can upload to the app, which will mark their Bluetooth iden­ti­fi­ers as belong­ing to someone confirmed posit­ive for Covid-19. This will in turn result in anonym­ous noti­fic­a­tions being sent to the devices of those they may have exposed.
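
The decentralized flow described above can be illustrated with a minimal sketch. The identifier size, class structure, and function names here are illustrative assumptions, not the actual Google/Apple protocol (which also rotates identifiers on a schedule and derives them from daily keys); the sketch only shows the core idea that matching happens on-device, so the server never learns who was near whom.

```python
import secrets

ID_BYTES = 16  # GAEN rolling identifiers are 16 bytes; rotation details omitted here

def new_rolling_identifier() -> bytes:
    """Generate a random, anonymous identifier to broadcast over Bluetooth."""
    return secrets.token_bytes(ID_BYTES)

class Device:
    """A phone participating in decentralized exposure notification."""

    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers observed from nearby phones

    def broadcast(self) -> bytes:
        rid = new_rolling_identifier()
        self.own_ids.append(rid)
        return rid

    def observe(self, rid: bytes):
        # Stored locally only; nothing is uploaded unless the *other* user
        # tests positive and publishes their identifiers with a health-authority code.
        self.heard_ids.add(rid)

    def check_exposure(self, published_positive_ids: set) -> bool:
        # Matching happens on-device against the published list.
        return bool(self.heard_ids & published_positive_ids)

alice, bob = Device(), Device()
bob.observe(alice.broadcast())       # phones exchange identifiers while nearby
positive_ids = set(alice.own_ids)    # Alice tests positive and publishes her identifiers
print(bob.check_exposure(positive_ids))  # True — Bob's phone flags the exposure locally
```

Note that the server's role in this design is limited to distributing the published list of positive identifiers; it never sees the contact graph.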

Initially, Google and Apple required states using their platform to develop their own unique app interfaces. At least 14 states — Alabama, Arizona, Delaware, Hawaii, Nevada, New Jersey, New York, North Carolina, Michigan, North Dakota, Oklahoma, Pennsylvania, Virginia, and Wyoming — have released or are developing apps utilizing the GAEN system.

In September, the companies announced they would be releasing an Exposure Notifications Express (ENE) system to make it easier for public health authorities to use the platform without incurring the costs of building, maintaining, or promoting an app. Users in states or regions that enable the express system will receive an automatic prompt on their cell phones alerting them that the tool is available. Apple users can tap through automatically to participate, while Android users will be instructed to download an app from the Google Play store. Six states — California, Colorado, Connecticut, Maryland, Oregon, and Washington — have committed to testing the app-less ENE system. The District of Columbia released its ENE system, called DC CAN, in late October. Nevada and Virginia may also test the ENE model, according to some reports, despite having released their own apps.

The apps on the GAEN plat­form are among the most privacy protect­ive in the United States. They gener­ally share many of the same privacy-preserving char­ac­ter­ist­ics, includ­ing that they:

  • Are premised on informed, volun­tary, opt-in consent from users
  • Rely solely on Bluetooth, which is designed to determ­ine prox­im­ity inform­a­tion, as opposed to GPS or other more sens­it­ive loca­tion data
  • Store Bluetooth data locally on user phones in the form of anonym­ous iden­ti­fi­ers
  • Do not require users to disclose person­ally iden­ti­fi­able inform­a­tion, such as their name, date of birth, address, or email
  • Give users who test posit­ive for Covid-19 the oppor­tun­ity to upload to the app a code they receive from their public health author­it­ies, thereby trig­ger­ing noti­fic­a­tions to possibly exposed indi­vidu­als
  • Auto­mat­ic­ally delete locally stored user data if not used, usually within 14 days from the date of collec­tion
  • Permit users, at any time, to delete their locally stored data or to revoke their parti­cip­a­tion in the program by delet­ing the app from their phone.
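
The automatic-deletion behavior in the list above can be sketched briefly. The storage layout and names are assumptions for illustration; the only detail taken from the text is the 14-day retention window.

```python
from datetime import date, timedelta

RETENTION = timedelta(days=14)  # identifiers older than this are discarded

def purge_expired(store: dict, today: date) -> dict:
    """Keep only identifiers collected within the retention window."""
    return {rid: seen for rid, seen in store.items() if today - seen < RETENTION}

store = {
    b"id-old": date(2020, 10, 1),   # 24 days old on Oct. 25 — expired
    b"id-new": date(2020, 10, 20),  # 5 days old — retained
}
print(sorted(purge_expired(store, date(2020, 10, 25))))  # [b'id-new']
```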

These privacy features, some of which are built into the GAEN model itself, provide import­ant baseline protec­tions for users in the event of a hack or data breach by limit­ing the type of data collec­ted and mandat­ing auto­matic dele­tion of data after a certain time period. Moreover, users retain at least some control over their personal data in that they can choose whether to upload a diagnosis code to the app, and are also able to delete their locally stored data or decide to discon­tinue parti­cip­a­tion in the program at any time.

Despite these similarities and the back-end technical uniformity of the GAEN model's Bluetooth notification system, states have discretion in shaping how users interact with their apps. Because supplemental features can create additional privacy vulnerabilities, variations remain in the actual privacy protections enjoyed by users — particularly in relation to the breadth of data collected.

Delaware, New York, New Jersey, and Pennsylvania are among the states that have added optional demo­graphic and symp­tom ques­tions to their GAEN apps, giving users the oppor­tun­ity to enter inform­a­tion about their county, gender, age, and race, as well as to answer ques­tions about possible symp­toms. This might make it possible for state offi­cials to identify or draw infer­ences about users, partic­u­larly in places where indi­vidu­als are members of under­rep­res­en­ted demo­graphic groups. For example, the New Jersey symp­tom tracker, called “COVID Check In,” asks users to identify their county, age range, race, ethni­city, gender, and sexual orient­a­tion. It is quite possible that public health offi­cials might be able to draw infer­ences about the iden­tity and char­ac­ter­ist­ics of a partic­u­lar user who reports identi­fy­ing as Asian Amer­ican living in Ocean County, where the popu­la­tion was less than 2 percent Asian accord­ing to the 2010 census. Request­ing addi­tional personal inform­a­tion under­mines a cent­ral tenet of the GAEN plat­form — user anonym­ity — and could sow distrust in the app, partic­u­larly if the inform­a­tion is shared with law enforce­ment or other third parties.

States also control key features, such as defining what constitutes an exposure event and determining what information is transmitted to users via exposure notifications. Most states define an exposure event as occurring when two devices are within six feet for 15 minutes or longer, though some, like New York, use a shorter 10-minute window. Some states, like Arizona, have chosen to personalize health recommendations and exposure notifications to match a user's apparent risk of contracting Covid-19 based on a number of factors — including the duration of their exposure, their distance from an infected person, and at what point during the other person's infection they were exposed. Thus, the app might recommend that someone stay home for a period shorter than 14 days if their exposure was far in the past.
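
The state-configurable thresholds can be expressed as a small function. The six-foot/15-minute and 10-minute figures come from the text; the data model and parameter names are illustrative assumptions, and real apps estimate distance indirectly from Bluetooth signal strength rather than measuring it.

```python
from dataclasses import dataclass

@dataclass
class ProximityWindow:
    distance_feet: float     # estimated from Bluetooth signal attenuation
    duration_minutes: float  # continuous time spent within that distance

def is_exposure_event(window: ProximityWindow,
                      max_distance_feet: float = 6.0,
                      min_duration_minutes: float = 15.0) -> bool:
    """Default thresholds model the common definition (six feet, 15 minutes);
    a New York-style app would pass min_duration_minutes=10.0 instead."""
    return (window.distance_feet <= max_distance_feet
            and window.duration_minutes >= min_duration_minutes)

print(is_exposure_event(ProximityWindow(5.0, 20.0)))  # True under the default rule
print(is_exposure_event(ProximityWindow(5.0, 12.0)))  # False: under 15 minutes
print(is_exposure_event(ProximityWindow(5.0, 12.0),
                        min_duration_minutes=10.0))   # True under a 10-minute rule
```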

There is a poten­tial privacy risk arising from the metadata gener­ated by most apps built on the GAEN plat­form. The default setting is gener­ally that users consent to share metadata, such as inform­a­tion about app install­a­tion, dele­tion, and crash­ing, with state serv­ers, presum­ably to assist with tech­nical support. To the extent this results in frequent trans­mis­sions between user devices and the serv­ers, the metadata might reveal a user’s general loca­tion in the form of a phone’s IP address. It could also conceiv­ably be used to link anonym­ous Bluetooth iden­ti­fi­ers with specific indi­vidu­als, result­ing in the iden­ti­fic­a­tion of users and their infec­tion status without their consent.

Recog­niz­ing this, a few states go further in provid­ing addi­tional, affirm­at­ive privacy protec­tions for users beyond those provided for within the GAEN frame­work. For example, Pennsylvania and Nevada are among the few states that expli­citly provide in their privacy policies that they either will not retain user IP addresses or will ensure that IP address inform­a­tion cannot be asso­ci­ated with indi­vidual users.

Most of the apps’ privacy policies do not address what happens to user data once it is voluntarily shared with state health authorities. In July, the New York State Legislature passed a contact tracing privacy bill that would limit information sharing between public health agencies and police and immigration enforcement. The bill is awaiting the governor’s signature.

Although Apple and Google’s Bluetooth expos­ure noti­fic­a­tion systems are recog­nized to be more privacy protect­ive than the other app plat­forms being used in the United States, it is clear that there remains signi­fic­ant vari­ation in the actual protec­tions enjoyed by users. This is partic­u­larly true with respect to the minim­iz­a­tion of data collec­tion and what happens to app data after a user decides to share it with public health offi­cials.

Other Models

While the Google/Apple model is most preval­ent, states and muni­cipal author­it­ies have deployed a hand­ful of other Covid-19 apps. These, too, are premised on volun­tary adop­tion and opt-in data shar­ing, but they are substan­tially less privacy protect­ive, mainly because they collect sens­it­ive loca­tion data.

Six states currently use or are preparing to launch Covid-19 apps that do not use the Google/Apple technology: Hawaii, North Dakota, Rhode Island, South Dakota, Utah, and Wyoming. Some of these apps work in conjunction with apps that do use the Google/Apple platform but have additional features, like location tracking, that are not permitted on that platform. Hawaii, for instance, has developed a location diary app that tracks user location and symptoms. The app, AlohaSafe Story, runs off the open-source, MIT-led platform Safe Paths. In some states where there is no statewide app, including Georgia and Florida, county health departments are launching their own Covid-19 apps.

Most of the state apps built on models other than the GAEN plat­form have GPS or cell tower loca­tion track­ing func­tion­al­it­ies. These are some­times imple­men­ted in conjunc­tion with Bluetooth expos­ure noti­fic­a­tion. For example, North Dakota, South Dakota, and Wyom­ing all use the same loca­tion diary app, Care19 Diary. When users opt into loca­tion track­ing, these apps begin collect­ing loca­tion data every few minutes. Then, if a user tests posit­ive for Covid-19, they have the option to share the data with their local health author­it­ies.

As the Brennan Center has noted before, cell tower and GPS location tracking pose greater privacy risks to users than Bluetooth proximity tracking, which only records closeness to others. As Justice Sonia Sotomayor has cautioned, “GPS monitoring generates a precise, comprehensive record of a person’s public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations.” Detailed, personal information can easily be extracted from location data, and even anonymous location data can often be used to re-identify an individual, given the specificity of information about movements that GPS data provides. Moreover, beyond the privacy implications, location tracking has reduced utility for exposure notification because it cannot establish, for example, when two individuals are six feet apart.

In some cases, concern around loca­tion track­ing in Covid-19 apps has led to signi­fic­antly lowered usage rates. Utah’s Healthy Together app, which launched in late April, initially used both GPS loca­tion track­ing and Bluetooth tech­no­logy to track contact between users. However, state lead­ers soon with­drew the GPS func­tion after only some 200 users opted to share loca­tion data with public health offi­cials. The state’s head epidemi­olo­gist attrib­uted the low adop­tion rate to the unpop­ular­ity of shar­ing loca­tion data. Even as the state contin­ues to use Healthy Together, it is also accept­ing applic­a­tions for new compan­ies to develop a Bluetooth prox­im­ity app for the state.

Data Shar­ing

While all current statewide Covid-19 apps are opt-in and only share data with public offi­cials with user consent, they are often vague about how inform­a­tion collec­ted for public health purposes will be shared.

For example, Rhode Island’s CRUSH COVID RI privacy policy provides ambiguous guidance regarding when the state may share personal information with third parties, saying: “We may also be required to disclose your information to third parties, such as where we are legally required to disclose your information, including, but not limited to, pursuant to a valid subpoena, order, search warrant or other law enforcement request.” This language leaves the door open for personal information to be shared with law enforcement or immigration authorities.

The Care19 Diary app used by North Dakota, South Dakota, and Wyom­ing faced public back­lash after it was discovered to be viol­at­ing its own privacy policy by shar­ing unique user iden­ti­fi­ers with third parties. When apps permit inform­a­tion shar­ing with third parties other than public health author­it­ies, the prac­tice may harm user trust and adop­tion rates. It also facil­it­ates the use of sens­it­ive Covid-19 data in myriad ways without account­ab­il­ity or mean­ing­ful user consent, includ­ing to possibly target advert­ising or to deny someone insur­ance. While Care19 allegedly no longer shares unique user iden­ti­fi­ers with third parties, the incid­ent high­lights how over­sights in tech­nical infra­struc­ture can pose privacy risks to users, partic­u­larly when detailed loca­tion data is collec­ted.

Private Apps

Some cities and counties have announced part­ner­ships with Covid-19 apps that are privately owned and managed, though these apps have yet to be embraced by states. These apps exhibit substan­tial privacy risks, which cautions against their adop­tion and use.

The best-known app in this category is SafePass, which was developed by Citizen, the company behind a popular crime reporting app. Citizen launched SafePass in August, advertising the app to its active user base of over 5 million. SafePass is gaining significant traction among local governments, in part because the Citizen app has a large, pre-existing user base and Citizen had already tested SafePass in a pilot with over 700,000 users prior to the app’s official launch. The health departments of Los Angeles County and San Joaquin County in California both announced partnerships with Citizen SafePass in the past two months, encouraging adoption among the two counties’ combined population of more than 11 million residents. By comparison with the pilot’s reach, Virginia’s Covid-19 app was downloaded fewer than half a million times in the month after its release, and New York and New Jersey’s apps were downloaded 630,000 and 250,000 times respectively in the first weeks after their launch.

But Safe­Pass poses consid­er­able risks. It relies on sens­it­ive loca­tion data, increases the chances of user reiden­ti­fic­a­tion by link­ing Safe­Pass accounts to accounts on the Citizen crime alert app, and does not adequately safe­guard users’ health inform­a­tion.

Safe­Pass uses both Bluetooth prox­im­ity and GPS loca­tion track­ing for its expos­ure noti­fic­a­tion system and specifies that GPS loca­tion track­ing must be enabled for the app to func­tion. Further­more, the Safe­Pass privacy policy notes that currently only users with Citizen accounts can use Safe­Pass because it relies on the same GPS loca­tion track­ing soft­ware. As a result, Citizen links Citizen and Safe­Pass user accounts, giving the company access to the inform­a­tion in the main Citizen user data­base and linked personal health inform­a­tion. This increases the possib­il­ity of reiden­ti­fic­a­tion of Safe­Pass users and could poten­tially make it easier for Citizen or third parties to view or request Safe­Pass user data through the Citizen data­bases.

With regard to data retention and data sharing with third parties, Citizen SafePass promises to delete only location data, Bluetooth data, and identity verification data. All other personal information (including contact information, health information, and user activity on the app) can be kept “for the period necessary to fulfill the purposes outlined in this policy and to support other app features you might use.” Broad data retention policies increase the likelihood of a data breach, like the Securus Technologies leak that exposed 70 million inmate phone calls in 2015.

In addi­tion, while Citizen assures users that the company will require user consent to share personal inform­a­tion with law enforce­ment, the company carves out several excep­tions for itself, includ­ing that it may share personal inform­a­tion with law enforce­ment to “protect, invest­ig­ate and deter against fraud­u­lent, harm­ful, unau­thor­ized, uneth­ical or illegal activ­ity.” This could facil­it­ate the shar­ing of inform­a­tion in response to whatever beha­vior the company deems “uneth­ical” — for example, it could disclose to law enforce­ment the data of users who are not prac­ti­cing social distan­cing. Taken together, these policies permit Citizen to collect large amounts of detailed, person­ally iden­ti­fi­able inform­a­tion while main­tain­ing signi­fic­ant discre­tion in how that inform­a­tion is stored and shared.

Well­b­il­ity, an app built by New Hamp­shire company OnSite Medical Services, is another private Covid-19 app with poor privacy protec­tions. Unlike most other apps, Well­b­il­ity requires users to sign up and provide personal inform­a­tion, includ­ing their name and date of birth, to begin to use the app. Well­b­il­ity’s Terms of Use leave the company with signi­fic­ant author­ity over what data is collec­ted, how it is shared, and how long it is retained. The data collec­tion section concludes by requir­ing users to agree to the “extens­ive” collec­tion of loca­tion data, personal inform­a­tion, health inform­a­tion and inter­ac­tions with other users, and it notes that mech­an­isms for collect­ing data “are not limited to means mentioned herein.” This could poten­tially allow Well­b­il­ity to use metadata like IP addresses to track user activ­ity even when users are not using the app. Well­b­il­ity’s data use and shar­ing policies are simil­arly vague and broadly allow the company to disclose inform­a­tion “as needed to oper­ate other related services,” for targeted advert­ising, and to “achieve commer­cial health ventures.” Finally, Well­b­il­ity stores personal data for over a year, unlike most other apps, which delete data after 14 days if unused.

In short, privately owned and managed Covid-19 apps like Safe­Pass and Well­b­il­ity offer the lowest level of privacy protec­tions for users, mean­ing that users are more likely to be vulner­able to data breaches, reiden­ti­fic­a­tion, or data shar­ing for reas­ons users did not consent to. By mixing profit and public health object­ives, private Covid-19 apps expose users to heightened risks in the state expos­ure noti­fic­a­tion ecosys­tem.

Look­ing Forward

State Covid-19 apps are still in their infancy, mean­ing the full extent of their privacy implic­a­tions is unknown. However, there are several key ques­tions that most states have yet to suffi­ciently resolve, regard­less of which model they have adop­ted.

One of the most press­ing is the privacy protec­tions in place for data that users choose to share with their state depart­ment of health. In most cases it is unclear whether this data can or will be shared with other agen­cies, other states, or law enforce­ment. Another open ques­tion is the extent to which state apps can success­fully minim­ize the collec­tion of data. For the GAEN apps, this gener­ally involves redu­cing or elim­in­at­ing the collec­tion of metadata, such as IP addresses. Apps built on another plat­form present seri­ous privacy prob­lems because of their reli­ance on highly sens­it­ive time-stamped GPS data, which is not needed to notify people of expos­ure to Covid-19, since Bluetooth prox­im­ity data suffices.

In the coming months, states should continue to crit­ic­ally eval­u­ate their Covid-19 apps and share key data points to better inform public discus­sion about the apps’ privacy risks and effect­ive­ness. These efforts should include:

  • Prior­it­iz­ing trans­par­ency, includ­ing provid­ing inform­a­tion about app developers and their contracts with the state, as well as disclos­ing whether data collec­ted by the app will be used for purposes other than expos­ure noti­fic­a­tion or shared with entit­ies other than public health offi­cials
  • Main­tain­ing and improv­ing privacy safe­guards, includ­ing by minim­iz­ing data collec­tion whenever possible
  • Releas­ing inform­a­tion about the tech­nical aspects of the app, includ­ing how the app will determ­ine whether there has been an expos­ure event and what noti­fic­a­tions users will receive
  • Regularly auditing the app’s uptake and effectiveness (for example, by evaluating its adoption and the rate of accurate exposure notifications) to justify continued use

It is incum­bent on state health depart­ments to set forth clear privacy policies in order to foster public trust. While Covid-19 apps will neces­sar­ily involve some tradeoff between public health and user confid­en­ti­al­ity, it is import­ant that states work to minim­ize privacy harms to encour­age the util­iz­a­tion of their apps and protect their resid­ents. Recog­niz­ing this, states must be vigil­ant in audit­ing their apps and prior­it­iz­ing trans­par­ency and public account­ab­il­ity.