Rating the Privacy Protections of State Covid-19 Tracking Apps

When apps use Bluetooth or GPS to notify users about potential exposures to coronavirus, the data that may be collected along the way is a concern.

Some 24 states are developing or have released smartphone apps that can inform people that they may have been exposed to someone with Covid-19. This technology could be a valuable supplement to traditional contact tracers, who reach out personally to people who have been in contact with a confirmed Covid-19 case to notify them of the exposure and to counsel them on next steps and resources.

It remains to be seen how much these apps will contribute to combating the pandemic in the United States — the results in other countries have been mixed so far. If states do adopt and promote these tools, it is imperative that they prioritize user privacy and provide maximum transparency. Not only will this protect users, but it will encourage adoption, which public health experts agree is critical to the apps’ effectiveness. Indeed, while none of the apps in the United States has been widely embraced by the public to date, states like Utah and North Dakota, whose models are less privacy protective, have faced additional difficulty securing public support and buy-in, likely due in part to privacy concerns.

App Features

Covid-19 apps generally perform one or more of the following functions: exposure notification, symptom tracking, or location logging.

Exposure notification alerts users if they have been within a certain distance of someone confirmed positive for Covid-19, usually within the preceding 14 days.

Symptom tracking allows users to periodically submit information about their health status to public health authorities.

Location diaries are designed to record several weeks of a user’s location information, typically obtained from their cell phone’s GPS. Depending on the app, if a user later tests positive for Covid-19, their location diary may be shared with contact tracers directly or be employed by the user during interviews with contact tracers to remind them of their past movements.

Some apps contain additional features, such as conveying information from public health authorities about transmission rates or testing sites.

The privacy protections of these apps vary widely, as does the level of transparency provided by different states. Apps built on the Bluetooth Exposure Notification platform created by Google and Apple often include more robust privacy protections, in part because of the system’s inherent design. Apps that rely on other platforms are significantly less protective, often featuring GPS location tracking and fewer restrictions on data use and retention.

Certain state apps, such as those in Arizona and Alabama, have clear privacy policies available online, through state public health agencies or the app’s own website. For others — such as Washington State, which will be piloting the new Exposure Notification Express model released by Google and Apple — it is difficult to find any reliable information. And even apps with clear privacy policies frequently do not disclose what happens to data that users choose to share with state public health agencies — for example, whether the information might be shared with other states or with law enforcement or immigration authorities.

Google/Apple Exposure Notification Systems

The Google/Apple Exposure Notification system, which encompasses both an original version released in May and a new “Express” model launched in September, is the most widely used app platform. It features the strongest privacy protections for users given its sole reliance on Bluetooth tracking, limited personal data collection, and limited data retention and sharing policies. Some of these protections, such as user anonymity and the prevention of GPS location tracking, are built directly into the Google/Apple API and cannot be changed, ensuring that these safeguards are preserved across the states using the platform.

The majority of state apps employ the original Google/Apple Bluetooth Exposure Notification (GAEN) platform. This decentralized system works by allowing nearby iOS and Android devices to exchange randomized, anonymous Bluetooth identifiers, which are stored locally on individual devices. If a user tests positive for Covid-19, their local public health agency will provide them with a code they can upload to the app, which will mark their Bluetooth identifiers as belonging to someone confirmed positive for Covid-19. This will in turn result in anonymous notifications being sent to the devices of those they may have exposed.
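To make the decentralized design concrete, the sketch below shows the flow described above in simplified Python: each device broadcasts random identifiers, keeps a local log of identifiers it observes, and checks that log against identifiers published after a verified positive test. The class and method names are illustrative assumptions; the actual GAEN protocol derives rotating identifiers cryptographically from temporary exposure keys, which this sketch does not attempt to reproduce.

```python
import os
import time


class Device:
    """Simplified model of a phone participating in decentralized exposure notification."""

    def __init__(self):
        self.own_identifiers = []       # random identifiers this device has broadcast, with timestamps
        self.observed_identifiers = []  # identifiers heard from nearby devices, stored only on this phone

    def new_identifier(self):
        """Generate and broadcast a fresh random identifier; it reveals nothing about the user."""
        rpi = os.urandom(16)
        self.own_identifiers.append((rpi, time.time()))
        return rpi

    def record_nearby(self, identifier):
        """Log an identifier broadcast by a nearby phone, locally only."""
        self.observed_identifiers.append((identifier, time.time()))

    def keys_to_publish(self):
        """After a verified positive test, the user may choose to upload their own identifiers."""
        return [rpi for rpi, _ in self.own_identifiers]

    def check_exposure(self, published_keys):
        """Each phone compares the published list against its local log; the log never leaves the device."""
        published = set(published_keys)
        return any(rpi in published for rpi, _ in self.observed_identifiers)


# Two phones near each other exchange identifiers over Bluetooth.
alice, bob = Device(), Device()
bob.record_nearby(alice.new_identifier())

# Alice tests positive and, using a code from her health authority, publishes her identifiers.
published = alice.keys_to_publish()

# Bob's phone finds a match locally and can show an anonymous exposure notification.
print(bob.check_exposure(published))  # True
```

Because matching happens entirely on each user’s phone, no central server ever learns who was near whom — only the anonymous identifiers of confirmed-positive users are published.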

Initially, Google and Apple required states using their platform to develop their own unique app interfaces. At least 14 states — Alabama, Arizona, Delaware, Hawaii, Nevada, New Jersey, New York, North Carolina, Michigan, North Dakota, Oklahoma, Pennsylvania, Virginia, and Wyoming — have released or are developing apps utilizing the GAEN system.

In September, the companies announced they would be releasing an Exposure Notifications Express (ENE) system to make it easier for public health authorities to use the platform without incurring the costs of building, maintaining, or promoting an app. Users in states or regions that enable the express system will receive an automatic prompt on their cell phones alerting them that the tool is available. Apple users can tap through automatically to participate, while Android users will be instructed to download an app from the Google Play store. Six states — California, Colorado, Connecticut, Maryland, Oregon, and Washington — have committed to testing the app-less ENE system. The District of Columbia released its ENE system, called DC CAN, in late October. Nevada and Virginia may also test the ENE model according to some reports, despite having released their own apps.

The apps on the GAEN platform are among the most privacy protective in the United States. They generally share many of the same privacy-preserving characteristics, including that they:

  • Are premised on informed, voluntary, opt-in consent from users
  • Rely solely on Bluetooth, which is designed to determine proximity information, as opposed to GPS or other more sensitive location data
  • Store Bluetooth data locally on user phones in the form of anonymous identifiers
  • Do not require users to disclose personally identifiable information, such as their name, date of birth, address, or email
  • Give users who test positive for Covid-19 the opportunity to upload to the app a code they receive from their public health authorities, thereby triggering notifications to possibly exposed individuals
  • Automatically delete locally stored user data if not used, usually within 14 days from the date of collection
  • Permit users, at any time, to delete their locally stored data or to revoke their participation in the program by deleting the app from their phone.

These privacy features, some of which are built into the GAEN model itself, provide important baseline protections for users in the event of a hack or data breach by limiting the type of data collected and mandating automatic deletion of data after a certain time period. Moreover, users retain at least some control over their personal data in that they can choose whether to upload a diagnosis code to the app, and are also able to delete their locally stored data or decide to discontinue participation in the program at any time.
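The retention behavior in the list above can be illustrated with a minimal sketch, assuming locally stored identifiers are kept as (identifier, timestamp) pairs and using the 14-day window described above; the function names are illustrative, not drawn from any state’s app.

```python
import time

RETENTION_SECONDS = 14 * 24 * 60 * 60  # locally stored identifiers expire after 14 days


def prune_expired(observed_identifiers, now=None):
    """Drop locally stored (identifier, timestamp) pairs older than the retention window."""
    now = time.time() if now is None else now
    return [(rpi, seen_at) for rpi, seen_at in observed_identifiers
            if now - seen_at < RETENTION_SECONDS]


def delete_all(observed_identifiers):
    """User-initiated deletion: clearing the local log (or uninstalling the app) removes everything."""
    observed_identifiers.clear()


# An identifier observed 20 days ago is pruned; one observed today is kept.
log = [(b"old", time.time() - 20 * 24 * 60 * 60), (b"recent", time.time())]
log = prune_expired(log)
print([rpi for rpi, _ in log])  # [b'recent']
```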

Despite these similarities and the back-end technical uniformity of the GAEN model with respect to its Bluetooth notification system, states have discretion in shaping how users interact with their apps. Because supplemental features can create additional privacy vulnerabilities, variations remain in the actual privacy protections enjoyed by users — particularly in relation to the breadth of data collected.

Delaware, New York, New Jersey, and Pennsylvania are among the states that have added optional demographic and symptom questions to their GAEN apps, giving users the opportunity to enter information about their county, gender, age, and race, as well as to answer questions about possible symptoms. This might make it possible for state officials to identify or draw inferences about users, particularly in places where individuals are members of underrepresented demographic groups. For example, the New Jersey symptom tracker, called “COVID Check In,” asks users to identify their county, age range, race, ethnicity, gender, and sexual orientation. It is quite possible that public health officials might be able to draw inferences about the identity and characteristics of a particular user who reports identifying as Asian American living in Ocean County, where the population was less than 2 percent Asian according to the 2010 census. Requesting additional personal information undermines a central tenet of the GAEN platform — user anonymity — and could sow distrust in the app, particularly if the information is shared with law enforcement or other third parties.

States also control key features, such as defining what constitutes an exposure event and determining what information is transmitted to users via exposure notifications. Most states define an exposure event as occurring when two devices are within six feet for 15 minutes or longer, though some, like New York, use a shorter durational window of 10 minutes. Some states, like Arizona, have chosen to personalize health recommendations and exposure notifications to match someone’s apparent risk of contracting Covid-19 based on a number of factors — including the duration of their exposure, distance from an infected person, and at what point during the other person’s infection they were exposed. Thus, the app might recommend someone stay home for a period shorter than 14 days if their exposure was far in the past.
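These state-level choices amount to configurable thresholds and a simple risk adjustment, as in the sketch below. The 6-foot/15-minute default and the 10-minute variant come from the paragraph above; the personalization function is purely illustrative and does not reflect any state’s actual risk model.

```python
from dataclasses import dataclass


@dataclass
class Encounter:
    distance_feet: float       # estimated from Bluetooth signal strength
    duration_minutes: float    # how long the two devices stayed in range
    days_since_exposure: int   # how long ago the encounter happened


def is_exposure_event(encounter, max_distance_feet=6.0, min_duration_minutes=15.0):
    """Apply a state's thresholds: most use 6 feet / 15 minutes; New York uses a 10-minute window."""
    return (encounter.distance_feet <= max_distance_feet
            and encounter.duration_minutes >= min_duration_minutes)


def recommended_stay_home_days(encounter, baseline_days=14):
    """Illustrative personalization: an exposure far in the past leaves fewer days in the 14-day window."""
    return max(baseline_days - encounter.days_since_exposure, 0)


e = Encounter(distance_feet=4.0, duration_minutes=12.0, days_since_exposure=10)
print(is_exposure_event(e))                             # False under a 15-minute rule
print(is_exposure_event(e, min_duration_minutes=10.0))  # True under a 10-minute rule
print(recommended_stay_home_days(e))                    # 4 days left in the 14-day window
```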

There is a potential privacy risk arising from the metadata generated by most apps built on the GAEN platform. The default setting is generally that users consent to share metadata, such as information about app installation, deletion, and crashing, with state servers, presumably to assist with technical support. To the extent this results in frequent transmissions between user devices and the servers, the metadata might reveal a user’s general location in the form of a phone’s IP address. It could also conceivably be used to link anonymous Bluetooth identifiers with specific individuals, resulting in the identification of users and their infection status without their consent.

Recognizing this, a few states go further in providing additional, affirmative privacy protections for users beyond those provided for within the GAEN framework. For example, Pennsylvania and Nevada are among the few states that explicitly provide in their privacy policies that they either will not retain user IP addresses or will ensure that IP address information cannot be associated with individual users.
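As an illustration of the kind of safeguard Pennsylvania and Nevada describe, the hypothetical upload endpoint below (written with Flask) accepts diagnosis keys without recording the sender’s IP address. It is a sketch under stated assumptions, not any state’s actual server, and a real deployment would also need to keep IP addresses out of web server and proxy logs.

```python
from flask import Flask, jsonify, request  # hypothetical upload endpoint, not any state's actual server

app = Flask(__name__)

DIAGNOSIS_KEYS = []  # uploaded identifiers are stored with no accompanying network metadata


@app.route("/upload", methods=["POST"])
def upload_keys():
    payload = request.get_json(force=True)
    # The handler deliberately never reads request.remote_addr or other identifying headers,
    # so the stored keys cannot be linked back to the device that uploaded them.
    DIAGNOSIS_KEYS.extend(payload.get("keys", []))
    return jsonify({"status": "accepted"}), 200
```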

Most of the apps’ privacy policies do not address what happens to user data once it is voluntarily shared with state health authorities. In July, the New York State Legislature passed a contact tracing privacy bill that would limit information sharing between public health agencies and police and immigration enforcement. The bill is awaiting the governor’s signature.

Although Apple and Google’s Bluetooth exposure notification systems are recognized to be more privacy protective than the other app platforms being used in the United States, it is clear that there remains significant variation in the actual protections enjoyed by users. This is particularly true with respect to the minimization of data collection and what happens to app data after a user decides to share it with public health officials.

Other Models

While the Google/Apple model is most prevalent, states and municipal authorities have deployed a handful of other Covid-19 apps. These, too, are premised on voluntary adoption and opt-in data sharing, but they are substantially less privacy protective, mainly because they collect sensitive location data.

Six states currently use or are preparing to launch Covid-19 apps that do not use the Google/Apple technology, including Hawaii, North Dakota, Rhode Island, South Dakota, Utah, and Wyoming. Some of these apps work in conjunction with apps that do use the Google/Apple platform but have additional features, like location tracking, that are not permitted on that platform. Hawaii, for instance, has developed a location diary app that tracks user location and symptoms. The app, AlohaSafe Story, runs off the open-source, MIT-led platform Safe Paths. In some states where there is no statewide app, including Georgia and Florida, county health departments are launching their own Covid-19 apps.

Most of the state apps built on models other than the GAEN platform have GPS or cell tower location tracking functionalities. These are sometimes implemented in conjunction with Bluetooth exposure notification. For example, North Dakota, South Dakota, and Wyoming all use the same location diary app, Care19 Diary. When users opt into location tracking, these apps begin collecting location data every few minutes. Then, if a user tests positive for Covid-19, they have the option to share the data with their local health authorities.
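A minimal sketch of this location diary pattern, assuming a fixed logging interval and a stand-in for the platform’s location API; it is illustrative only, not the Care19 Diary implementation.

```python
import time

LOG_INTERVAL_SECONDS = 5 * 60  # interval at which record_location would be scheduled ("every few minutes")
location_diary = []            # kept on the device until the user chooses to share it


def record_location(get_current_position):
    """Append a timestamped GPS fix to the local diary; get_current_position stands in for
    the platform's location API."""
    latitude, longitude = get_current_position()
    location_diary.append({"timestamp": time.time(), "lat": latitude, "lon": longitude})


def share_with_health_authority(send):
    """Only called if the user tests positive and explicitly opts to share the diary."""
    send(location_diary)


# Illustrative use with fixed coordinates and a print stand-in for the upload call.
record_location(lambda: (40.7128, -74.0060))
share_with_health_authority(print)
```

Even in this simplified form, the diary accumulates a detailed, timestamped record of a person’s movements, which is precisely what makes the privacy stakes of this model higher than Bluetooth-only designs.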

As the Brennan Center has noted before, cell tower and GPS location tracking pose greater privacy risks to users than Bluetooth proximity tracking, which only records closeness to others. As Justice Sonia Sotomayor has cautioned, “GPS monitoring generates a precise, comprehensive record of a person’s public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations.” Detailed, personal information can easily be extracted from location data, and even anonymous location data can often be used to re-identify an individual, given the specificity of information about movements that GPS data provides. Moreover, beyond the privacy implications, location tracking has reduced utility for exposure notification because it cannot establish, for example, whether two individuals came within six feet of each other.

In some cases, concern around location tracking in Covid-19 apps has led to significantly lower usage rates. Utah’s Healthy Together app, which launched in late April, initially used both GPS location tracking and Bluetooth technology to track contact between users. However, state leaders soon withdrew the GPS function after only some 200 users opted to share location data with public health officials. The state’s head epidemiologist attributed the low adoption rate to the unpopularity of sharing location data. Even as the state continues to use Healthy Together, it is also accepting applications from companies to develop a Bluetooth proximity app for the state.

Data Sharing

While all current statewide Covid-19 apps are opt-in and only share data with public officials with user consent, they are often vague about how information collected for public health purposes will be shared.

For example, Rhode Island’s CRUSH COVID RI privacy policy provides ambiguous guidance regarding when the state may share personal information with third parties, saying: “We may also be required to disclose your information to third parties, such as where we are legally required to disclose your information, including, but not limited to, pursuant to a valid subpoena, order, search warrant or other law enforcement request.” CRUSH COVID RI leaves the door open for personal information to be shared with law enforcement or immigration authorities.

The Care19 Diary app used by North Dakota, South Dakota, and Wyoming faced public backlash after it was discovered to be violating its own privacy policy by sharing unique user identifiers with third parties. When apps permit information sharing with third parties other than public health authorities, the practice may harm user trust and adoption rates. It also facilitates the use of sensitive Covid-19 data in myriad ways without accountability or meaningful user consent, including to possibly target advertising or to deny someone insurance. While Care19 allegedly no longer shares unique user identifiers with third parties, the incident highlights how oversights in technical infrastructure can pose privacy risks to users, particularly when detailed location data is collected.

Private Apps

Some cities and counties have announced partnerships with Covid-19 apps that are privately owned and managed, though these apps have yet to be embraced by states. These apps pose substantial privacy risks, which counsels against their adoption and use.

The best-known app in this category is SafePass, which was developed by Citizen, the company behind a popular crime reporting app. Citizen launched SafePass in August, advertising the app to its active user base of over 5 million. SafePass is gaining significant traction among local governments, in part because the Citizen app has a large, pre-existing user base and Citizen had already tested SafePass in a pilot with over 700,000 users prior to the app’s official launch. The health departments of Los Angeles County and San Joaquin County in California both announced partnerships with Citizen SafePass in the past two months, encouraging adoption among the two counties’ combined population of more than 11 million residents. In comparison to the pilot, Virginia’s Covid-19 app was downloaded fewer than half a million times in the month after its release, and New York’s and New Jersey’s apps were downloaded 630,000 and 250,000 times respectively in the first weeks after their launch.

But SafePass poses considerable risks. It relies on sensitive location data, increases the chances of user reidentification by linking SafePass accounts to accounts on the Citizen crime alert app, and does not adequately safeguard users’ health information.

SafePass uses both Bluetooth proximity and GPS location tracking for its exposure notification system and specifies that GPS location tracking must be enabled for the app to function. Furthermore, the SafePass privacy policy notes that currently only users with Citizen accounts can use SafePass because it relies on the same GPS location tracking software. As a result, Citizen links Citizen and SafePass user accounts, giving the company access to the information in the main Citizen user database and linked personal health information. This increases the possibility of reidentification of SafePass users and could potentially make it easier for Citizen or third parties to view or request SafePass user data through the Citizen databases.

With regard to data retention and data sharing with third parties, Citizen SafePass promises to delete only location data, Bluetooth data, and identity verification data. All other personal information (including contact information, health information, and user activity on the app) can be kept “for the period necessary to fulfill the purposes outlined in this policy and to support other app features you might use.” Broad data retention policies increase the likelihood of a data breach, like the Securus Technologies leak that exposed 70 million inmate phone calls in 2015.

In addition, while Citizen assures users that the company will require user consent to share personal information with law enforcement, the company carves out several exceptions for itself, including that it may share personal information with law enforcement to “protect, investigate and deter against fraudulent, harmful, unauthorized, unethical or illegal activity.” This could facilitate the sharing of information in response to whatever behavior the company deems “unethical” — for example, it could disclose to law enforcement the data of users who are not practicing social distancing. Taken together, these policies permit Citizen to collect large amounts of detailed, personally identifiable information while maintaining significant discretion in how that information is stored and shared.

Wellbility, an app built by the New Hampshire company OnSite Medical Services, is another private Covid-19 app with poor privacy protections. Unlike most other apps, Wellbility requires users to sign up and provide personal information, including their name and date of birth, before they can use the app. Wellbility’s Terms of Use leave the company with significant authority over what data is collected, how it is shared, and how long it is retained. The data collection section concludes by requiring users to agree to the “extensive” collection of location data, personal information, health information, and interactions with other users, and it notes that mechanisms for collecting data “are not limited to means mentioned herein.” This could potentially allow Wellbility to use metadata like IP addresses to track user activity even when they are not using the app. Wellbility’s data use and sharing policies are similarly vague and broadly allow the company to disclose information “as needed to operate other related services,” for targeted advertising, and to “achieve commercial health ventures.” Finally, Wellbility stores personal data for over a year, unlike most other apps, which delete data after 14 days if unused.

In short, privately owned and managed Covid-19 apps like SafePass and Wellbility offer the lowest level of privacy protections for users, meaning that users are more likely to be vulnerable to data breaches, reidentification, or data sharing for reasons users did not consent to. By mixing profit and public health objectives, private Covid-19 apps expose users to heightened risks in the state exposure notification ecosystem.

Looking Forward

State Covid-19 apps are still in their infancy, meaning the full extent of their privacy implications is unknown. However, there are several key questions that most states have yet to sufficiently resolve, regardless of which model they have adopted.

One of the most pressing is what privacy protections are in place for data that users choose to share with their state department of health. In most cases it is unclear whether this data can or will be shared with other agencies, other states, or law enforcement. Another open question is the extent to which state apps can successfully minimize the collection of data. For the GAEN apps, this generally involves reducing or eliminating the collection of metadata, such as IP addresses. Apps built on other platforms present serious privacy problems because of their reliance on highly sensitive time-stamped GPS data, which is not needed to notify people of exposure to Covid-19, since Bluetooth proximity data suffices.

In the coming months, states should continue to critically evaluate their Covid-19 apps and share key data points to better inform public discussion about the apps’ privacy risks and effectiveness. These efforts should include:

  • Prioritizing transparency, including providing information about app developers and their contracts with the state, as well as disclosing whether data collected by the app will be used for purposes other than exposure notification or shared with entities other than public health officials
  • Maintaining and improving privacy safeguards, including by minimizing data collection whenever possible
  • Releasing information about the technical aspects of the app, including how the app will determine whether there has been an exposure event and what notifications users will receive
  • Regularly auditing the app’s uptake and effectiveness — for example, by evaluating its adoption and the rate of accurate exposure notifications — to justify continued use.

It is incumbent on state health departments to set forth clear privacy policies in order to foster public trust. While Covid-19 apps will necessarily involve some tradeoff between public health and user confidentiality, it is important that states work to minimize privacy harms to encourage the utilization of their apps and protect their residents. Recognizing this, states must be vigilant in auditing their apps and prioritizing transparency and public accountability.