Recent Proposals That Affect Government Purchases of Data
As more reports of government agencies using data brokers to purchase Americans’ personal information have surfaced, Congress has considered legislative proposals that would limit the government’s practice of circumventing the Fourth Amendment and other privacy protections by buying and accessing personal data without legal process. This section highlights two of those proposals and examines their strengths and potential shortcomings.
The first, the Fourth Amendment Is Not For Sale Act (FAINFSA), seeks to address the ECPA loophole by barring law enforcement and intelligence agencies from purchasing communications-related and geolocation data from any company that collects that information in certain ways. The second, the American Data Privacy and Protection Act (ADPPA), is a comprehensive federal consumer privacy bill that promises to reduce the amount of personal information flowing into and out of the hands of data brokers by restricting the collection of such information to only that necessary to provide a service or achieve a specific, enumerated purpose, and by placing additional limits on data transfers.
The Fourth Amendment Is Not For Sale Act
First introduced in April 2021 by Sens. Ron Wyden (D-OR) and Rand Paul (R-KY) and cosponsored by 18 other senators, FAINFSA was reintroduced this year with a companion bill in the House, cosponsored by Reps. Jerry Nadler (D-NY), Warren Davidson (R-OH), and six other lawmakers. The House Judiciary Committee voted almost unanimously (with the exception of one member who voted “present”) to report the bill out of committee in July 2023, and the bill’s language was included in the Protect Liberty and End Warrantless Surveillance Act, which was reported out of the House Judiciary Committee by a vote of 35–2 in December 2023.
FAINFSA aims to address Carpenter’s uncertain applicability to data purchases by amending ECPA to bar government agencies from purchasing certain types of information. Specifically, FAINFSA adds a provision to the SCA that prohibits government agencies from buying covered records from third parties who collect the information from specified sources: ECS or RCS providers; intermediary service providers (i.e., internet backbone companies like AT&T or Verizon that deliver, store, or process communications for or on behalf of ECS or RCS providers); online accounts with ECS or RCS providers; or electronic devices. Covered records include communications non-contents information relating to a subscriber or customer of an ECS or RCS provider, communications content, and geolocation data.
FAINFSA also prohibits the government from purchasing “illegitimately obtained information” — namely, information that third parties obtain through deceit, through unauthorized access to a device or online account, or (when obtained from ECS or RCS providers) in violation of the provider’s terms of service or privacy policies. The bill thus prevents the government from buying data from companies like Clearview AI and Voyager Labs, which scrape photos and data from social media platforms in violation of their terms of service and sell that data — or access to it through data-matching services — to government agencies. The bill applies to downstream data transfers, meaning that FAINFSA prohibits data purchases regardless of whether a third party initially obtained the information or received it from another third party. Furthermore, covered data includes so-called anonymized information that, if combined with other information, could be used to identify a person.
FAINFSA’s prohibitions could be stronger. For one, the bill only covers a limited universe of data: communications content, geolocation information, and non-contents information pertaining to a consumer or subscriber of an ECS or RCS provider. It does not purport to cover health, financial, or biometric information or other types of sensitive data. Even within the realm of communications-related or geolocation information, the bill has some holes. For instance, it does not cover communications metadata (other than geolocation information) collected by apps that do not qualify as ECS or RCS providers (e.g., health and fitness apps). And it applies only to data acquired by third parties from ECS or RCS providers, from intermediary service providers, from a person’s “online account” with an ECS or RCS provider, or from or about an electronic device. Whether these categories would cover data purchased by a third party from an entity other than an ECS or RCS provider or intermediary service provider is unclear.
In addition to these coverage limitations, FAINFSA only bans the government from obtaining data “in exchange for anything of value.” The bill defines “anything of value” to include information received “in connection with services being provided for consideration,” a definition that would bar government entities from accessing information through data-matching services such as those that Clearview AI offers. However, if that phrase were read narrowly to include only fee-based consideration or other financial compensation, FAINFSA could still leave room for third parties to voluntarily disclose or grant access to personal data to government agencies without payment. For example, Amazon’s efforts to get law enforcement to promote its Ring cameras in exchange for user data might not be covered. Companies also might disclose data without a fee in the hope of currying favor to avoid regulation or to obtain government contracts for other services.
Finally, FAINFSA would go a long way toward prohibiting the government’s purchase and use of certain highly sensitive information from data brokers. But it would not address the overcollection of data or the trafficking of personal information to other, nongovernmental third parties — practices that will likely intensify with the proliferation of AI models reliant on vast data sets. Without comprehensive limitations on companies’ collection and transfer of personal information, private actors could continue to purchase data to skew elections, harass abortion-seekers, or stalk domestic violence survivors. Foreign governments could also purchase exhaustive dossiers on American citizens for purposes of espionage recruitment or other malicious reasons.
The American Data Privacy and Protection Act
The ADPPA is, in some ways, the converse of FAINFSA. It proposes more sweeping privacy protections that would reduce the amount of personal information collected by companies at the outset and place stronger restrictions on the transfer of personal data to third parties generally. But it includes exceptions and potential loopholes that still enable government access to sensitive data without legal process.
Introduced by Rep. Frank Pallone (D-NJ) in June 2022 and cosponsored by Reps. Cathy McMorris Rodgers (R-WA), Jan Schakowsky (D-IL), and Gus Bilirakis (R-FL), the ADPPA is a comprehensive federal consumer privacy bill that establishes requirements for how companies, including telecommunications “common carriers” and nonprofits, collect, process, and transfer personal data. The ADPPA advanced out of the House Energy and Commerce Committee on a bipartisan 53–2 vote in July 2022, but it has not yet been reintroduced this Congress.
Unlike past notice-and-consent regimes, the ADPPA takes the onus of reading and accepting complicated privacy policies off the individual and instead imposes a baseline duty on companies to refrain from collecting an individual’s personal data unless they need it to provide a product or service to that individual. In practice, this model would reduce the amount of data available to data brokers and thereby limit the amount of personal information that data brokers could share with third parties, including government agencies. Not surprisingly, the data broker industry has lobbied fervently against the ADPPA.
The bill also includes prohibitions on the transfer of sensitive personal information (defined below) to third parties without the customer’s consent, and it allows customers to opt out of the transfer of nonsensitive data. However, as discussed below, the ADPPA includes certain exceptions that threaten to swallow the rule when it comes to government access to personal data.
Limitations on the Collection, Processing, and Transfer of Personal Information
The ADPPA defines personal information (or “covered data”) broadly to include inferences (or “derived data”) and any information that “identifies or is linked or reasonably linkable, alone or in combination with other information, to an individual or [an individual’s] device.” To the extent that companies seek to collect or transfer “de-identified data” (i.e., information that does not identify a distinct individual or device), the ADPPA requires companies to take “reasonable technical measures to ensure that the information cannot, at any point, be used to re-identify any individual or device,” and to “contractually obligate[]” any person or entity that receives the de-identified data to comply with that requirement.
Similarly, the bill defines “sensitive covered data” fairly broadly, and it permits the Federal Trade Commission (FTC) — the ADPPA’s primary enforcer — to add additional categories through rulemaking. Sensitive data includes precise geolocation, biometric and genetic information, health information, sexual behavior, private communications and any related metadata, race, religion, ethnicity, and union membership. Sensitive data also includes “information identifying an individual’s online activities over time and across third-party websites or online services,” which would cover cookie data or other tools that track users’ browsing or search activities across the web (and off a company’s specific website). In practice, designating such tracking data as sensitive would curb the practice of tracking users’ browsing activity to build a dossier that could be sold to advertisers, data brokers, and other third parties, including governments. But whether individual search queries that reveal highly sensitive information would be covered is not clear. Congress should clarify that the definition of sensitive covered data includes individual search queries that reflect or pertain to a category of information that would itself meet the bill’s definition of sensitive (e.g., searches pertaining to sexual behavior, race, religion, ethnicity, or union activity).
The ADPPA establishes baseline “data minimization” limitations on how companies collect, process, and transfer personal information. For personal information in general, covered entities may only collect, process, or transfer what is “reasonably necessary and proportionate” to deliver a product or service requested by the individual or to effectuate one of 17 specified purposes (for a complete list of these permissible purposes, see Appendix). Notably, companies may not collect personal information simply for advertising purposes; to provide advertising, companies may process or transfer only data previously collected for other purposes. In addition, individuals have the right to opt out of targeted advertising and the transfer of nonsensitive information to third parties, including through a centralized, user-friendly opt-out mechanism.
As to the collection and processing of sensitive personal information, the ADPPA is more restrictive: companies may only collect and process what is “strictly necessary” to provide a product or service requested by the individual or to effectuate one of 14 out of the 17 purposes that apply to nonsensitive information. Notably, one of the excluded purposes is targeted advertising, which has been a driving force behind the overcollection and transfer of sensitive information to data brokers.
The bill is also more restrictive on transfers of sensitive personal information to third parties. Specifically, it prohibits transfers unless an individual opts in to the transfer via “affirmative express consent” or unless one of six exceptions applies. Companies must request consent from individuals in a “clear and conspicuous standalone disclosure” with a clear description of the data subject to the request. The option to refuse consent for the collection, processing, or transfer of covered data must be “at least as prominent as the option to accept, and the option to refuse consent shall take the same number of steps or fewer as the option to accept.”
Moreover, the ADPPA mandates that third parties may not use or transfer sensitive information for any purpose other than that to which the user consented (or for one of three exceptions). In practice, those constraints would prohibit data brokers from, for instance, taking data collected to authenticate a user and selling it for advertising purposes. Taken together, these heightened restrictions would limit some of the most harmful business practices that lead to the overcollection and out-of-context secondary uses of personal data, including the sale to and use by data brokers trafficking in consumer profiles.
In addition, the ADPPA gives individuals more control over their data and imposes transparency requirements on data brokers (defined as entities whose principal source of revenue derives from processing or transferring personal information not collected directly from individuals). Specifically, the bill grants individuals the right to access, delete, correct, and move their personal information. Also, as noted, individuals have the right to opt out of (for nonsensitive information) or in to (for sensitive information) the transfer of information to third parties. And the ADPPA requires data brokers to register with the FTC and be included in a searchable, publicly available central registry, whereby individuals may “easily submit a request” to delete all data collected by those data brokers and ensure that they no longer collect that individual’s data.
In short, the ADPPA’s principles of data minimization, user data rights, and heightened restrictions regarding sensitive information would limit overcollection and secondary uses of personal information, including the sale to and use by data brokers, advertising firms, private actors, and (to some degree) government agencies. State and federal law enforcement agencies recognize this fact, noting in an opposition letter to Congress that the ADPPA would “likely complicate the private sector’s ability to continue its ongoing efforts to cooperate and voluntarily share certain information with law enforcement” to “generate leads.”
Problematic Law Enforcement Exceptions
Nonetheless, various exceptions and gaps in the ADPPA could leave open troubling avenues for the government to obtain personal information without legal process. As a general matter, the ADPPA’s data minimization principles do not apply to government agencies, which are excluded from the law’s definition of “covered entities.” The bill also exempts service providers that collect, process, or transfer information provided by or on behalf of government entities from restrictions on nonsensitive data collection, processing, and transfer. The bill’s definition of service providers is broad enough to permit sweeping collection, processing, and transfer of nonsensitive data by data brokers acting under contract with a government agency — restricted only by the terms of the contract. And while it generally prohibits service providers from combining the data they collect on behalf of an entity with personal data collected for other purposes, the bill does not prohibit service providers from combining data sets if necessary to effectuate 15 permissible purposes, including the law enforcement exceptions discussed below.
Additionally, the ADPPA includes several exceptions that would permit companies to collect or process personal data, and in some cases to transfer data to the government without legal process, for law enforcement-related purposes. To start, the bill allows service providers “acting at the direction of a government entity” or covered entities providing “a service” to a government entity to process and transfer personal data (including sensitive data) — and possibly to collect such data, per certain ambiguous language in Sections 101 and 102 — “to prevent, detect, protect against or respond to a public safety incident, including trespass, natural disaster, or national security incident” (albeit only “insofar as authorized by statute”). The ADPPA also permits those same entities to process and transfer personal data previously collected for other purposes in order to prevent, detect, protect against, or respond to a public safety incident.
These provisions would undo many of the bill’s promised protections, allowing law enforcement to access vast amounts of data to “generate leads” without specific, articulable, and credible facts demonstrating the existence of a public safety concern. Although the bill tries to mitigate this possibility by explicitly prohibiting “the transfer of covered data for payment or other valuable consideration to a government entity,” this language might not prohibit the voluntary transfer of personal information for nonfinancial benefits, as discussed above.
Future iterations of the ADPPA should close this loophole. First, the bill should make clear that these entities may not collect data for this public safety purpose; rather, they may only process and transfer data previously collected for other purposes. Such a limitation would bar entities from exploiting the public safety exception to justify broad collection of personal information. Companies working with the government would still be able to collect relevant data from publicly available sources, as publicly available information does not meet the ADPPA’s definition of covered data.
Furthermore, rather than presenting a nonexhaustive list of examples of public safety incidents, the ADPPA should define that term to mean criminal activity affecting public safety (which would encompass the “national security” and “trespass” examples in the current bill), natural disasters, or threats to public health. These terms would capture the public safety incidents that are of legitimate concern — such as bomb threats or other violent, criminal acts, as well as crises stemming from pandemics, earthquakes, hurricanes, floods, or wildfires — while ensuring that covered entities cannot stretch the term public safety incident to include nonviolent protests or other lawful activity. Lastly, the ADPPA should prohibit personal information transfers unless (1) the covered data directly pertains to, and could reasonably be expected to assist the government in addressing, a specific and significant threat to public safety; or (2) the government obtains the warrant, court order, or subpoena that would be required to compel production of the information.
In addition to the public safety incident exception, the ADPPA includes broad exceptions that allow the collection and processing of personal data, including sensitive data, if necessary “to prevent, detect, protect against, or respond to” security incidents (defined to cover network security, physical security, or life safety), fraud, harassment, or illegal activity (defined as a felony or misdemeanor “that can directly harm”). These exceptions also permit the transfer of nonsensitive personal data. Read broadly, these allowances would let companies collect masses of data under the general guise of preventing fraud or averting “security incidents.” Indeed, data brokers like RELX have historically exploited fraud prevention to justify bulk collection of personal information, which they then sell to advertisers, law enforcement agencies, and other third parties. Combined with the above exception regarding the transfer of previously collected data to law enforcement for “public safety” purposes, these permissions would grant law enforcement access to huge amounts of data without legal process.
To disincentivize such overcollection and secondary use of data, the ADPPA should prohibit the transfer of this data for payment or other valuable consideration to a third party, including a government entity (borrowing from the similar prohibition in the public safety incident exception). And the bill should make clear that personal data acquired for these purposes may not be used or transferred to third parties for other purposes, or to law enforcement, unless (1) the covered data directly pertains to, and could reasonably be expected to assist the government in addressing, a security incident, fraud, harassment, or illegal activity; or (2) the government obtains the warrant, court order, or subpoena that would be required to compel production of the information.
Finally, the ADPPA lets companies refuse an individual’s request to access, delete, correct, or move personal data if doing so would “interfere with law enforcement.” This wording, which understandably seeks to prevent criminals under investigation from deleting or changing evidence of their illegal activity, is vague and overbroad. An entity that collects information (and that has a financial incentive to maintain the information) could always posit that the information might prove useful to a future law enforcement investigation. This language also largely duplicates another, similarly overbroad ADPPA provision that permits companies to refuse an individual’s request to delete or correct information if it interferes with investigations or “reasonable efforts to guard against, detect, prevent, or investigate fraudulent, malicious, or unlawful activity.” Future proposals should replace these provisions with one allowing companies to refuse an individual’s request to access, delete, correct, or move personal data if the information that is the subject of the request reasonably appears to reflect or relate to fraud, harassment, or unlawful activity, or if there are specific, articulable facts indicating that compliance with the request would interfere with an ongoing law enforcement investigation.
The ADPPA is a promising template, but it should be strengthened. Congress should amend it to make clear that covered entities, service providers, and third parties may not voluntarily transfer personal information to law enforcement when there is no clear indication of a specific threat to public safety, a security incident, fraud, harassment, or illegal activity. Moreover, the bill currently preempts state law, which would impede states from providing stronger privacy protections against government access to data or against harms arising from advancements in technologies like AI or other algorithmic models. States have often served as laboratories of democracy, generating innovative policy ideas that can later be adopted at the federal level. Future proposals should preserve that flexibility, perhaps by granting states a waiver to set higher standards or by lifting preemption after a fixed term (e.g., five years) to allow them to address new privacy challenges.