Expert Brief

When Police Surveillance Meets the ‘Internet of Things’

Doorbell cameras, smart thermostats, digital assistants, and other always-on devices open up a whole new world of privacy risks when the government has access to their data.

Published: December 16, 2020

Connected devices add eyes and ears around our homes, inside our cars, and on our bodies. Through sensors, cameras, microphones, and other technology, they are constantly collecting data and sending it to company servers for analysis and storage. Police have already come calling for this information generated by the so-called “internet of things,” with significant privacy implications for Americans.

Google Nest’s Hello Doorbell uses facial recognition to tag and store a roster of “familiar faces” that frequently pass within the camera’s range. Amazon Alexa’s microphones can capture private conversations inside homes and cars. Wearables like Fitbit can track a person’s movements and vital signs.

The deep well of data these devices create allows law enforcement to analyze a person’s proximity to a crime, assess relationships between victims and suspects, and even review recordings of incriminating statements. With the decreasing cost of data storage, information collected by the internet of things can often be retained indefinitely, a practice encouraged by a business model that relies on user data to improve and develop new products.

For example, a California man was charged with the murder of his stepdaughter after her Fitbit revealed that the woman’s heart rate significantly spiked, then quickly dropped before stopping, all around the time that a neighbor’s Ring surveillance camera showed the man’s car parked at her house.

At the same time, the Constitution places limits on law enforcement access and overreach that must be balanced against the government’s convenience. In a world where people are subjected to ongoing surveillance by public and private actors, there is an urgent need to update regulatory frameworks and rethink privacy protections to account for the inescapable role of technology companies in everyday life.

Privacy and civil rights concerns

Connected devices raise serious privacy concerns, as they can reveal sensitive information about people’s homes, movements, and interactions with others. As of 2019, nearly 70 percent of American households had at least one such gadget. From connected thermostats to digital assistants, these devices collect information from inside the home — a space explicitly safeguarded by the Fourth Amendment and protected by courts against both technological and physical intrusions.

Cameras inside a home can capture people’s images and movements, digital assistants can capture private conversations, and connected thermostats can track when people enter and leave various rooms. Other devices, such as activity trackers and in-car navigation systems, create detailed records of people’s movements over long periods of time. These records can reveal not only the location of a person’s home or workplace, but also their associations and their participation in constitutionally protected activities such as prayer and protest.

Even devices that monitor public spaces can raise privacy concerns. For example, companies that sell doorbell and outdoor cameras offer users the ability to create “activity zones” to monitor pre-set areas for “familiar faces” or for certain movements. These cameras often monitor outdoor spaces such as driveways or sidewalks. Whenever these cameras detect a person or movement, they capture audio and video footage and send an alert to the device owner. Recordings of the detected activity (sometimes referred to as “events”) can be downloaded and shared with police or posted to social media at the push of a button.

The proliferation of connected devices provides expansive opportunities for the government to assemble detailed portraits of people’s lives. Many companies offer entire suites of connected devices: Google sells everything from connected cameras to thermostats and activity trackers to digital assistants, and even offers private security monitoring through its partnership with Brinks. Police can further augment data from connected devices with data collected by their own substantial arsenals of surveillance tools. This type of comprehensive tracking would have been unimaginable before the digital age and eliminates practical limits on surveillance, such as the expense of allocating personnel to engage in 24/7 monitoring.

Finally, it is not always possible to opt out of this type of surveillance. A landlord may unilaterally install facial recognition cameras or thermostats, or a homeowners’ association may install license plate readers with little opportunity for domestic or service workers to object to the practice or request that their data be deleted. At least seven states have passed laws requiring utility companies to allow consumers to opt out of smart-meter installation, highlighting the importance of — and barriers to — enabling individual choice.

The proliferation of connected devices particularly threatens the civil rights of communities of color. Tools such as facial recognition, speech recognition, and emotion detection have documented racial biases that limit these technologies’ ability to accurately identify and understand communities of color. These biases are driving nationwide conversations about whether such tools should be used by police. But even in jurisdictions with bans on police use, prohibitions may not always reach private actors, and private entities like homeowners’ associations have a long history of racial discrimination.

While police departments across the country are under scrutiny not just for over-policing but for using surveillance technology to target communities of color, connected devices provide new ways to obscure those practices: police can obtain data from cooperative landlords or employers instead of complying with the transparency and accountability controls beginning to take root around the country.

Legal frameworks for access to data from connected devices

Constitutional protections

The Fourth Amendment protects “persons, houses, papers, and effects” from unreasonable searches and seizures. Over time, the Supreme Court’s application of the Fourth Amendment has evolved in response to “innovations in surveillance tools.” For example, the Court has ruled that law enforcement must obtain a warrant before searching a suspect’s cell phone during an arrest, installing a GPS tracker on an automobile for long-term monitoring, or obtaining historical cellphone location information.

But there are limitations. Under current doctrine, the Fourth Amendment does not protect some types of information that people knowingly or unknowingly expose to the public. Thus, the Supreme Court has ruled that — contrary to empirical research on individuals’ expectations of privacy — the government can pick out sensitive materials from an individual’s trash and conduct aerial surveillance over a backyard without a warrant.

Similarly, the Court takes the view that in some situations, people do not have a reasonable expectation of privacy in information they provide to others, including companies with which they do business. Commonly referred to as the third-party doctrine, this rule has historically allowed police to obtain data such as bank records or a log of dialed telephone numbers without a warrant. However, the Court recently signaled in Carpenter v. United States that this doctrine loses its force where the revealing nature of the information sought bolsters the individual’s privacy interests in the data shared with third parties, and where disclosure is not truly voluntary because the use of a technology such as a cellphone is “indispensable to participation in modern society.”

The Supreme Court has not yet ruled on how the Fourth Amendment applies to data from every connected device, but its guidance in Carpenter will be instructive. The Fourth Amendment’s applicability to a particular device may ultimately depend on a variety of factors, such as where a device is located, the intimacy and comprehensiveness of data that is retained and retroactively searchable by law enforcement, and a person’s ability to avoid having the device collect their data.

While fitness trackers or a car’s GPS system may enable data collection that is “detailed, encyclopedic, and effortlessly compiled” — one of the factors the Court considered — connected cameras or a digital assistant might collect more limited records, depending on their use. On the other hand, a digital assistant or indoor camera’s presence inside a home means that it is at the heart of the Fourth Amendment’s protections. Similarly, a network of license plate readers or doorbell cameras may make it functionally difficult for a person to avoid having their movements comprehensively tracked and retroactively searchable by law enforcement.

Statutory protections

Legal protections do not start and end with the Fourth Amendment. Government access to information collected by tech companies is also limited by statute.

Specifically, the Stored Communications Act establishes a process that the government must follow when it seeks certain types of electronically stored data. When it comes to the content of communications, the law requires differing levels of judicial oversight, obligating police to obtain a warrant, court order, or subpoena. Where the data sought does not relate to the content of a communication — for example, information about when and to whom a message was sent — there are generally fewer obligations on the government. The law also allows companies to voluntarily disclose information to law enforcement in an emergency or when the service provider’s own property is in danger.

However, the Stored Communications Act regulates only a subset of service providers, and the data collected by connected devices may not always be covered by the law. While connected devices may collect sensitive information, the way in which that information is collected and the reasons why it is stored may place it outside the scope of the law.

Similarly, the Stored Communications Act may not regulate companies that store electronic communications for their own purposes. This likely includes the makers of connected devices, which collect and store data for a number of reasons, from improving products to user customization.

If police want to obtain real-time communications from connected devices, they will typically need to obtain a special wiretap warrant. In addition to establishing probable cause, this warrant requires a number of more stringent procedures to ensure that individuals’ communications are collected only where strictly necessary. In theory, police could obtain a wiretap order to intercept communications collected by a digital assistant inside a car or a home. There may be practical limitations, however, as there is no obligation for technology companies to build the infrastructure that permits real-time wiretapping.

Finally, state laws may place additional restrictions on the ability to collect data from connected devices in the first place. For example, many states require both parties to a conversation to consent to recording, which may obligate homeowners and companies alike to take steps to obtain the explicit consent of guests whose conversations may be captured by tools like a digital assistant. Similarly, Illinois requires companies to obtain explicit consent before capturing and storing biometric identifiers such as a photo of a person’s face. Google Nest’s privacy statement nods to this requirement without explicitly telling customers about the law. Instead, it tells them that “[d]epending on where you live . . . you may need to get explicit consent to scan the faces of people visiting your home.”

Privacy policies

Short of situations where the law clearly requires the government to obtain a warrant or follow another legal process, corporate decisions to disclose data to law enforcement will in most circumstances be governed by their privacy policies.

The language in these policies typically follows a standard form. For example, the Ring privacy policy allows Amazon to disclose information about its users where required by law, to defend Amazon’s own legal rights, to prevent harm, in connection with a criminal investigation, or with the user’s consent. Similar language is found in the policies for Google Nest, SimpliSafe, and others.

Privacy policies give companies considerable leeway to disclose user data to the government without following any sort of legal process. In those situations, the decisions to voluntarily share data with law enforcement, to notify users, and to disclose the data sharing in a transparency report largely remain matters of company discretion.

Many companies also maintain separate law enforcement policies in which they provide more detailed instructions for law enforcement seeking access to user data. For example, Ring requires a search warrant or user consent to disclose content information, but may disclose non-content data such as subscriber information, purchase history, and service usage with a “subpoena, search warrant, or other court order.”

Law enforcement access to connected devices in practice

It is increasingly common for law enforcement officers to collect data from connected devices as part of an investigation, and not always under judicial supervision. Asking a person to voluntarily provide data eliminates the need to follow a legal process. Companies like Amazon further simplify the process by building a portal through which law enforcement can request Ring data from Amazon’s customers. Users can disable the request feature, but they are automatically opted in by default.

In New Hampshire, for instance, a man was accused of shooting his brother in the arm in a dispute in a driveway. Footage of the altercation was captured by several Ring cameras owned by neighbors, who provided their footage to police. A judge admitted audio of the incident into evidence, ruling that the defendant should have expected the driveway communications between himself and his brother to be publicly exposed.

Connected devices have also been used to contradict an account of events. A Pennsylvania woman who alleged she was raped was later charged with making false statements and tampering with evidence after Fitbit data she voluntarily provided to police suggested she had been moving around her home during the time she claimed to be asleep.

In other instances, police have obtained warrants to access data from connected devices. In Arkansas, unusual water usage tracked by a smart meter was used to substantiate claims that a defendant attempted to clean up a murder scene. Prosecutors also sought to obtain a warrant for recordings of the defendant’s Amazon Echo, but the man eventually voluntarily disclosed the recordings.

And in the case of a Connecticut man currently facing trial for the murder of his wife, police have obtained warrants for the victim’s Fitbit and several other connected devices throughout the house that revealed movements and other information contradicting his account of events.

Lack of transparency from tech companies

Despite the proliferation of connected devices and law enforcement’s appetite for the data, many companies still do not publish transparency reports. While companies like Amazon and Google publish reports revealing disclosures of user data across their entire suite of products, the reports do not break down the number of law enforcement requests by product or specify the type of data provided. This makes it difficult to understand how often police are requesting data from, for instance, a connected doorbell rather than an email account. Companies like eufy, Pioneer, SimpliSafe, BMW, Subaru, and OnStar do not publish transparency reports at all.

Even in aggregate, transparency reports uniformly show a significant increase in law enforcement requests for data. Amazon’s reports reveal a 264 percent increase in U.S. law enforcement requests from its first transparency report in 2015 through June of 2020. Google’s reports reveal a 109 percent increase in requests from the second half of 2015 to the second half of 2019.

Moving forward

Efforts to address police surveillance must seek transparency and oversight regarding law enforcement’s ability to leverage connected devices and private surveillance systems. Across the country, cities and municipalities are already reining in unaccountable police surveillance, with some even banning the use of certain technologies like facial recognition.

At the same time, the rise in private data collection introduces new complexities. While the majority of connected devices are not part of a central network, it is possible that homeowners could be asked to connect their footage in real time to a police center.

This eventuality is already underway in Jackson, Mississippi, where the city plans to launch a pilot program allowing homeowners and businesses to pipe their camera streams directly into the city’s Real Time Crime Center. This pilot program was announced less than three months after Jackson became the first southern city to ban police use of facial recognition technology. And in San Francisco, the police department obtained real-time access to a private surveillance network operated by a business district that it then used to monitor racial justice protesters this past summer.

We are living through a time of significant technological shift, magnified by a proliferation of connected devices that collect and store snapshots of people’s lives, often without genuine consent or buy-in by those being surveilled. In a world where so many aspects of our lives are mediated through third-party service providers, there is an increasing need to reassess the adequacy of constitutional and statutory protections.

By relying on private connected devices, police are able to achieve increasingly comprehensive views into where people go and what they say. Not only is this incompatible with the right to privacy, it can have a chilling effect on other constitutional rights like the right to free expression, as omnipresent government eyes and ears can make people less comfortable with expressing controversial thoughts and beliefs — even within the privacy of a home. These expansive new abilities are also ripe for abuse in light of our nation’s long history of illegal domestic spying, from J. Edgar Hoover’s FBI to the post-9/11 NSA. Moreover, for communities of color, the dangers of discriminatory targeting are present whether they are participating in racial justice protests or simply living in a neighborhood targeted by anti-gang policing.

While transparency and oversight laws for police surveillance are important and commonsense first steps, there is an ongoing need for new regulations that meaningfully guard individual rights and freedoms in the digital age. We will be exploring what those protections might look like in our coming work.