Testimony

Testimony on Int. 1696, Relating to the Disclosure of Source Code By Agencies Engaged in Policing and Other Services

On October 16, 2017, the Brennan Center submitted testimony to the New York City Council supporting Int. 1696, which would require agencies that use algorithms or other automated processing methods that target services, impose penalties, or police persons to publish the source code used for such processing.

Published: October 16, 2017

Testimony of the Brennan Center for Justice

Rachel Levinson-Waldman

Senior Counsel, Liberty & National Security Program

before

The New York City Council Committee on Technology

regarding

Int. 1696, RELATING TO THE DISCLOSURE OF SOURCE CODE BY AGENCIES ENGAGED IN POLICING AND OTHER SERVICES

October 16, 2017

 

Dear Chairman Vacca and Members of the New York City Council’s Committee on Technology:

Thank you very much for giving me this opportunity to speak to the Committee today. My name is Rachel Levinson-Waldman, and I am Senior Counsel in the Liberty and National Security Program at the Brennan Center for Justice. I am pleased to be testifying today about the Committee’s bill to require that agencies using algorithms to engage in policing, among other services, disclose the source code for those algorithms and allow users to submit data for processing.

The Brennan Center is a nonpartisan law and policy institute that seeks to improve our systems of democracy and justice. The Liberty and National Security Program focuses on restoring the proper flow of information between the government and the people by securing increased public access to government information; ensuring government policies targeting terrorists do so effectively and without religious or ethnic profiling; and securing appropriate government oversight and accountability.

As part of that work, I filed a Freedom of Information Law (FOIL) request last year with the New York City Police Department (NYPD), requesting information about the department’s use of predictive policing technologies.

By way of background, predictive policing involves the use of statistics or algorithms to make inferences about crime – the risk that crime is going to occur in a particular geographic area or jurisdiction, or the risk that a particular person is going to commit a crime. Predictive policing has been the subject of considerable criticism from civil rights and civil liberties advocates.[1] There have been significant concerns that predictive policing both relies on and recreates patterns of biased law enforcement, simply sending officers back to neighborhoods that are already overpoliced.[2] In addition, there is little proof that predictive policing is actually effective in predicting and reducing crime.[3] One phrase often used is that predictive policing predicts policing – it does not predict crime.[4]
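To make the mechanics concrete, below is a minimal, purely illustrative sketch of the place-based variety of risk scoring described above. It is not the NYPD’s or any vendor’s algorithm; every function name, weight, and data value in it is hypothetical.

```python
# Illustrative sketch only: a toy place-based "risk score" of the general kind
# described above. All names, weights, and data here are hypothetical; this is
# not the NYPD's or any vendor's actual code.
from collections import Counter

def risk_scores(incidents, recency_weight=2.0):
    """Rank grid cells by counting past incident reports, weighting recent ones.

    incidents: list of (cell_id, days_ago) tuples drawn from historical crime
    reports. Because the only input is past police activity, the output tends
    to mirror where enforcement already happened, not where crime will occur.
    """
    scores = Counter()
    for cell_id, days_ago in incidents:
        # More recent reports contribute more to the score (hypothetical decay).
        scores[cell_id] += recency_weight / (1.0 + days_ago)
    return scores

# Example: three historical reports in cell "A", one in cell "B".
example = [("A", 1), ("A", 10), ("A", 30), ("B", 2)]
print(risk_scores(example).most_common())  # cells ranked by toy risk score
```

Even in this toy form, the ranking depends entirely on which past reports are fed in and how they are weighted, choices that are invisible to the public without access to the source code and its inputs.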

In light of these concerns, transparency regarding the code that provides the foundation for predictive policing is paramount.[5] According to publicly available documents that we reviewed in preparation for our FOIL request, the NYPD expected to spend about $45 million on predictive policing technologies over the course of five years.[6] However, there was little information publicly available about how the department intended to use the technologies, what information would be fed into them, and how the community would be affected, among other questions. Without more information, we were concerned that the police department’s use of predictive policing was occurring in the dark, with little information available to the most affected communities about how policing decisions were being made, and little opportunity for those communities to make their concerns known.

We therefore filed a FOIL request last July for a range of documents, including information about how the NYPD’s predictive policing program was using the data put into it, and the specific algorithms in use. The NYPD rejected our request in a one-page letter, providing no records in response. We appealed, and the department denied our appeal, again disclosing no records or other information about its predictive policing program.

In December 2016, we sued; in our lawsuit, we emphasized the important interests in transparency that FOIL embodies, much as this legislation does.[7] Almost immediately after we filed suit, the NYPD disclosed a number of documents, but it refused to disclose the source code for its predictive policing algorithm and has continued to withhold a range of other important information.[8] As a result, far too little is still known about the practical mechanics of this policing practice.

It is worth noting that the NYPD has expressed concerns about making the source code for its predictive policing program publicly known; the department has argued that with the source code in hand, criminals could learn where police officers will be patrolling and evade detection. We believe – as we have told the NYPD and the judge hearing our case – that this risk is remote. Predictive policing programs generally identify limited areas where officers are directed to spend some fraction of each shift; they do not direct or reveal the location of each officer at every moment, and they are extremely unlikely to provide a detailed roadmap to the curious criminal.

On the flip side, as detailed above, the public benefits of understanding the workings of this program are significant. The NYPD has touted itself as the most transparent police department in the world.[9] In fact, as our experience shows, the NYPD has frequently resisted transparency, requiring groups like the Brennan Center and journalists to expend significant resources to extract information of critical interest to the public.[10] Similarly, little was known about the department’s multi-year contract with the data analytics giant Palantir – which was apparently crunching information about arrest records, license-plate reads, parking tickets, and more – until a BuzzFeed article came out this past June.[11] This is why the Brennan Center also supports the POST Act, a bill co-sponsored by Council Members Garodnick and Gibson, which would require the NYPD to publicly report on the surveillance tools it uses and the rules for using them.[12]

In sum, this bill, Int. 1696, would be a groundbreaking measure and a significant step forward for transparency, and it would meaningfully advance the NYPD’s own program of community engagement.[13] The Brennan Center strongly supports its passage. I would be happy to answer any questions or to provide any additional information.


[1] See, e.g., Leadership Conference on Civil and Human Rights, et al., Predictive Policing Today: A Shared Statement of Civil Rights Concerns (Aug. 31, 2016), available at  http://civilrightsdocs.info/pdf/FINAL_JointStatementPredictivePolicing.pdf.

[2] See, e.g., Jack Smith IV, Crime-prediction Tool PredPol Amplifies Racially Biased Policing, Study Shows, Mic (Oct. 9, 2016), https://mic.com/articles/156286/crime-predictiontool-pred-pol-only-amplifies-racially-biased-policing-study-shows (last visited Oct. 15, 2017); see also Laura Nahmias, NYPD Testing Crime-Forecast Software, Politico (July 8, 2015, 5:52 AM EDT), http://www.politico.com/states/new-york/city-hall/story/2015/07/nypd-testing-crime-forecast-software-090820 (quoting maker of predictive policing software as noting the importance of assessing “how we apply statistics and data in a way that’s going to be sensitive to civil rights and surveillance and privacy concerns”).

[3] See, e.g., William J. Hayes, Naval Postgraduate Sch., Case Studies of Predictive Analysis Applications in Law Enforcement (Dec. 2015), available at https://www.hsdl.org/?view&did=790324; Martin Maximino, The Effectiveness of Predictive Policing: Lessons From A Randomized Controlled Trial, Journalist’s Res. (last updated Nov. 6, 2014), https://journalistsresource.org/studies/government/criminal-justice/predictive-policing-randomized-controlled-trial; Matt Stroud, Chicago’s Predictive Policing Tool Just Failed A Major Test, The Verge (Aug. 19, 2016, 10:28 AM EDT), https://www.theverge.com/2016/8/19/12552384/chicago-heat-list-tool-failed-rand-test.

[4] See Ezekiel Edwards, Predictive Policing Software Is More Accurate At Predicting Policing Than Predicting Crime, Huffpost (Aug. 31, 2016, 2:58 EDT), https://www.huffingtonpost.com/entry/predictive-policing-reform_us_57c6ffe0e4b0e60d31dc9120.

[5] See David Black, Here Comes Predictive Policing: The Next Wave of Crimefighting Technology Is Being Tested In New York City, N.Y. Daily News (Jan. 24, 2016), http://www.nydailynews.com/opinion/david-black-predictive-policing-article-1.2506580 (last visited Oct. 15, 2017) (“Most important, the use of predictive policing technologies must be transparent — and carefully overseen by vigilant citizens themselves.”).

[6] See, e.g., City of N.Y., Developing the NYPD’s Information Technology 6–7, available at http://home.nyc.gov/html/nypd/html/home/POA/pdf/Technology.pdf (last visited Oct. 16, 2017); Mayor de Blasio, Police Commissioner Bratton Announce CompStat 2.0, City of N.Y. (Feb. 23, 2016), http://www1.nyc.gov/office-of-the-mayor/news/199–16/transcript-mayor-deblasio-police-commissioner-bratton-compstat-2–0#/0 (last visited Oct. 16, 2017).

[7] Brennan Center for Justice v. New York Police Department, Brennan Ctr. for Justice (May 19, 2017), https://www.brennancenter.org/legal-work/brennan-center-justice-v-new-york-police-department (linking to the Brennan Center’s FOIL request and appeal; the NYPD’s denial of the request and denial of appeal; and the legal documents filed in the litigation).

[8] See Rachel Levinson-Waldman & Erica Posey, Predictive Policing Goes to Court, Brennan Ctr. for Justice (Sept. 5, 2017), https://www.brennancenter.org/blog/predictive-policing-goes-court (While the Brennan Center narrowed its request to exclude the source code as a show of good faith, and to hasten the production of the other records requested, it did not concede that the source code is exempt from disclosure.).

[9] See JPat Brown, Five Examples of the NYPD’s Commitment to “Transparency,” Muckrock (June 14, 2017), https://www.muckrock.com/news/archives/2017/jun/14/five-examples-nypd-transparency/.

[10] See, e.g., Adam Klasfield, Sound-Cannon Case Heralds E-Transparency for NYPD, Courthouse News (June 30, 2017), https://www.courthousenews.com/sound-cannons-case-heralds-e-transparency-nypd/; Brown, supra note 9.

[11] Emily Hockett & Michael Price, Palantir Contract Dispute Exposes NYPD’s Lack of Transparency, Just Security (July 20, 2017, 1:43 PM), https://www.justsecurity.org/43397/palantir-contract-dispute-exposes-nypds-lack-transparency/.

[12] For more on the POST Act, short for Public Oversight of Surveillance Technology Act, see Michael Price, Margot Adams, & Lamya Agarwala, POST Act Hearing Round-Up, Brennan Ctr. for Justice (June 21, 2017), https://www.brennancenter.org/blog/post-act-hearing-round-0; The Public Oversight of Surveillance Technology (POST) Act: A Resource Page, Brennan Ctr. for Justice (June 12, 2017), https://www.brennancenter.org/analysis/public-oversight-police-technology-post-act-resource-page.

[13] See, e.g., William J. Bratton, The NYPD Plan of Action and the Neighborhood Policing Plan: A Realistic Framework for Connecting Police and Communities (NYPD 2015), available at http://home.nyc.gov/html/nypd/html/home/POA/pdf/Plan-of-Action.pdf.
