“Predictive Policing” computer systems — which use data to forecast where crime will happen or who will be involved — provide a “misleading and undeserved imprimatur of impartiality,” and rely on profoundly limited and biased data, according to a shared statement of civil rights concerns issued today by the Brennan Center for Justice at NYU School of Law and sixteen other civil liberties groups.
The statement identifies six major issues with predictive policing programs as currently implemented: their pervasive secrecy prevents public debate; they ignore community needs; they intensify enforcement rather than adapting to local conditions; they threaten to undermine constitutional rights; their racial impact goes unmonitored; and they are not used to assess officer misconduct even though they could be.
“While data-driven policing holds promise, the way ‘predictive policing’ is being implemented today threatens to exacerbate rather than alleviate disparities and inefficiencies in how communities are policed,” said Rachel Levinson-Waldman, senior counsel at the Brennan Center. “Because the data fed into these systems is itself a function of historical police behavior, because the data and algorithms are often kept secret from the public, and because policies to ensure accountability and transparency are the exception rather than the norm, predictive policing in its current state does not live up to its billing.”
Click here to read the full statement.
Read more about the Brennan Center’s work on Liberty & National Security.
Brennan Center expert Rachel Levinson-Waldman is available for comment. To schedule an interview, contact Naren Daniel at (646) 292-8381 or naren.daniel@nyu.edu.