Six Concerns About the Proliferation of Predictive Policing Systems

Blog Post
Aug. 31, 2016

Predictive policing systems are no longer reserved for movies like Minority Report. They are proliferating throughout the US with almost no review or audit. This unchecked expansion underscores the urgent need for reform of US policing more broadly, and highlights the challenges that evolving technology presents.

Predictive policing systems can be used for good or to perpetuate injustice. They generally use data to decide where police should focus their resources. Unfortunately, the data that feeds these systems is typically biased: it primarily documents law enforcement’s response to reported crimes and incidents, not a complete record of all crime that occurs. Until police departments and vendors seriously rethink how these systems are implemented, the technology will only exacerbate police bias against communities of color.

Today, OTI joined 16 other civil rights, privacy, and technology organizations in a statement identifying six concerns that police departments should consider as they implement predictive policing systems:

  1. A lack of transparency about predictive policing systems prevents a meaningful, well-informed public debate;

    The public and other stakeholders should understand what data is being used, what the system intends to predict, the design of the algorithm, how predictions will be used in practice, and what relevant factors are not being measured or analyzed. Disclosing this information will enable robust public discussion where there currently is little. Of course, transparency on its own is not enough — other safeguards are needed.

  2. Predictive policing systems ignore community needs;

    Most predictive policing systems focus narrowly on reported crime rates and ignore other important policing goals, such as building community trust, eliminating the use of excessive force, and reducing coercive police tactics. Pursuing those goals requires measuring and tracking the use of coercive tactics and the demographics of the people involved.

  3. Predictive policing systems threaten to undermine the constitutional rights of individuals;

    The Fourth Amendment protects people against being stopped by the police without reasonable suspicion; in other words, a stop requires more than a hunch. A “hunch” that comes from a computer is no different. These systems also should not erode due process or equal protection rights by manufacturing unexplained “threat” assessments.

  4. Predictive technologies are primarily being used to intensify enforcement, rather than to meet human needs;

    Police enforcement is only one way to address crime. Social services such as educational opportunities and job training can also be effective. Police departments should find ways to use predictive technologies to better allocate social service resources.

  5. Police could use predictive tools to anticipate which officers might engage in misconduct, but most departments have not done so;

    Evidence shows that police misconduct follows consistent patterns, and offering further training and support to officers who fit those patterns can help avert problems. Departments should pilot new programs designed to identify such patterns and prevent future misconduct.

  6. Predictive policing systems are failing to monitor their racial impact;

    Systems being deployed must be publicly audited and monitored for their disparate impact on different communities, with results broken out by race and neighborhood. Those disparities must be addressed.

Police departments around the country should consider these concerns as they implement powerful, but potentially oppressive and biased, technology designed to aid policing. Addressing these concerns will help ensure that the technology is used for good, not as a way to perpetuate injustice.

Predictive policing is only one of many civil rights issues evolving with the rise of policing technologies. For more on OTI’s work in this area, see our FCC complaint against the Baltimore Police Department’s illegal use of cell-site simulators and our work on police body-worn cameras. OTI is also involved in civil rights issues beyond policing technologies, including the Civil Rights Principles for the Era of Big Data, which formed the basis of a recent letter urging FCC Chairman Wheeler to consider those principles when adopting privacy rules for Internet service providers.