The Code That Launched Computational Policing and Modern Racial Profiling
Article In The Thread
Nov. 15, 2022
This is an excerpt from “You Are Not Expected to Understand This”: How 26 Lines of Code Changed the World, edited by Torie Bosch. Copyright © 2022 by Princeton University Press. Reprinted with permission from Princeton University Press.
In the early 1960s, the Black civil rights revolution raged in the streets across the United States. This quest to build a more racially just and equitable society happened right alongside the computer revolution. Soon the two fused with the advent of the Police Beat Algorithm (PBA), a software system to help police departments collect crime data and determine where to focus crime-fighting efforts — and one that would end up deeply affecting our society from the 1960s up through the present.
Why did the Police Beat Algorithm come to exist? What problems prompted the need for its formulation? Who developed it, and to what ends? The answers to each of these questions collectively tell a story about how a little-known computational experiment laid the cornerstone for what would become today’s surveillance infrastructure — one that has deeply and negatively affected communities of color across the globe.
In the early 1960s, IBM topped the list of the world’s leading computing companies. It innovated not only new computer hardware and systems but new ways of thinking about the computer’s role and utility in everyday society. In its 1965 annual report, IBM president Thomas J. Watson Jr. defined the computer as essentially a problem-solving tool and aligned the company’s mission accordingly.
IBM’s focus on problem-solving also dictated its marketing strategy. The company’s marketing representatives didn’t peddle prepackaged products. Rather, they engaged leaders in every major industry — from banking to transportation to the military — and simply asked, “What problem do you have?” Then, they promised to marshal IBM’s research and development strength to build customized solutions for its customers — solutions that could be broadly applied and widely scaled.
While IBM labored to market new computational solutions to social problems, uprisings materialized across the United States. In 1964 alone, so-called ghetto riots broke out in places like Harlem and Rochester in New York; Philadelphia, Pennsylvania; and Dixmoor, Illinois. These uprisings captivated the nation, as did the rampant white violence against those who marched for civil rights across the South. In a speech to Congress on March 15, 1965, President Lyndon Johnson proclaimed that America’s “Negro problem” was America’s problem. Citizens across the United States identified this fracture in “race relations” as the nation’s most pressing dilemma.
For most white Americans, however, the urban uprisings that plagued the nation revealed Black Americans' penchant for violence and criminality — so much so that President Johnson's white, Southern constituents thought solving America's crime problem should be his government's top priority. Heeding their agitation, Johnson, on July 23, 1965, formed the President's Commission on Law Enforcement and the Administration of Justice. The Commission's charge was to study the causes of, and find solutions to, America's crime problem.
Just 19 days later, one of the most deadly and costly uprisings erupted in Watts, Los Angeles. One too many incidents of police brutality at the hands of the Los Angeles Police Department set off six days of unrest. Hundreds of LAPD police officers flooded the streets. Fourteen thousand National Guard troops stormed the city. Law enforcement killed 34 Black residents and injured thousands more. More than $40 million worth of property was damaged during the siege.
Through the Watts uprisings, Black America sent a message to white America: We’re fed up. We’re tired of racism, discrimination, and police brutality. White Americans, however, saw Watts as confirmation of their prejudiced belief that Black people are lawless and violent. For the President’s Crime Commission, white America’s vision of the Watts uprisings put a face to the problem the president called on them to solve — a problem that they felt required an extraordinary remedy. They found great potential in the new computing technologies that had already revolutionized war and national defense. Computing held so much promise that in the spring of 1966, following the Watts uprisings, Johnson added the Science and Technology Task Force to the Commission to introduce new computational solutions to crime. The president justified the task force’s work by pointing to computing technology’s success in war, national defense, and space exploration:
The scientific and technological revolution that has so radically changed most of American society during the past few decades has had surprisingly little impact upon the criminal justice system. In an age when many executives in government and industry, faced with decision making problems, ask the scientific and technical community for independent suggestions on possible alternatives and for objective analyses of possible consequences of their actions, the public officials responsible for establishing and administering the criminal law . . . have almost no communication with the scientific and technical community. More than two hundred thousand scientists and engineers are helping to solve military problems, but only a handful are helping to control the crimes that injure or frighten millions of Americans each year.
While the president and the Commission held great hope for the solutions the Science and Technology Task Force would produce, they placed their hopes more specifically in the one man whom they appointed to lead it: Saul I. Gass.
Gass was a mathematician and operations research pioneer. In 1958 he wrote the first textbook on linear programming, a mathematical modeling technique that guides decisions (and, in large part, human behavior) by optimizing a linear objective subject to linear constraints among the relevant variables. Gass went to work for IBM in 1960 as project manager for the company's contract to develop the real-time computational systems needed for Project Mercury, the United States' first manned spaceflight program.
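For readers unfamiliar with the technique, a linear program in its generic textbook form (standard notation, not drawn from Gass's 1958 book) asks for the values of decision variables x that optimize a linear objective under linear constraints:

```latex
% Standard-form linear program (generic textbook notation, not Gass's own example):
\[
\begin{aligned}
\text{maximize}\quad   & c^{\top} x \\
\text{subject to}\quad & A x \le b, \\
                       & x \ge 0 .
\end{aligned}
\]
```

Here x might represent, for instance, how many patrol units to assign to each district, b a cap on available manpower, and c the payoff attached to covering each district; it is the same kind of resource-allocation question the PBA would later pose.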
By the time the president appointed him to lead the Science and Technology Task Force, Gass was managing all of IBM's federal systems projects. In heading the task force, Gass signaled his agreement with the Johnson administration that policing was the institution best equipped to solve America's crime problem, and to that end he developed the Police Beat Algorithm.
The Police Beat Algorithm was designed to address two broad planning questions: First, how should police departments equitably divide the geographic and demographic parameters of a municipal area? (Gass focused on “urban” areas based on population, crime levels, and demographic factors.) Second, how should police departments effectively deploy police resources (people, weapons, vehicles, etc.) based on these geographical divisions?
Interestingly, Gass frequently highlighted the need to solve these problems in order to develop “contingency riot and other emergency plans” — a growing concern directly tied back to Watts and similar uprisings.
The Police Beat Algorithm predominantly addressed four problems associated with police operations: 1) pattern recognition, identifying crime patterns within a set of crime data; 2) profiling, associating crime patterns with probable suspects; 3) dragnetting, linking probable suspects of one crime with past crimes or arrests; and 4) patrol positioning, determining how best to place patrols within the appropriate geographical divisions of the city based on where the most crimes take place and on suspect profiles that predicted who was most likely to commit them.
This is where planning problems and operational problems intersected. The Police Beat Algorithm was designed to focus on patrol positioning. Doing so relied on one primary component — the availability of crime data — and two key computational techniques, norming and weighting. Norming refers to analyzing the data to determine "normal" and aberrant ranges of criminal activity, both across a geographical area and for particular groups of criminal suspects (white people versus Black people, for example). Weighting, in this instance, was a means to rank the severity of different crimes. For example, crimes like homicide, rape, burglary, larceny, and auto theft were weighted with a score of four, signifying the most severe forms of crime. Some of the arbitrary — or dare I say biased — nature of these weights can be seen in the lack of weighted differentiation between crimes against persons like homicide on the one hand, and property crimes like car theft on the other. Traffic accidents received a weighted score of two, and drunkenness, a score of one. Geographical areas were weighted by the preponderance of crimes committed within their boundaries. The crime data, the statistical norms, the weights, and the geographical configurations of a city all figured into the Police Beat Algorithm.
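To make the mechanics concrete, here is a minimal sketch, in Python, of how weighting might translate reported crime into patrol assignments. The severity weights are the ones named above; everything else (the function names, the proportional-allocation rule, the sample beats) is a hypothetical illustration rather than the PBA's actual implementation, which this excerpt does not reproduce.

```python
# Illustrative sketch only: the PBA's actual code is not reproduced in the excerpt.
# The severity weights below come from the text; the function names, the
# proportional-allocation rule, and the sample data are assumptions for demonstration.

CRIME_WEIGHTS = {
    "homicide": 4, "rape": 4, "burglary": 4, "larceny": 4, "auto_theft": 4,
    "traffic_accident": 2, "drunkenness": 1,
}

def weighted_score(crime_counts):
    """Sum of (incident count x severity weight) for one geographic division."""
    return sum(CRIME_WEIGHTS.get(crime, 0) * n for crime, n in crime_counts.items())

def allocate_patrols(divisions, total_units):
    """Assign patrol units to divisions in proportion to their weighted crime scores."""
    scores = {name: weighted_score(counts) for name, counts in divisions.items()}
    total = sum(scores.values()) or 1
    return {name: round(total_units * score / total) for name, score in scores.items()}

# Hypothetical reported-crime counts for two beats.
beats = {
    "beat_1": {"burglary": 12, "auto_theft": 5, "drunkenness": 20},
    "beat_2": {"traffic_accident": 8, "drunkenness": 30},
}
print(allocate_patrols(beats, total_units=10))  # e.g. {'beat_1': 7, 'beat_2': 3}
```

Even in this toy form, the feedback loop is visible: the divisions with the most recorded crime receive the most patrols, and more patrols tend to generate more recorded crime the next time the scores are computed.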
In one respect, the PBA was developed to address a problem that framed Black people — primarily those who were poor and lived in urban environments — as predominantly responsible for crime and, as a result, as the problem that needed to be solved. The Police Beat Algorithm was therefore predetermined to geographically locate, isolate, and target Black and brown communities for police profiling, surveillance, and patrol and tactical unit distribution and deployment. All of the resulting data from these "solutions" could be used to forecast where crime was most likely to happen in the future and allow police to plan accordingly. To be sure, the framing of the problem, and the configuration of the Police Beat Algorithm itself, promised outcomes that were not so much predictive of future crime as they were self-fulfilling prophecies. Gass's PBA was essentially a proof of concept. Nevertheless, it was implemented in 1968 in the Kansas City, Missouri, Police Department's new Alert II Criminal Justice Information System. It was through this system that the PBA's racist impact was fully realized. Kansas City's "Operation Robbery Control" was just the first example of how the algorithm led police officials to make the tactical decision to concentrate personnel and weaponry on what was essentially the whole of East Kansas City, which housed the vast majority of the city's Black citizens.
Ultimately, the Police Beat Algorithm gave rise to thousands of similar systems designed and built throughout the seventies, eighties, nineties, and beyond. Over the decades, these algorithms have grown to include facial recognition, mobile surveillance, risk assessment, and other such tools used everywhere from local law enforcement to international security. The same logics and assumptions that motivated the creation of the PBA more than 50 years ago continue to permeate this array of contemporary law enforcement technologies. Fear of crime — still personified disproportionately by Black and brown people — continues to be greatly exaggerated, justifying exorbitant investment in developing more law enforcement technologies. Belief in the objective, infallible nature of data, and in its predictive power, continues to run rampant among technology purveyors, law enforcement personnel, public officials, and policy influencers. And stories about the disparate outcomes these technologies have on communities of color continue to roll in like a steady drumbeat. In these ways, today's law enforcement technologies are not new; they're just more sophisticated, insidious, ubiquitous, and impactful than when the PBA was first conceived more than half a century ago.