Police Are Trying to Predict the Future
‘Predictive policing’ uses historic data to forecast where crime will occur, but critics question the accuracy

How can policing be improved? Politicians, police departments, activist groups, and the public all have their own ideas on how to meet that challenge. Yet while social media and smartphones have increased awareness of police brutality, an entirely different technology has been steadily and covertly expanding across public life.
PredPol is just one of a number of predictive policing companies whose products are based on artificial intelligence. PredPol’s software analyzes historical crime data to determine which parts of a city are more likely to see criminal behavior, helping police departments “predict” crime before it happens.
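PredPol has not published its production algorithm, but the general idea it describes — scoring areas of a city by where crimes have clustered in the past — can be illustrated with a toy sketch. Everything below (the grid resolution, the incident coordinates, the simple count-based scoring) is invented for illustration and is not PredPol’s actual method:

```python
from collections import Counter

# Toy illustration (NOT PredPol's actual algorithm): divide the city
# into a grid, count how many past incidents fell in each cell, and
# flag the highest-scoring cells as "hotspots" for extra patrols.

CELL_SIZE = 0.01  # grid resolution in degrees of lat/long (assumption)

# Hypothetical historical incidents as (latitude, longitude) pairs.
incidents = [
    (34.052, -118.243), (34.053, -118.244), (34.052, -118.244),
    (34.101, -118.300), (34.052, -118.243),
]

def cell_of(lat, lon, size=CELL_SIZE):
    """Map a coordinate to the index of the grid cell containing it."""
    return (int(lat // size), int(lon // size))

# Score each cell by its historical incident count.
counts = Counter(cell_of(lat, lon) for lat, lon in incidents)

# "Predict" by picking the top-k cells with the most past incidents.
hotspots = [cell for cell, _ in counts.most_common(2)]
```

The sketch also hints at the critics’ core objection: because the “prediction” is driven entirely by where crimes were recorded before, heavier patrolling of flagged cells produces more recorded incidents there, which feeds back into the next round of scores.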
Based on PredPol’s algorithmic recommendations, police patrol certain areas with more vigilance. The company claims that one in 33 Americans is protected by its data, implying that it is used by forces across the U.S.; OneZero asked PredPol to clarify this claim and to comment on this article, but received no response. PredPol uses confidential contracts, making it difficult to find out exactly which cities use the technology, but documents seen by Vice earlier this year show that the software has been used for years by police departments in Los Angeles, Santa Cruz, Seattle, Atlanta, Tacoma, Little Rock, and many others (including, somewhat surprisingly, the University of California, Berkeley). Secretive contracts mean these systems have largely avoided public scrutiny until recently, yet it’s likely that more of us are coming under the watchful eyes of companies like PredPol, Hunchlab, and others, either as criminal offenders within their systems or via police presence in our neighborhoods.
This type of “smart policing” assumes that machine learning will increase police efficiency and organizational processing, and is therefore key to the mandate to protect and to serve. Andrew Ferguson, a law professor at the University of the District of Columbia, told OneZero what this looks like in practice: “[Each company] is doing the same basic work, which is trying to find inputs that police think correlate with certain risk factors, and trying to either put the police car in the right…