A group of journalists has sued the Chicago Police Department to obtain public records about a controversial algorithm the department uses to predict how likely someone is to commit a crime.
The journalists, who include the Chicago Sun-Times and an independent journalist whose previous lawsuit against CPD led to the release of the Laquan McDonald shooting video, have filed a Freedom of Information Act lawsuit against the Chicago Police Department. They're claiming that the department has withheld public information about the algorithm, which identifies citizens who land on the department's Strategic Subject List, known as a "heat list."
The CPD's "heat list" is a list of hundreds of people the city has determined are likely to be involved in a crime, based on a computer algorithm. The specific variables the algorithm takes into account are unknown, though officials told The Verge that location, whether someone has previously been involved in a crime, and whether someone is socially connected to a known criminal or victim are part of it. More information on the CPD's algorithm is part of what the lawsuit hopes to uncover.
The lawsuit is also seeking information about the names on the list, how names are selected to be run through the algorithm, and how often the algorithm is updated.
Critics question whether predictive policing tools such as the CPD's heat list algorithm are racially biased, and some are skeptical that they are even effective in the first place. A RAND report found the practice did not reduce homicides, and in some cases "unnecessarily targets people for police attention."
“Since its inception, America has thrown a grotesque percentage, comparably, of people of color into a criminal system that destabilizes the communities where those people live,” Brandon Smith, the journalist responsible for unearthing the McDonald tape and a plaintiff in the suit, told Chicago Inno over email.
“It’s my goal to find and discuss the mechanisms by which this inequality of opportunity is carried out and perpetuated. This is the journalism of democracy—discovering why some well-defined groups of people, as opposed to random folks here and there, don’t get a fair shot at health, wealth, and happiness.”
The CPD introduced the algorithm after it received a $2 million grant in 2009 from the National Institute of Justice, which awarded money to departments for predictive policing.