With the recent officer-involved shootings in Ferguson, Baltimore, Texas, and beyond, the issue of police conduct has been grabbing headlines and launching national discussions about race, police violence, and when an officer shooting is truly justified.

But another question to examine is: Why did the shooting happen in the first place? What caused the officer to be in an adverse interaction with a civilian? And how can we better identify when an officer needs an intervention or counseling before a mistake is made?

These are the questions a team of data scientists set out to answer this summer at the University of Chicago’s Data Science for Social Good fellowship. Launched three years ago by former Obama for America Chief Data Scientist Rayid Ghani, the program uses tools like predictive modeling and machine learning to tackle civic problems ranging from preventing fraud to increasing college completion rates.

As part of the White House’s Police Data Initiative, a team of four fellows (Sam Carton, Kenny Joseph, Ayesha Mahmud, and Youngsoo Park), along with technical mentor Joe Walsh, wanted to see if they could use data to prevent police brutality before it happens. The goal was to identify officers at high risk of being involved in an adverse interaction and then provide them with an intervention such as training, counseling, or other help.

The team went to North Carolina to work with the Charlotte-Mecklenburg Police Department to improve the department’s Early Intervention System, which was already using some data to flag officers who were at risk of having an adverse interaction.

But the department’s existing system needed improvement. If an officer logged three uses of force within 90 days, the system notified a supervisor, but that was about as deep as the data analysis went. The UChicago data science fellows wanted to examine a broader range of factors to determine which officers needed help, and when.
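To make that limitation concrete, the kind of rule the department relied on can be written in a few lines. The sketch below is purely illustrative; the data, field names, and function are hypothetical, not CMPD’s actual system:

```python
from datetime import date, timedelta

# Hypothetical use-of-force log keyed by officer ID.
# These records are invented for illustration only.
USE_OF_FORCE_LOG = {
    "officer_123": [date(2015, 6, 1), date(2015, 6, 10), date(2015, 7, 20)],
    "officer_456": [date(2015, 2, 3)],
}

def should_notify_supervisor(officer_id, as_of, window_days=90, threshold=3):
    """Flag an officer whose use-of-force count meets the threshold
    within the trailing window -- the simple rule described above."""
    cutoff = as_of - timedelta(days=window_days)
    recent = [d for d in USE_OF_FORCE_LOG.get(officer_id, []) if cutoff <= d <= as_of]
    return len(recent) >= threshold

print(should_notify_supervisor("officer_123", as_of=date(2015, 7, 31)))  # True
print(should_notify_supervisor("officer_456", as_of=date(2015, 7, 31)))  # False
```

A rule like this catches only one signal (recent uses of force) and ignores context such as workload, location, and time of day, which is exactly the gap the fellows set out to close.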

“When we went down to Charlotte, the officers said that their system wasn’t really taking into account all the factors that are important when these events happen,” fellow Ayesha Mahmud said. “There was no time of day, location, and if the officer had been working a lot prior to the event. We wanted to take a broader set of data into account.”

So the team began analyzing a much wider set of data: officer activity, including traffic stops, arrests, and the rate of discretionary arrests compared with other officers in the same division; employment information, such as how many hours of extra duty an officer has worked; and personal information, such as demographics and education level.

“The goal is to take into account this greater set of data, and maybe use some modeling that’s more complicated (than their current system),” Mahmud said. “The idea is that hopefully they would be able to target their training towards the officers who needed it the most.”
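The fellows have not published their model, but a rough sense of what “more complicated modeling” over a broader feature set can look like is sketched below. The feature names, data, and choice of a random-forest classifier are all assumptions made for illustration, not the project’s actual approach:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical per-officer feature table. Columns mirror the kinds of
# signals the article mentions (activity, workload, background), but the
# names and values are invented for illustration.
df = pd.DataFrame({
    "traffic_stops_90d":         [40, 12, 55, 8, 33, 21],
    "arrests_90d":               [10, 3, 14, 2, 9, 5],
    "discretionary_arrest_rate": [0.40, 0.10, 0.50, 0.05, 0.30, 0.20],
    "extra_duty_hours_90d":      [60, 5, 80, 0, 45, 10],
    "years_of_service":          [3, 12, 2, 20, 6, 9],
    "had_adverse_interaction":   [1, 0, 1, 0, 1, 0],  # label to predict
})

X = df.drop(columns="had_adverse_interaction")
y = df["had_adverse_interaction"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, stratify=y, random_state=0
)

# A tree ensemble is one common choice for tabular risk scoring;
# the actual project may well have used something else.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Rank held-out officers by predicted risk so that limited training and
# counseling resources can be targeted at the highest-risk group first.
risk_scores = model.predict_proba(X_test)[:, 1]
print(risk_scores)
```

The output of a model like this is a risk score per officer rather than a yes-or-no flag, which is what would let a department prioritize training for the officers who appear to need it most.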

Over the course of 14 weeks, the team saw several officer indicators that stood out when predicting adverse interactions, but Mahmud was hesitant to go into detail because the results are extremely preliminary. She noted that more departments and much more data would need to be analyzed before presenting concrete findings.

But one takeaway the team did find was that stress was a significant factor.

“One of the things (we found) was stress level-type factors,” she said. “Number of extra hours the officer worked, things like that.”

The project will continue for at least another year with the goal of bringing more police departments on board and refining the university’s model.

Mahmud acknowledged that data alone won’t be enough to prevent an officer from making a mistake in the line of duty, but it could be one step in the intervention process and take some of the burden off supervisors, who are largely responsible for identifying at-risk officers.

“We did show some improvement over their current system using this data,” she said. “I think even if the data can’t capture the whole story, it can play a pretty important role.”
