
How is predictive policing still biased?

Mar 26, 2025 | Criminal Defense

Predictive policing is an effort to use data and computer models to predict where crimes will happen. The software analyzes a wide range of inputs, such as historical arrest rates and reports of criminal activity, and estimates where crime is most likely to occur so that police can concentrate their presence in those areas.

Many opponents of these systems claim that they are biased or even racist. But how could that be true when the predictions come from a computer that is simply analyzing data?

Where does it get that data?

The trouble is that the computer gets its data from the police department, and that data may reflect the department's own biases. The algorithm can then amplify those biases and make the problem worse.

For example, African Americans are arrested at roughly twice the rate of Caucasian Americans. When that arrest data is fed into a predictive policing algorithm, the model flags predominantly African American neighborhoods as high-risk, and those communities end up with a much heavier police presence. More officers in a neighborhood tend to produce more stops and arrests there, and those new records are fed back into the model, reinforcing the original pattern.
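To make that amplification concrete, here is a minimal, hypothetical simulation of the feedback loop. The two neighborhoods, the starting counts and the patrol-allocation rule are invented for illustration and are not drawn from any real predictive policing product or dataset; both neighborhoods are assumed to have identical true offense rates, but one starts with twice as many recorded arrests.

# Toy sketch of the feedback loop described above. All numbers and the
# allocation rule are hypothetical illustrations, not real data.

def simulate(rounds: int = 5) -> None:
    # Assume both neighborhoods have the same true number of offenses per period.
    true_offenses = {"A": 100, "B": 100}

    # Historical records start out skewed: neighborhood A has twice as many
    # recorded arrests, reflecting past enforcement patterns, not more crime.
    recorded_arrests = {"A": 200, "B": 100}

    for period in range(1, rounds + 1):
        # "Prediction" step: the neighborhood with more recorded arrests is
        # flagged as the hot spot and receives most of the patrol presence.
        hot = max(recorded_arrests, key=recorded_arrests.get)
        detection_rate = {hood: (0.8 if hood == hot else 0.2)
                          for hood in recorded_arrests}

        # Enforcement step: a larger share of offenses is observed and logged
        # wherever more officers are present.
        for hood, offenses in true_offenses.items():
            recorded_arrests[hood] += offenses * detection_rate[hood]

        ratio = recorded_arrests["A"] / recorded_arrests["B"]
        print(f"period {period}: recorded arrests A/B = {ratio:.2f}")


if __name__ == "__main__":
    simulate()

In this toy model the recorded-arrest gap between the two neighborhoods grows every period even though their underlying offense rates are identical, which is the self-reinforcing loop critics describe.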

But why do they have a higher arrest rate in the first place? Often, it comes down to the biases of individual officers. An officer may be more likely to let a Caucasian suspect off with a warning while arresting an African American individual in the same situation. That officer's bias is then baked into the predictive policing model.

This shows how hard it is to get fair and unbiased policing in the United States. Those who have been arrested need to know exactly what legal options they have, especially if they believe that discrimination may have played a role.