Algorithmic Policing
“How Cops Are Using Algorithms to Predict Crimes” – Wired, 2018, 12:29 – https://www.youtube.com/watch?v=7lpCWxlRFAw
A look into how police departments employ computer algorithms to identify likely crime spots. While police departments report efficiency gains and claim such predictive models reduce crime, social scientists have raised concerns about biases inherent in the process. Although algorithms are presented as neutral and objective, a large body of research shows that they reproduce social biases and further disadvantage social minorities. If the data on which an algorithm is trained are biased (and the justice system is overflowing with data on people of color), the algorithm's results will be biased as well. The likely crime areas the algorithm identifies are largely communities of color, but this reflects higher arrest rates rather than higher crime rates. Furthermore, this type of policing seems to benefit privileged offenders: white people dealing drugs out of their suburban homes, financial crimes committed in the central business district, and much more appear to be sidelined through algorithmic policing. We need to develop better policies and protocols for using such technology, and acknowledging these biases is a good place to start.
From the video’s description: The LAPD is one of a growing number of police departments using algorithms to try to predict crimes before they happen. Proponents of these tools say they provide cops with added tools to keep their cities safe -- but critics argue it's just another form of profiling.