Links for 2019-01-10

  • A UK police force is dropping tricky cases on advice of an algorithm

    Wow, this is a terrible idea. It will definitely launder existing human bias into its decisions.

    However, because the technology bases its predictions on past investigations, any biases contained in those decisions may be reinforced by the algorithm. For example, if there are areas that don’t have CCTV and police frequently decided not to pursue cases there, people in those places could be disadvantaged. “When we train algorithms on the data on historical arrests or reports of crime, any biases in that data will go into the algorithm and it will learn those biases and then reinforce them,” says Joshua Loftus at Stanford University in California. […]

    Police forces only ever know about crimes they detect or have reported to them, but plenty of crime goes unreported, especially in communities that have less trust in the police. This means the algorithms are making predictions based on a partial picture. While this sort of bias is hard to avoid, baking it into an algorithm may make its decisions harder to hold to account compared with an officer’s. John Phillips, superintendent at Kent Police, says that for the types of crimes that EBIT is being used for, under-reporting isn’t an issue and so shouldn’t affect the tool’s effectiveness.
    …well, I guess that’s OK then? I would have assumed under-reporting would be a massive source of bias, alright…. (A toy sketch of the feedback loop the quote describes is below the tags.)

    (tags: bias machine-learning ml ai cctv police uk kent policing)
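The bias-reinforcement mechanism Loftus describes is easy to show with a toy simulation. The sketch below is emphatically not EBIT: the two areas, the seed pursue rates, and the 0.6 triage threshold are all made-up assumptions, chosen only to illustrate the loop where a model fitted to historically biased pursue/drop decisions scores the under-policed area lower, and then has its own decisions recycled as training data.

```python
# Toy sketch (not EBIT) of the feedback loop described in the quote: both
# areas contain identical cases by construction, but the seed history records
# officers pursuing fewer cases in area "B" (e.g. no CCTV coverage there).
# A triage rule trained on that history drops B cases more often, and its own
# decisions are fed back into the next round of training data.
import random

random.seed(1)

# Seed history of (area, was_pursued) decisions, with a human bias against B.
history = []
for _ in range(1000):
    area = random.choice("AB")
    pursue_prob = 0.8 if area == "A" else 0.5   # assumed, illustrative numbers
    history.append((area, random.random() < pursue_prob))

def train(history):
    """'Model' = per-area pursue rate estimated from past decisions."""
    return {
        area: sum(p for a, p in history if a == area) /
              sum(1 for a, _ in history if a == area)
        for area in "AB"
    }

THRESHOLD = 0.6  # triage: only pursue cases in areas the model rates promising

for round_no in range(1, 6):
    model = train(history)
    print(f"round {round_no}: pursue rate A={model['A']:.2f}  B={model['B']:.2f}")
    for _ in range(1000):
        area = random.choice("AB")
        pursued = model[area] > THRESHOLD   # decision made by the model...
        history.append((area, pursued))     # ...and fed back as training data
```

Since the two areas are identical by construction, the widening gap in the printed pursue rates comes entirely from the seeded human bias being learned and then amplified each round, which is the laundering effect the quote warns about.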
