Links for 2016-10-25

  • Founder of Google X has no concept of how machine learning as a policing tool risks reinforcing implicit bias

    This is shocking:

    At the end of the panel on artificial intelligence, a young black woman asked [Sebastian Thrun, CEO of the education startup Udacity, who is best known for founding Google X] whether bias in machine learning “could perpetuate structural inequality at a velocity much greater than perhaps humans can.” She offered the example of criminal justice, where “you have a machine learning tool that can identify criminals, and criminals may disproportionately be black because of other issues that have nothing to do with the intrinsic nature of these people, so the machine learns that black people are criminals, and that’s not necessarily the outcome that I think we want.” In his reply, Thrun made it sound like her concern was one about political correctness, not unconscious bias. “Statistically what the machines do pick up are patterns and sometimes we don’t like these patterns. Sometimes they’re not politically correct,” Thrun said. “When we apply machine learning methods sometimes the truth we learn really surprises us, to be honest, and I think it’s good to have a dialogue about this.”
    “the truth”! Jesus. We are fucked.

    (tags: google googlex bias racism implicit-bias machine-learning ml sebastian-thrun udacity inequality policing crime)
