Protesters in Detroit are demanding that the city end its contract with a provider of facial recognition software that studies have shown has trouble identifying Black faces.
Nationwide civil unrest following the police killing of George Floyd has renewed scrutiny of policing, including the growing use of artificial intelligence in law enforcement. This month, Amazon, IBM and Microsoft said they would pause or limit sales of facial recognition software to police, but activists say the problem of bias in artificial intelligence goes much further.
Here & Now's Jeremy Hobson speaks with Ina Fried, chief technology correspondent at Axios.
This article was originally published on WBUR.org.