There’s an excellent story in the anthology Futuristica Volume 1 about a police bot that’s been trained on historical data and shoots an innocent young black man.

Originally shared by Gina Drayer

Surprise! An AI fed past bias (intentional or otherwise) turns out to be biased.

“Because AI systems learn to make decisions by looking at historical data, they often perpetuate existing biases. In this case, that bias was the male-dominated working environment of the tech world. According to Reuters, Amazon’s program penalized applicants who attended all-women’s colleges, as well as any resumes that contained the word ‘women’s’ (as might appear in the phrase ‘women’s chess club’).”

https://www.theverge.com/2018/10/10/17958784/ai-recruiting-tool-bias-amazon-report
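To make the mechanism concrete, here’s a minimal sketch of how this happens. Everything in it is invented for illustration (toy resumes, scikit-learn, a plain logistic regression); it is not Amazon’s actual system, just the general pattern of a model trained on biased historical outcomes:

```python
# A toy illustration (not Amazon's system; all data below is invented) of how
# a model trained on biased historical hiring outcomes penalizes a word.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical historical resumes and outcomes (1 = hired, 0 = rejected).
# The labels reflect a male-dominated past, not candidate quality.
resumes = [
    "software engineer python chess club",
    "software engineer java hiking club",
    "software engineer python women's chess club",
    "software engineer java women's coding society",
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight on the token is negative: the historical bias has been
# absorbed as a penalty on the word itself. (The default tokenizer splits
# "women's" into "women", so we look that token up.)
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])  # a negative number
```

The model never sees gender directly; it simply learns that a word correlated with past rejections predicts rejection, which is exactly how historical bias gets laundered into an apparently neutral score.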
