There’s an excellent story in the anthology Futuristica Volume 1 about a police bot that’s been trained on historical data and shoots an innocent young black man.
Originally shared by Gina Drayer
Surprise! An AI fed past biases (intentional or otherwise) turns out to be biased.
“Because AI systems learn to make decisions by looking at historical data they often perpetuate existing biases. In this case, that bias was the male-dominated working environment of the tech world. According to Reuters, Amazon’s program penalized applicants who attended all-women’s colleges, as well as any resumes that contained the word “women’s” (as might appear in the phrase “women’s chess club”).”
Yeah, you can't really feed an AI thousands of data points and then say "but ignore all our past mistakes."
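To make the mechanism concrete, here's a minimal sketch with made-up data (not Amazon's actual system or dataset): a toy resume screener trained on biased historical hiring decisions ends up assigning a negative weight to the token "women", even though gender is never an explicit input.

```python
# A toy illustration of how bias in historical labels leaks into a model.
# Hypothetical resumes and outcomes, purely for demonstration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical past decisions: 1 = hired, 0 = rejected.
resumes = [
    "captain of chess club, python developer",
    "python developer, hackathon winner",
    "java developer, robotics club",
    "captain of women's chess club, python developer",
    "women's coding society president, java developer",
    "women's robotics club, hackathon winner",
]
past_decisions = [1, 1, 1, 0, 0, 0]  # the bias lives here, in the labels

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, past_decisions)

# The learned weight for "women" comes out negative: the model reproduces
# the historical pattern rather than anything about skill or fit.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(f"weight for 'women': {weights['women']:.2f}")
```

The point of the sketch: nobody told the model to care about gender, but because the word correlates with rejections in the training labels, it gets penalized anyway, which is exactly the "ignore all our past mistakes" problem.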