2 Comments

The point about the source material used to train an AI is spot on. A few years ago (after I left the place), Amazon admitted that its machine-learning-based recruiting system was biased towards males because most of the résumés used to train it came from men. This Reuters article is a good explanation of what happened and how: https://www.reuters.com/article/idUSKCN1MK0AG

author

Thanks, Annie! Kate Crawford mentions Amazon's HR issues with AI in her book Atlas of AI. I'd forgotten you worked there for a while. Gender bias shows up in so many ways with AI output, as does racial bias. And because the bias is baked into the training data, it's going to be very difficult to remedy this weakness.
