Linear Digressions

Discriminatory Algorithms


Sometimes when we say an algorithm discriminates, we mean it can tell the difference between two types of items. But in this episode, we'll talk about another, more troublesome side to discrimination: algorithms can be... racist? Sexist? Ageist? Yes to all of the above. It's an important thing to be aware of, especially when doing people-centered data science. We'll discuss how and why this happens, and what solutions are out there (or not).

Relevant Links:
http://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html
http://techcrunch.com/2015/08/02/machine-learning-and-human-bias-an-uneasy-pair/
http://www.sciencefriday.com/segments/why-machines-discriminate-and-how-to-fix-them/
https://medium.com/@geomblog/when-an-algorithm-isn-t-2b9fe01b9bb5#.auxqi5srz

Next Episodes

Recommendation Engines and Privacy @ Linear Digressions

📆 2016-03-28 04:46 / 00:31:33

Congress Bots and DeepDrumpf @ Linear Digressions

📆 2016-03-11 05:17 / 00:20:47

Multi-Armed Bandits @ Linear Digressions

📆 2016-03-07 03:44 / 00:11:29