Uncovering discrimination in machine-learning software -- GCN
It's no secret that machine-learning algorithms can be problematic and can even inject or amplify bias in decision-making processes. Software used by courts for sentencing decisions has been shown to make harsher recommendations for defendants of color. Alexandra Meliou and Yuriy Brun, both assistant professors at the University of Massachusetts Amherst, have developed a new technique to automatically test software for discrimination.
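To give a flavor of what automated discrimination testing can look like, here is a minimal sketch of one common idea: generate random inputs, flip only a protected attribute, and check whether the software's decision changes. Everything below is hypothetical for illustration — the `decision_model` (with a deliberately injected bias), the attribute names, and the helper function are not taken from the researchers' actual tool.

```python
# Hedged sketch of causal discrimination testing.
# All names and the biased model are hypothetical, for illustration only.
import random

def decision_model(applicant):
    # Hypothetical software under test: scores by income, but
    # (deliberately, to demonstrate the test) penalizes one race value.
    score = applicant["income"] / 1000
    if applicant["race"] == "B":  # injected bias for demonstration
        score -= 5
    return score > 50

def causal_discrimination_rate(model, protected_attr, values,
                               trials=1000, seed=0):
    """Estimate how often flipping ONLY the protected attribute
    changes the model's decision on a randomly generated input."""
    rng = random.Random(seed)
    flips = 0
    for _ in range(trials):
        applicant = {"income": rng.randint(10_000, 100_000),
                     "race": rng.choice(values)}
        baseline = model(applicant)
        for v in values:
            if v == applicant[protected_attr]:
                continue
            variant = dict(applicant, **{protected_attr: v})
            if model(variant) != baseline:
                flips += 1
                break
    return flips / trials

rate = causal_discrimination_rate(decision_model, "race", ["A", "B"])
print(f"estimated causal discrimination rate: {rate:.2%}")
```

Because the injected bias only matters for applicants whose score sits within 5 points of the threshold, the estimated rate comes out as a small but clearly nonzero fraction — exactly the kind of signal an automated fairness tester would flag.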
Saturday, August 26, 2017