Saturday, August 26, 2017

Uncovering discrimination in machine-learning software -- GCN

It's no secret that machine-learning algorithms can be problematic, even injecting or amplifying bias in decision-making processes. Software used by courts for sentencing decisions has been shown to make harsher recommendations for defendants of color. Alexandra Meliou and Yuriy Brun, both assistant professors at the University of Massachusetts at Amherst, have developed a new technique to automatically test software for discrimination.
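One common idea in automated fairness testing is to generate pairs of inputs that differ only in a protected attribute and measure how often the software's decision changes. The sketch below illustrates that idea in general terms; it is not the researchers' actual tool, and the decision function, attribute names, and thresholds are all hypothetical.

```python
import random

def loan_decision(income, debt, race):
    # Hypothetical decision function under test. It is deliberately
    # biased for illustration: it consults the protected attribute.
    score = income - debt
    if race == "B":
        score -= 10  # injected bias
    return score > 50

def causal_discrimination_rate(decide, trials=10000, seed=0):
    """Estimate how often flipping ONLY the protected attribute
    changes the decision, holding all other inputs fixed."""
    rng = random.Random(seed)
    flipped = 0
    for _ in range(trials):
        income = rng.uniform(0, 100)
        debt = rng.uniform(0, 50)
        if decide(income, debt, "A") != decide(income, debt, "B"):
            flipped += 1
    return flipped / trials

rate = causal_discrimination_rate(loan_decision)
print(f"estimated discrimination rate: {rate:.2%}")
```

A decision function that ignores the protected attribute would score a rate of zero under this test; the biased example above does not, which is exactly the kind of signal an automated tester can surface.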
