Rise of the racist robots - how AI is learning all our worst impulses
In May last year, a stunning report claimed that a computer program used by a US court for risk assessment was biased against black prisoners. The program, Correctional Offender Management Profiling for Alternative Sanctions (Compas), was far more likely to mistakenly label black defendants as likely to reoffend, wrongly flagging them at almost twice the rate of white defendants (45% versus 24%), according to the investigative journalism organisation ProPublica.
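To make concrete what "wrongly flagging at almost twice the rate" means, here is a minimal Python sketch of a group-wise false positive rate calculation. The counts are made up purely to mirror the 45%/24% figures reported above; this is not ProPublica's dataset or analysis code.

```python
# Illustrative counts only (assumed, not real data): among defendants in each
# group who did NOT go on to reoffend, how many were still labelled high risk?
did_not_reoffend = {"black": 100, "white": 100}
flagged_high_risk = {"black": 45, "white": 24}

def false_positive_rate(group):
    """Share of non-reoffenders in `group` who were wrongly flagged as high risk."""
    return flagged_high_risk[group] / did_not_reoffend[group]

for group in did_not_reoffend:
    print(f"{group}: {false_positive_rate(group):.0%}")
# black: 45%
# white: 24%
```

A false positive rate of this kind is only one of several possible fairness measures; the disparity ProPublica highlighted concerns people who did not reoffend yet were scored as likely to.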
Sunday, August 13, 2017