
D&S Fellow Angèle Christin was quoted in this Truthdig piece dissecting biased sentencing algorithms.

“Because these algorithms are using past data to build the algorithm, and past data is by definition skewed because of the history of racial discrimination in the U.S. and everywhere, the result is the algorithms are necessarily reproducing these inequalities,” Angèle Christin, a postdoctoral fellow at the Data & Society Research Institute, told Truthdig. “While of course race is never included in the statistical models that are being used to build these algorithms, you have many variables that can serve as proxies. For example, some models rely on ZIP codes or the criminal records of family members, which are significantly correlated with race.”
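The proxy effect Christin describes can be made concrete with a small sketch. The toy example below is not one of the models discussed in the article; it uses synthetic data and hypothetical variable names (zip_code, rearrested) to show how a feature correlated with race can let a "race-blind" model reproduce a racial disparity baked into historical labels.

```python
# Toy illustration of a proxy variable: race is never a model input,
# but ZIP code correlates with it, so the model inherits the disparity
# present in the historical labels. All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hidden group membership (never shown to the model) and a ZIP code
# that aligns with it 80% of the time (residential segregation).
race = rng.integers(0, 2, size=n)
zip_code = np.where(rng.random(n) < 0.8, race, 1 - race)

# Historical labels reflect past uneven enforcement: group 1 was
# re-arrested at a higher rate for otherwise similar behavior.
rearrest_rate = 0.2 + 0.2 * race
rearrested = rng.random(n) < rearrest_rate

# Train on the "race-blind" feature only.
X = zip_code.reshape(-1, 1)
model = LogisticRegression().fit(X, rearrested)
scores = model.predict_proba(X)[:, 1]

print("mean predicted risk, group 0:", round(scores[race == 0].mean(), 3))
print("mean predicted risk, group 1:", round(scores[race == 1].mean(), 3))
# The two means differ even though race was excluded: the ZIP code
# proxy carries the historical disparity into the model's predictions.
```

Running the sketch prints a noticeably higher average risk score for group 1, illustrating the point in the quote: removing race from the feature list does not remove its influence when correlated variables remain.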