updates and ideas from the D&S community and beyond
New Listen Episode: Marie Hicks on her latest book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing.
Around the Institute
Data & Society is delighted to announce its fourth class of fellows! The group’s dynamic range of experience and expertise spans art, investigative journalism, computer science, law, the history of science and technology, privacy studies, and more.
In addition to the new fellows, we will welcome an incoming Ford-Mozilla Open Web Fellow. Apply now to join us as an Open Web Fellow this fall!
“At the very least, perhaps it is time to more rigorously delineate the contours and parameters of the contemporary hybrid technology/media company, and to begin articulating if and how these companies should fit into existing legal, regulatory, and journalistic frameworks; or whether new or modified frameworks that reflect their hybrid nature need to be devised.” —Philip M. Napoli, Robyn Caplan
In honor of #ChoosePrivacy Week, librarians can find curriculum and learning modules on digital privacy for libraries via the Data Privacy Project. In addition, Data & Society is delighted to partner with the University of Wisconsin-Milwaukee Center for Information Policy Research and the American Library Association’s Office of Intellectual Freedom for “Library Values & Privacy in our National Digital Strategies,” a national forum exploring what the library value of privacy means in the digital world, generously funded by the Institute of Museum and Library Services.
Do fictional futures exercise power over our present? Applications due next Friday for Future Perfect, a multidisciplinary micro-conference on the uses, abuses, and paradoxes of speculative futures (to be held Friday, June 16).
Jobs, jobs, jobs!
Around the Web
“Algorithms aren’t going to go away, and I think we can all agree that they’re only going to become more prevalent and powerful. But unless academics, technologists and other stakeholders determine a concrete process to hold algorithms and the tech companies behind them accountable, we’re all at risk.” —Megan Rose Dickey
“The Compas report, a prosecutor told the trial judge, showed ‘a high risk of violence, high risk of recidivism, high pretrial risk.’ The judge agreed, telling Mr. Loomis that ‘you’re identified, through the Compas assessment, as an individual who is a high risk to the community.’” —Adam Liptak
“When asked about racially disparate policing practices, the [Taser] spokesperson said that the ‘huge gain in information fidelity and transparency in video (versus text) is something that we believe can identify such bias.’” —Ava Kofman