
Dec 16, 2016

12.16: nonconsensual image sharing; so long, 2016


updates and ideas from the D&S community and beyond

—calls—
Last call for 2017-18 D&S fellowship applications; deadline: MONDAY, DEC 19!
We’re also hiring a Human Resources Manager and a Research Analyst; apply by Jan 11 (extended from Dec 19).
NEW: Propaganda and Media Manipulation workshop; apply by Feb 15.

Around the Institute

New report shows that 4% of U.S. internet users have been victims of nonconsensual pornography
Our new report with the Center for Innovative Public Health Research (CiPHR) offers the first national statistics on the prevalence of nonconsensual explicit image sharing, also known as “revenge porn.”

“Nonconsensual Image Sharing” complements our earlier report, also with CiPHR, covering the prevalence of online harassment and abuse more broadly.

So long, 2016…
We’re taking a short winter break from this newsletter, but before we sign off, a couple of reminders:

Points, the Medium publication we launched at the top of the year, is now home to more than fifty essays and provocations covering topics from bots to the future of labor to the public sphere in an era of algorithms and tons more. Read them all! (And big thanks to all the family and friends who contributed pieces this year.)

Catch up on Databites. Videos of Data & Society’s speaker series can be found on an Internet near you. Our latest is Bruce Schneier’s discussion of Security and Privacy in a Hyper-connected World.

Dear Subscriber, It looks like you enjoy newsletters. You might also like our Enabling Connected Learning project’s joint newsletter with the Youth and Media team at the Berkman Klein Center for Internet & Society at Harvard University (whew, love all those words): Student Privacy, Equity, and Digital Literacy.

Finally, for an overview of Data & Society’s second year, flip through our fancy 2015-2016 Report on Activities.

Onward.

More 2016 Highlights

Best Practices for Conducting Risky Research and Protecting Yourself from Online Harassment
Alice Marwick, Lindsay Blackwell, and Katherine Lo developed a set of best practices for researchers who wish to engage in research that may make them susceptible to online harassment.

Mediation, Automation, Power
“Can Facebook determine the outcome of the 2016 election?” Robyn Caplan and danah boyd asked in the introduction to this primer on algorithms and the public sphere. Bonus: Ethan Zuckerman on Fred Turner: the link from anti-fascist art to the “historical problem” of Facebook.

Auditing Black-box Models by Obscuring Features
“…we present a technique for auditing black-box models: we can study the extent to which existing models take advantage of particular features in the dataset without knowing how the models work.” —Philip Adler, Casey Falk, Sorelle A. Friedler, Gabriel Rybeck, Carlos Scheidegger, Brandon Smith, Suresh Venkatasubramanian

The hidden story of how metrics are being used in courtrooms and newsrooms to make more decisions
“Judges, prosecutors, lawyers, and police officers argued about charts, Excel spreadsheets, paper clippings, and blackboards with numbers on them. Numbers were hotly debated, contested, and manipulated by everybody involved – except, of course, by the defendants, who were largely silent in the process.” —Angèle Christin

The Truth About How Uber’s App Manages Drivers
“Most conversations about the future of work and automation focus on issues of worker displacement. We’re only starting to think about the labor implications in the design of platforms that automate management and coordination of workers.” —Alex Rosenblat

Networks of New York: An Illustrated Field Guide to Urban Internet Infrastructure
Ingrid Burrington published a book that maps and makes visible the material infrastructure of the Internet in our fair city. Bonus: Welcome to Networks Land! Ingrid’s collection, with Surya Mattu, of educational activities for learning how the Internet works “behind and beyond the screen.”

Bundle of intelligence
Our Intelligence & Autonomy project dropped a cluster of terrific papers this year — Moral Crumple Zones; The Wisdom of the Captured; Regional Diversity in Autonomy and Work; Discriminating Tastes — not to mention an elegant little book, An AI Pattern Language.

An App to Save Syria’s Lost Generation?
Mark Latonero considered attempts by policymakers, big tech companies, and advocates to use technology to address the refugee and migrant crisis and, in particular, the educational needs of displaced children.

Data, Tech, Learning
Our Enabling Connected Learning project published three primers, plus a raft of essays, on contentious areas of intersection between technology and education, including Personalized Learning, Accountability, and Advertising.

When Websites Won’t Take No for an Answer
Natasha Singer looked into techniques that “maneuver people into signing up for services they might not actually want.”

Perspectives on Big Data, Ethics, and Society
The Council for Big Data, Ethics, and Society released a white paper consolidating conversations and ideas from two years of meetings and discussions. More: All of the Council’s work and case studies are available here.

Where We Live and How We Die
“This gargantuan effort of classifying the world’s deaths is done with the goal of ensuring that all who die are counted. Each year the task becomes more complicated (consider that when the [Global Burden of Disease Study] began in 1990, it only encoded 120 causes of death; today, there are more than 350).” —Mimi Onuoha

Around the Internet

External opportunities, events, and things Data & Society is reading
An abundance of interesting and useful activity is happening outside of D&S. Follow along with what we’re reading and check out relevant external events and opportunities on our site’s Links page.