

D&S founder danah boyd’s prepared remarks for a public roundtable in the European Parliament on algorithmic accountability and transparency in the digital economy were adapted in this Points piece.

I believe that algorithmic transparency creates false hope. Not only is it technically untenable, but it obfuscates the real politics that are at stake.


D&S advisor Ethan Zuckerman wrote an op-ed detailing the issue of normalizing the abnormal.

My deep fear is that there’s no single set of Hallin’s spheres anymore. What’s consensus to a Trump supporter may be deviant to a Clinton supporter and vice versa. We now face an online media landscape so diverse and fragmented that each of us finds big enough spheres of legitimate controversy that we think we’re seeing a real debate at work.


D&S affiliate Angèle Christin was quoted in this Wired piece discussing the pervasiveness of election news online.

In that process, such conversations start to invade “social areas that are usually sheltered from heated political discussions,” says Angèle Christin, professor of communication at Stanford. She says social media, including seemingly anodyne environments like a parenting forum, actually accentuate the problem because they blend the private and public.


D&S fellow Alice E. Marwick wrote this op-ed discussing how online harassment disproportionately impacts women and minorities.

In a divisive time for American society, it’s crucial that everyone is heard. Social media companies need to take a stand and ensure that destructive online behavior doesn’t turn people away from sharing their voices.


D&S researcher Robyn Caplan writes about fake news and Facebook in the NYT.

Media organizations were especially left behind when Facebook changed its algorithm in June to privilege friends and family over major publishers, which happens to have occurred just prior to the spike in fake news before the election. Facebook must create institutionalized pathways for journalists and policymakers to help shape any further changes to the algorithm.


D&S founder danah boyd writes her response to the recent “Online Harassment, Digital Abuse, and Cyberstalking in America” report.

27% of all American internet users self-censor what they post online out of fear of online harassment. Young people are especially prone to self-censorship. This is deeply disturbing. We often worry about free speech online, but we don’t consider the social factors that prompt people to censor themselves or the ways in which this impacts some groups more than others.


D&S post-doctoral scholar Caroline Jack responds to “Gig Work, Online Selling and Home Sharing” from Pew Research Center.

The more that is known about the workers and the work of the on-demand economy, the stronger the call for platform builders to make systems for sustainable work: systems that acknowledge the lived conditions and external factors that affect workers.


D&S researcher Mary Madden writes this Points piece defending polls.

What we need to move forward is a triangulation of methods — using a combination of qualitative and quantitative analysis and supporting a publishing process that elevates transparency and open critique.


report | 11.21.16

Social Media Use by Americans, 2016 (Data Memo)

Amanda Lenhart, Michele Ybarra (CiPHR), Myeshia Price-Feeney (CiPHR)

This study, part of a larger study of online harassment, asked internet users whether they use three types of platforms: Social media sites like Facebook, LinkedIn, or Instagram; video games played online with other people; and online discussion sites like Reddit or Digg.


report | 11.21.16

Online Harassment, Digital Abuse, and Cyberstalking in America

Amanda Lenhart, Michele Ybarra (CiPHR), Kathryn Zickuhr, Myeshia Price-Feeney (CiPHR)

The internet and digital tools play an increasingly central role in how Americans engage with their communities: How they find and share information; how they connect with their friends, family, and professional networks; how they entertain themselves; how they seek answers to sensitive questions; how they learn about—and access—the world around them. The internet is built on the ideal of the free flow of information, but it is also built on the ideal of free-flowing discourse.

However, one persistent challenge to this ideal has been online harassment and abuse—unwanted contact that is used to create an intimidating, annoying, frightening, or even hostile environment for the victim and that uses digital means to reach the victim. As with their traditional expressions, online harassment and abuse can affect many aspects of our digital lives. Even those who do not experience online harassment directly can see it and respond to its effects; even the threat of harassment can suppress the voices of many of our citizens.

In order to explore these issues and the ways that online environments affect our experiences online, this report examines American teens’ and adults’ experiences with witnessing, experiencing, and responding to the aftermath of online harassment and abuse.


Download: full report | methods | press release | Social Media Use by Americans, 2016 (Data Memo)

Additional Reports: Nonconsensual Image Sharing | Intimate Partner Digital Abuse


Acknowledgements

This report was made possible by a grant from the Digital Trust Foundation. The authors would like to thank the Foundation for their support of this project. In addition to the named authors, we want to acknowledge and thank the other individuals who contributed to this report: Hannah Madison, Emilie Chen, Chantel Gammage, Alexandra Mateescu, Angie Waller, Seth Young, and Shana Kimball. We would also like to thank our advisors and reviewers for their help in thinking through the questions to ask and their feedback on the report. Our advisors and reviewers include danah boyd, Monica Bulger, Maeve Duggan, Rachel Hartman, Amanda Levendowski, and the team at the Safety Net Technology Project at the National Network to End Domestic Violence.


points | 11.18.16

Phones, but No Papers

Julia Ticona

D&S post-doctoral scholar Julia Ticona responds to “Gig Work, Online Selling and Home Sharing” from Pew Research Center.

Contingent work has always been prevalent in communities where workers have been historically excluded from secure jobs, from union membership, and even from wider public forms of social welfare through systemic forms of discrimination. For these workers, there was no “golden era” of plentiful stable work and a strong social safety net. Despite these long-standing trends, emerging forms of on-demand labor, and the data-driven technologies that workers interact with, can deepen the vulnerabilities of certain populations of workers.


D&S advisor Christina Xu writes about fake news and conspiracy theories in China.

Here in China, even well-educated and progressive friends have sincerely asked me about some pretty niche conspiracies. Did Hillary really assassinate someone? (No.) Didn’t Trump win 90% of the vote? (No.) Yesterday, someone even mentioned that they really liked a poem he wrote about his vision for America’s future. (What.)


D&S advisor Baratunde Thurston weighs in on the debate over the place of empathy after the Trump election.


Harvard Business Review | 11.17.16

What Motivates Gig Economy Workers

Alex Rosenblat

D&S researcher Alex Rosenblat writes about the motivations of gig economy workers.

In sum, the effects of the gig economy on the workforce are mixed. These platforms seem to benefit people earning supplementary income or those lacking other job opportunities the most, while they impose the most risk on full-time earners. And Uber and Lyft are still facing legal challenges in the U.S. for classifying drivers as independent contractors, as opposed to employees who can receive benefits. (In the U.K., an employment tribunal recently ruled that two Uber drivers must receive employee benefits, like the national living wage. Uber plans to appeal that ruling.)


MIT Technology Review | 11.17.16

How to Hold Algorithms Accountable

Nicholas Diakopoulos, Sorelle Friedler

D&S affiliate Sorelle Friedler, with Nicholas Diakopoulos, discusses five principles for holding algorithmic systems accountable.

Recent investigations show that risk assessment algorithms can be racially biased, generating scores that, when wrong, more often incorrectly classify black defendants as high risk. These results have generated considerable controversy. Given the literally life-altering nature of these algorithmic decisions, they should receive careful attention and be held accountable for negative consequences.


D&S researcher Tim Hwang participates in episode 11 of Big Thinkers.


D&S advisor Deirdre Mulligan, with co-authors Colin Koopman and Nick Doty, released “Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy”.

The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy’s disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy’s essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy’s multiple uses across multiple contexts.


D&S advisor Ethan Zuckerman provides his thoughts on the recent 2016 presidential election and insurrectionism.

I didn’t want to see a Trump presidency, and the rise of insurrectionism to the highest levels of the American government scares the crap out of me. But scarier is the endless blame game I hear my allies engaged in, figuring out whether we blame the media, the FBI or anyone other than ourselves for this loss. We have a brief opportunity to figure out how to make social change in an age of high mistrust and widespread insurrectionism. It would be a shame if Donald Trump figured out how to harness this power and the progressives lined up against him failed to do so.


D&S affiliate Gideon Lichfield wrote this op-ed about the recent US presidential election.


Tampa Bay Times | 11.11.16

How data failed us in calling an election

Steve Lohr, Natasha Singer

D&S affiliate Natasha Singer and Steve Lohr discuss the role of data in the 2016 election.

Virtually all the major vote forecasters, including Nate Silver’s FiveThirtyEight site, the New York Times‘ Upshot and the Princeton Election Consortium, put Clinton’s chances of winning in the 70 percent to 99 percent range. The election prediction business is one small aspect of a far-reaching change across industries that have increasingly become obsessed with data, the value of it and the potential to mine it for cost-saving and profitmaking insights. It is a behind-the-scenes technology that quietly drives everything from the ads that people see online to billion-dollar acquisition deals.


D&S fellow Zara Rahman writes about digital access and control.

Recognising the political importance of our technical decisions is within reach, leading ultimately to reclaiming power and control of our activism in the digital sphere as well as in the offline world.


CultureDigitally.org | 11.11.16

more Culture Digitally scholars reflect on the election and our scholarship going forward

Tarleton Gillespie, Mary Gray, Kate Miltner, Ted Striphas, Ilana Gershon, Paul Hillier, Mike Annany, Hector Postigo

D&S advisor Tarleton Gillespie, with several scholars, continues sharing essays on the U.S. election and the implications on scholarship.

As we said yesterday, we know the scholars in this community cannot address every issue that’s likely on the horizon, but we think our work touches a surprising number of them. The kinds of questions that motivate our scholarship — from fairness and equity, to labor and precarity, to harassment and misogyny, to globalism and fear, to systems and control, to journalism and ignorance — all of these seem so much more pressing now than they did just a few days ago. If we’re going to have four years of a Trump presidency and all that goes with it, and one way we can respond is through our scholarship, we want to start today in stridently moving that forward.


CultureDigitally.org | 11.10.16

at Culture Digitally, we’re thinking about our scholarship in the harsh light of this week

Tarleton Gillespie, Hector Postigo, Jonathan Sterne, Thomas Streeter, Seth Lewis, Kate Zyskowski, Tom Boellstorff, Shawn Walker, Josh Braun, Jesse Houf, Robert Gehl, Jessa Lingel, Sarah Banet-Weiser

D&S advisor Tarleton Gillespie joins other scholars in considering the place of research and scholarship in a post-Trump USA.

But as scholars, we do a disservice to allow for simple or single explanations. “Perfect storm” has become a cliche, but I can see a set of elements that had to all be true, that came together, to produce the election we just witnessed: Globalization, economic precarity, and fundamentalist reactionary responses; the rise of the conservative right and its target tactics, especially against the Clintons; backlashes to multiculturalism, diversity, and the election of President Obama; the undoing of the workings and cultural authority of journalism; the alt-right and the undercurrents of social media; the residual fear and anxiety in America after 9/11. It is all of these things, and they were all already connected, before candidate Trump emerged.


D&S founder danah boyd questions the usefulness of election polls and calls for a moratorium on reporting them.

It’s now time for the media to put a moratorium on reporting on election polls and fancy visualizations of statistical data. And for data scientists and pollsters to stop feeding the media hype cycle with statistics that they know have flaws or will be misinterpreted as fact.


D&S founder danah boyd asserts that the media must be held accountable in producing and disseminating information.

We live in a world shaped by fear and hype, not because it has to be that way, but because this is the obvious paradigm that can fuel the capitalist information architectures we have produced.


D&S advisor Anil Dash calls for action after the 2016 election.

But the work to be done in fighting Donald Trump is not unprecedented. All of us who are targets of his rhetorical attacks and his proposed policies can look back at history and see times when we’ve faced down similar threats—and won. It is only because progress has been made that we feel so gutted by this loss. And this is not, as some would say, the last gasp of old oppressions, it is simply another dark milestone in a fight against injustice that will never end.


D&S advisor Micah Sifry discusses the effectiveness of data-driven campaigns.

This is the two-dimensional world that data-driven campaigning is optimized for. It gets results at the margins, and tomorrow many Democratic professionals will congratulate themselves for another job well done. If the people at Trump’s rallies, and at Sanders’, taught us anything this year, it’s that being a name in a file isn’t enough. A political party can’t just be a vehicle for gathering votes. And the work of politics isn’t over when a campaign ends. It’s just beginning.


D&S founder danah boyd spoke at the Algorithmic Accountability and Transparency in the Digital Economy panel at the EU Parliament.


D&S researcher Jacob Metcalf adds his perspective to the D&S Weapons of Math Destruction response.

Algorithms not only model, they also create. For researchers grappling with the ethics of data analytics, these feedback loops are the most challenging to our familiar tools for science and technology ethics.


D&S affiliate Natasha Singer profiles how high school students are encouraged to develop their social media personas for college admissions – particularly on LinkedIn.

Now some social media experts are advising high school seniors to go even further. They are coaching students to take control of their online personas — by creating elaborate profiles on LinkedIn, the professional network, and bringing them to the attention of college admissions officers.


D&S artist-in-residence Heather Dewey-Hagborg’s work was recently profiled on Blumhouse.

Undoubtedly the most shocking (but fascinating) of Dewey-Hagborg’s projects is “Stranger Visions,” which she launched in 2013 to much acclaim. The work not only stunned the art community, but sparked the curiosity of the scientific world… for disturbing reasons I’ll get into shortly.


D&S advisor Susan Crawford discusses the possibility and implications of an AT&T and Time Warner Cable merger.

The high-speed internet access market in America is entirely stuck on an expensive plateau of uncompetitive mediocrity, with only city fiber networks providing a public option or, indeed, any alternative at all. The AT&T/TWX deal will not prompt a drop of additional competition in that market. Nor will it mean that the entertainment industry will see more competition or new entrants — just that one player will get an unfair distribution advantage. It’s hard to think of a single positive thing this merger will accomplish, other than shining a bright light on just how awful the picture is for data transmission in this nation.

This deal should be dead on arrival. In fact, AT&T should spare us by dropping the idea now. This merger must not happen.


The net result of this batshit crazy election cycle is a Distributed Denial of Service attack on democracy. Like a webserver brought to its figurative knees by an endless flood of malformed requests, we are beginning to melt down under the avalanche of craziness. We’re left with the impression that this is an election between the possibly shady but unfairly attacked versus the truly unhinged… or between the thoroughly corrupt insider who’s managed to undermine both government and the media versus the rough, offensive and often outrageous outsider who’s the only man she couldn’t bring down. We can’t move beyond those impressions because we are drowning in controversies and conspiracies, with very little help in understanding which matter and which we should take seriously.


D&S fellow Anne L. Washington published a Points piece responding to Cathy O’Neil’s Weapons of Math Destruction.

Complex models with high stakes require rigorous periodic taste tests. Unfortunately most organizations using big data analytics have no mechanism for feedback because the models are used in secrecy.

Producing predictions, like making sausage, is currently an obscure practice. If botulism spreads, someone should be able to identify the supply chain that produced it. Since math is the factory that produces the sausage that is data science, some form of reasoning should be leveraged to communicate the logic behind predictions.


interview | 10.31.16

Heather Dewey-Hagborg Questions DNA as Big Data

Joel Kuennen, Heather Dewey-Hagborg

D&S artist-in-residence Heather Dewey-Hagborg’s work was recently profiled in ArtSlant.

At the same time, DNA extraction and sequencing has never been cheaper or easier. In light of this and the continued reliance on DNA as forensic proof, artist Heather Dewey-Hagborg approaches the cultural conception of DNA through a hacker mindset, exploiting vulnerabilities in our legal code to expose society’s unwarranted reliance on DNA as an object of truth.


D&S artist-in-residence Ingrid Burrington provides an analysis comparing statements from Hannah Arendt’s writings to today’s election climate.

Considered one of the most important works of political philosophy of its time, today Origins reads more like a contemporary analysis of this election cycle’s post-fact landscape—the one driven, mostly, by Donald Trump’s candidacy. While we don’t have the horrors of concentration camps and gulags yet, the political propagandizing and systematic organizing behind the genocidal totalitarian regimes that Origins describes could have been ripped from this year’s election headlines.


points | 10.27.16

Shining a light on the darkness

Mark Van Hollebeke

D&S practitioner-in-residence Mark Van Hollebeke discusses Weapons of Math Destruction in this Points piece.

O’Neil’s analysis doesn’t just apply to mathematical models; it applies to societal models. Most of the WMDs that Cathy O’Neil describes are inextricably linked to unjust social structures.

We all, data scientists included, need to act with some humility and reflect on the nature of our social ills. As O’Neil writes, “Sometimes the job of a data scientist is to know when you don’t know enough” (216). Those familiar with Greek moral philosophy know that this type of Socratic wisdom can be very fruitful.

It’s not just the dark side of Big Data she shows us, but shady business practices and unjust social regimes. We will never disarm the WMDs without addressing the social injustice they mask and perpetuate. O’Neil deserves credit for shining a bright light on this fact.


D&S founder danah boyd was recently interviewed for 52 Insights.

As she fires out her progressive opinions at rapid speed, what becomes immediately apparent is just how immensely passionate she is about her work. Her research revolves around the world of new technologies, social media and today’s youth and how they all fit together in our society. She is also a Principal Researcher at Microsoft and founder of the Data & Society Research Institute. As we discover, she is a fervent defender of young people and admonishing of her own generation’s eagerness to place blame on them. At the end of the day, her work is very much about the notion of equality and how we can create it with these many new tools we have.

We believe danah boyd has some very important things to say, and with over 100,000 followers on Twitter, her voice is already being heard.


D&S advisor Susan Crawford discusses the internet shutdown of October 21.

Without the directional signs in place, suddenly huge numbers of sites couldn’t be found. Who knew the Internet of Things could have such a big effect on our daily lives?

Actually, a lot of people knew. IoT is very big business these days.

While we’re patching those insecure home DVRs, routers, and webcams, let’s back up and talk about the implications of IoT for public values generally. Because it’s not just websites that could be affected by unrestrained Internet of Things deployments. We’re not just using IoT in our homes. We’re also going to be using it, in a big way, in the places where 80 percent of Americans live, work, and play: in cities.


D&S fellow Ravi Shroff examines Cathy O’Neil’s analysis of criminal justice algorithms, like predictive policing.

There are a few minor mischaracterizations and omissions in this chapter of Weapons of Math Destruction that I would have liked O’Neil to address. CompStat is not, as she suggests, a program like PredPol’s. This is a common misconception; CompStat is a set of organizational and management practices, some of which use data and software. In the section on stop-and-frisk, the book implies that a frisk always accompanies a stop, which is not the case; in New York, only about 60% of stops included a frisk. Moreover, the notion of “probable cause” is conflated with “reasonable suspicion,” which are two distinct legal standards. In the section on recidivism, O’Neil asks of prisoners,

“is it possible that their time in prison has an effect on their behavior once they step out? […] prison systems, which are awash in data, do not carry out this highly important research.”

Although prison systems may not conduct this research, there have been numerous academic studies that generally indicate a criminogenic effect of harsh incarceration conditions. Still, “Civilian Casualties” is a thought-provoking exploration of modern policing, courts, and incarceration. By highlighting the scale and opacity of WMDs in this context, as well as their vast potential for harm, O’Neil has written a valuable primer for anyone interested in understanding and fixing our broken criminal justice system.


D&S fellow Mark Ackerman develops a checklist to address the sociotechnical issues demonstrated in Cathy O’Neil’s Weapons of Math Destruction.

These checklist items for socio-technical design are all important for policy as well. Yet the book makes it clear that not all “sins” can be reduced to checklist form. The book also explicates other issues that cannot easily be foreseen and are almost impossible for implementers to see in advance, even if well-intentioned. One example from the book is college rankings, where the attempt to be data-driven slowly created an ecology where universities and colleges paid more attention to the specific criteria used in the algorithm. In other situations, systems will be profit-generating in themselves, and therefore implemented, but suboptimal or societally harmful — this is especially true, as the book nicely points out, for systems that operate over time, as happened with mortgage pools. Efficiency may not be the only societal goal — there is also fairness, accountability, and justice. One of the strengths of the book is to point this out and make it quite clear.


points | 10.26.16

Models in Practice

Angèle Christin

D&S affiliate Angèle Christin writes a response piece to Cathy O’Neil’s Weapons of Math Destruction.

One of the most striking findings of my research so far is that there is often a major gap between what the top administrations of criminal courts say about risk scores and the ways in which judges, prosecutors, and court officers actually use them. When asked about risk scores, higher-ups often praise them unequivocally. For them, algorithmic techniques bear the promise of more objective sentencing decisions. They count on the instruments to help them empty their jails, reduce racial discrimination, and reduce expenses. They can’t get enough of them: most courts now rely on as many as four, five, or six different risk-assessment tools.

Yet it is unclear whether these risk scores always have the meaningful effect on criminal proceedings that their designers intended. During my observations, I realized that risk scores were often ignored. The scores were printed out and added to the heavy paper files about defendants, but prosecutors, attorneys, and judges never discussed them. The scores were not part of the plea bargaining and negotiation process. In fact, most judges and prosecutors told me that they did not trust the risk scores at all. Why should they follow the recommendations of a model built by a for-profit company that they knew nothing about, using data they didn’t control? They didn’t see the point. For better or worse, they trusted their own expertise and experience instead.


D&S researcher Josh Scannell responds to Georgetown Center on Privacy & Technology’s “The Perpetual Line-Up” study.

Reports like “The Perpetual Line-Up” force a fundamental question: What do we want technologies like facial recognition to do? Do we want them to automate narrowly “unbiased” facets of the criminal justice system? Or do we want to end the criminal justice system’s historical role as an engine of social injustice? We can’t have both.


D&S artist-in-residence Ingrid Burrington analyzes the infrastructure factor of the Oct. 21 internet outage.

The locations of internet exchanges tend to follow population hubs because the routes of internet connectivity often follow older routes of telephone connectivity (which themselves often follow telegraph routes, railways, and highways). In turn, internet exchanges attract data centers and more network infrastructure. For dense coastal areas, some internet exchanges are also key switch points for data traveling across transoceanic submarine cables, as in the case of Manhattan’s 60 Hudson Street or Los Angeles’ One Wilshire. In all likelihood, devices used for Friday’s DDoS attack located across the Atlantic or Pacific probably passed through or possibly connected to Dyn’s network through these buildings.

That being said, some of the overlaps between population centers and network outages are more a reflection of the number of connections in an area than the number of humans living there. The Portland, Oregon metro area has six IXes. So does Manhattan, which is surrounded by nine additional IXes in the surrounding metro areas of New Jersey and Long Island. Dallas, Silicon Valley, and Seattle were all areas that were subsumed by the grim red cloud of No Tweets For You in outage maps yesterday.


D&S advisor Baratunde Thurston participates in this episode of the CodeNewbie podcast, Comedy and Code – Part I.


The New York Times | 10.24.16

Next Job for Obama? Silicon Valley Is Hiring

Michael D. Shear, Natasha Singer

D&S affiliate Natasha Singer and Michael D. Shear co-wrote this piece discussing President Obama’s legacy in technology and how he could continue to contribute in the future.


D&S researcher Claire Fontaine looks at how school performance data can lead to segregation.

In our technocratic society, we are predisposed toward privileging the quantitative. So, we need to find ways to highlight what is truly helpful in the data, but also insert an element of creative distrust. We need to encourage data consumers to think deeply about their values, rather than using data to reify knee-jerk prejudicial attitudes. Data scientists and engineers are in the position to help shift the conversation around data as truth.


D&S researcher Amanda Lenhart participates in the Relationships and Privacy in a World of Tinder and Twitter panel at Kids Online.


D&S post-doctoral scholar Caroline Jack attended the O’Reilly Next:Economy Summit and continues the conversation from the summit.

In the next economy, the most important skills may be difficult to quantify or commodify—but optimizing for human welfare demands that the people driving the innovation economy take them seriously. Care work requires workers to build trust and practice kindness. It is “emotional labor” that demands skills such as calmness, empathy and interpersonal creativity. Given this outlook, the greatest victory of our tech industry could be in turning away from systems that incentivize efficiency and profit and toward designing systems that optimize workers’ and consumers’ dignity, sustenance and welfare.


paper | 10.19.16

Discriminating Tastes: Customer Ratings as Vehicles for Bias

Alex Rosenblat, Karen Levy, Solon Barocas, Tim Hwang

D&S researchers Alex Rosenblat and Tim Hwang and D&S affiliates Solon Barocas and Karen Levy examine how bias may creep into evaluations of Uber drivers through consumer-sourced rating systems:

Through the rating system, consumers can directly assert their preferences and their biases in ways that companies are prohibited from doing on their behalf. The fact that customers may be racist, for example, does not license a company to consciously or even implicitly consider race in its hiring decisions. The problem here is that Uber can cater to racists, for example, without ever having to consider race, and so never engage in behavior that amounts to disparate treatment. In effect, companies may be able to perpetuate bias without being liable for it.


D&S fellow Zara Rahman introduces her upcoming research at Data & Society, where she will examine the work of translators in technology projects.

Whatever it’s called, it’s also under-appreciated. In our tech-focused world, we often hold those with so-called “hard” programming skills up on a pedestal, and we relegate those with “soft” communication skills to being invisible caretakers. It’s not an accident that this binary correlates strongly with traditionally male-dominated roles of programming and largely female-dominated roles of community management or emotional labour. It’s worth noting too, that one is paid much more than the other.


Researchers who investigate sensitive topics may face online harassment, social shaming, or other networked forms of abuse. In addition to potential impacts on the researcher’s reputation and mental health, fear of harassment may have a chilling effect on the type of research that is conducted and the capabilities of individual researchers.

This document is a set of best practices for researchers – especially junior researchers – who wish to engage in research that may make them susceptible to online harassment. We provide recommendations for academic institutions, supervisors, and individuals, including cyber-security guidelines and links to other resources.

We’ve also created a 2-page information sheet that researchers can give to university personnel to educate them about the realities of online harassment and what administrators can do about it.

The authors welcome feedback about this document. Please send suggestions and edits to riskyresearch at datasociety dot net.


D&S research analyst Mikaela Pitcan discusses how missing data can impact students with mental health conditions.

The areas in which data are lacking communicate priorities. However, without concrete data to show a need to prioritize the issue of mental health in schools, there is little incentive to make this issue a priority. Is the failure to account for students with mental illness in a detailed manner the result of stigma? Is it the result of a broader culture that idealizes childhood and is unable to integrate the idea of children struggling with mental illness into our collective consciousness? How might big data be used to identify children in need of mental health treatment in schools to target intervention while protecting students’ privacy? In an age where incredibly detailed information is collected, some students’ needs remain invisible. How can we use the data we have to address the need for the data that is missing?


D&S advisor Ethan Zuckerman responds to criticism for donating to the North Carolina GOP office’s reconstruction.

It’s also possible that kindness is the single most important and powerful thing you can do to make change in the world. Consider the story of Derek Black, who inherited a leadership role in the White Nationalist movement from his father, the founder of the Stormfront message board community. A fellow student at New College in Sarasota, Florida reached out to Black, inviting him to an interfaith shabbat dinner, not to confront him about his beliefs, but simply to reach out and include him. This kindness proved transformative — at great cost to his relationships with his family, Black has forsaken white nationalism.


D&S advisor Catherine Bracy recaps the Tech Equity Collaborative info session.


Nature | 10.13.16

There is a blind spot in AI research

Kate Crawford, Ryan Calo

D&S affiliate Kate Crawford, with Ryan Calo, wrote this piece discussing risks in AI.

Artificial intelligence presents a cultural shift as much as a technical one. This is similar to technological inflection points of the past, such as the introduction of the printing press or the railways. Autonomous systems are changing workplaces, streets and schools. We need to ensure that those changes are beneficial, before they are built further into the infrastructure of everyday life.


D&S affiliate Ifeoma Ajunwa testified at the U.S. Equal Employment Opportunity Commission to discuss big data in the workplace.

Good afternoon, Chair Yang and members of the Commission. First, I would like to thank the Commission for inviting me to this meeting. My name is Ifeoma Ajunwa, I am a Fellow at the Berkman Klein Center at Harvard University and an Assistant Professor at the University of the District of Columbia School of Law. I have authored several papers regarding worker privacy, with an emphasis on health law and genetic discrimination, from which my testimony today is largely drawn.

Today, I will summarize a number of practices that employers have begun to deploy to collect information on employees, and my concerns that such information could ultimately be acquired and sold by data brokers or stored in databanks. There are few legal limitations on how this sensitive information could be used, sold, or otherwise disseminated. Absent careful safeguards, demographic information and sensitive health information and genetic information is at risk for being incorporated in the Big Data analytics technologies that employers are beginning to use — and which challenge the spirit of antidiscrimination laws such as the Americans with Disabilities Act (the “ADA”) and the Genetic Information Non-Discrimination Act (“GINA”).


D&S researchers Alex Rosenblat and Tim Hwang explore “the significant role of worker motivations and regional political environments on the social and economic outcomes of automation” in this new paper.

Preliminary observations of rideshare drivers and their changing working conditions reveal the significant role of worker motivations and regional political environments on the social and economic outcomes of automation. Technology’s capacity for social change is always combined with non-technological structures of power—legislation, economics, and cultural norms.


D&S advisor Susan Crawford discusses how streetlights are becoming a part of the Internet of Things.

But the third step was the charm: This past summer, Santa Monica adopted an ordinance requiring that wireless carriers get access to Santa Monica’s streetlights and traffic signal poles only on a neutral basis. It also sets design requirements for these rights-of-way assets, emphasizing the need for nice-looking poles that conceal gear. But the important thing is that carriers will not be able, in the words of former Santa Monica CIO Jory Wolf, to “delay or preclude” competition. The desired result: no one can lock up these poles.


ProPublica | 10.12.16

Breaking the Black Box: When Machines Learn by Experimenting on Us

Julia Angwin, Terry Parris Jr., Surya Mattu, Seongtaek Lim

D&S affiliate Surya Mattu, with Julia Angwin, Terry Parris Jr., and Seongtaek Lim, continues the Black Box series.

Depending on what data they are trained on, machines can “learn” to be biased. That’s what happened in the fall of 2012, when Google’s machines “learned” in the run-up to the presidential election that people who searched for President Obama wanted more Obama news in subsequent searches, but people who searched for Republican nominee Mitt Romney did not. Google said the bias in its search results was an inadvertent result of machine learning.

Sometimes machines build their predictions by conducting experiments on us, through what is known as A/B testing. This is when a website will randomly show different headlines or different photos to different people. The website can then track which option is more popular, by counting how many users click on the different choices.
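The A/B-testing loop the article describes — randomly assign each visitor a variant, then count which variant draws more clicks — can be sketched in a few lines. The names and hash-based bucketing below are illustrative assumptions, not any platform's actual implementation:

```python
import hashlib
from collections import Counter

def assign_variant(user_id, variants=("headline_a", "headline_b")):
    # Stable hash-based bucketing: the same user always sees the same variant.
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return variants[bucket % len(variants)]

class ABTest:
    """Tracks impressions and clicks per variant to compare click-through rates."""

    def __init__(self, variants=("headline_a", "headline_b")):
        self.variants = variants
        self.impressions = Counter()
        self.clicks = Counter()

    def serve(self, user_id):
        # Record that this user was shown their assigned variant.
        variant = assign_variant(user_id, self.variants)
        self.impressions[variant] += 1
        return variant

    def record_click(self, user_id):
        # Attribute the click to whichever variant the user was assigned.
        self.clicks[assign_variant(user_id, self.variants)] += 1

    def click_through_rates(self):
        # Clicks divided by impressions, per variant (only variants that were shown).
        return {v: self.clicks[v] / self.impressions[v]
                for v in self.variants if self.impressions[v]}
```

A production system would add statistical significance testing before declaring a winning variant; this sketch shows only the assign-and-count mechanics the article refers to.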


paper | 10.11.16

Automatically Processing Tweets from Gang-Involved Youth: Towards Detecting Loss and Aggression

Terra Blevins, Robert Kwiatkowski, Jamie C. Macbeth, Kathleen Mckeown, Desmond Patton, Owen Rambow

D&S affiliate Desmond Patton, with Terra Blevins, Robert Kwiatkowski, Jamie C. Macbeth, Kathleen Mckeown, and Owen Rambow, wrote this paper exploring a body of texts from a female gang member and examining patterns of speech that indicate an aggression trigger.

Violence is a serious problem for cities like Chicago and has been exacerbated by the use of social media by gang-involved youths for taunting rival gangs. We present a corpus of tweets from a young and powerful female gang member and her communicators, which we have annotated with discourse intention, using a deep read to understand how and what triggered conversations to escalate into aggression. We use this corpus to develop a part-of-speech tagger and phrase table for the variant of English that is used, as well as a classifier for identifying tweets that express grieving and aggression.


D&S researcher Bonnie Tijerina discusses the development of a “hands-on professional training program on data and privacy literacy in hopes of showing how this knowledge can positively impact their service to library patrons.”


D&S researcher Mary Madden reviews and adds her perspective to the play Privacy.

As someone who has experienced the glaze of overwhelm in the eyes of my family and friends when I try to explain why privacy still matters, I have a profound respect for anyone who can articulate that message clearly.

One thing that writers of survey research questions and writers of dramatic scripts have in common is the challenge of accurately and clearly describing people’s engagement with technology.


D&S advisor Baratunde Thurston details his frustrations with Trump supporters.

I want to live in the world where the second chances Donald Trump has received thousands of times are redistributed to others who deserve and would do more with such chances. For with all his second chances, what has he done for others? But that black teenager could become a legitimate business owner. That Syrian refugee could create great art. That migrant worker could revolutionize our education system. And the millions currently incarcerated, largely for non violent offenses, could return to their communities as assets.


D&S affiliate Wilneida Negrón writes five tips to allow for more inclusive AI research.

Although a step in the right direction, the Partnership on AI does highlight a certain conundrum — what exactly is it that we want from Silicon Valley’s tech giants? Do we want a seat at their table? Or are we asking for a deeper and more sustaining type of participation? Or perhaps, more disturbingly, is it too late for any truly inclusive and meaningful participation in the development of future AI technologies?


D&S fellow Zara Rahman reports that the Bangladeshi government developed “smart” national ID cards.

But with this much personal information being collected on every single citizen, especially personal data that cannot be changed if it is ever leaked or compromised (i.e., the fingerprints of an individual), there are major concerns regarding the security of this data. A breach or leak could put individuals’ privacy rights seriously at risk.


paper | 10.24.16

The Class Differential in Big Data and Privacy Vulnerability

Mary Madden, Michele Gilman, Karen Levy, Alice Marwick

Introduction:

Low-income communities have historically been subject to a wide range of governmental monitoring and related privacy intrusions in daily life. The privacy harms poor communities and their residents suffer as a result of this pervasive surveillance are especially acute when considering the economic and social consequences they experience, and the low likelihood that they will be able to bear the costs associated with remedying those harms. In the “big data” era, there are growing concerns that low-status internet users may be further differentially impacted by certain forms of internet-enabled data collection, surveillance, and marketing. They may be both unfairly excluded from opportunities and unfairly targeted based on determinations made by predictive analytics and scoring systems—growing numbers of which rely on some form of social media input. These new kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, could have particularly negative impacts on the poor.

In addition to the harms created by targeting (e.g., predatory marketing) or exclusion from opportunity, the poor may face magnified privacy vulnerabilities as a result of community-specific patterns around technology use, and knowledge gaps about privacy- and security-protective tools. Legal scholars have identified a broad group of consumers as “privacy vulnerable” when they “misunderstand the scope of data collection and falsely believe that relevant privacy rights are enshrined in privacy policies and guaranteed by law.” These misconceptions are common across all socioeconomic categories, but this article suggests that these conditions may be exacerbated by poor communities’ higher reliance on mobile connectivity and lower likelihood to take various privacy-protective measures online. When low-income adults rely on devices and apps that make them more vulnerable to surveillance, and they wittingly or unwittingly do not restrict access to the content they post online, they may be further exposed to forms of commercial data collection that can affect the way they are assessed in various employment, education and law enforcement contexts.

Part I of this article provides a historical overview of the ways in which the poor have been subject to uniquely far-reaching surveillance across many aspects of life, and how their experiences of harm may be impacted by evolving practices in big-data-driven decision making. In using the term “poor” to signify a condition of economic deprivation, this article recognizes that low-income people in America are a diverse and multifaceted group and that each person has his or her own individualized narrative. Despite this diversity, this article highlights a shared reality for many poor people, which is heightened vulnerability to on-line surveillance and associated adverse outcomes. Part II presents new empirical findings from a nationally representative survey to highlight various technology-related behaviors and concerns that suggest low-status internet users may be especially vulnerable to surveillance and networked privacy-related harms. In Part III, we show why and how this matters through a legal examination of several timely case studies that demonstrate how on-line activity, and the emerging use of social media data in particular, might have detrimental impacts on the poor when used in high-stakes decision-making systems. This Part explains why current legal frameworks fail to shield the poor from negative outcomes. Finally, in Part IV, we assess major proposals for protecting on-line, personal data through the lens of class vulnerability. In other words, we evaluate how these proposals might impact poor people. We agree with other scholars that additional technical and non-technical reforms are needed to address the risks associated with the use of social media data. As policymakers consider reforms, we urge greater attention to impacts on low-income persons and communities.


D&S advisor Anil Dash examines how to change the tech industry for good.

Some of the most novel critiques about technology and Silicon Valley are coming from women and underrepresented minorities, but their work is seldom recognized in traditional critical venues. As a result, readers may miss much of the critical discourse about technology if they focus only on the work of a few, outspoken intellectuals.


D&S artist-in-residence Ingrid Burrington details the infrastructure of the internet and who owns the physical pieces of this infrastructure.

The miles of fiber-optic cable that connect the country to the world are mostly visible in fragments—signs for buried cable along the side of a road, telephone poles running between houses forming serpentine labyrinths beneath city manhole covers. These glimpses of networks tend to feature the names of at least a few different companies—sometimes well-known ones, like AT&T, but just as often companies that don’t tend to become household brands or that haven’t existed for years. US-West, MCI/Worldcom, Embarq. It’s hard to really get a grasp of who, exactly, owns all of this stuff and how they came to own it.


D&S affiliate Anthony Townsend gives a history of city charters.

It’s pretty clear that Smart Cities 1.0 was always going to take cities in a bad direction — and it’s why I wrote my book. Cities have clearly responded, and the city-led Smart Cities 2.0 model is clearly ascendant — most clearly reflected in the proliferation of smart city campaigns, visions and digital master plans (see my 2015 paper with Stephen Lorimer, who is now on the Smart London team at the Greater London Authority, where we compare the content, planning process, and implementation approach of 8 cities’ digital plans: “Digital Master Planning: An Emerging Strategic Practice in Global Cities”)


paper | 10.02.16

Exploring or Exploiting? Social and Ethical Implications of Autonomous Experimentation in AI

Sarah Bird, Solon Barocas, Kate Crawford, Fernando Diaz, Hanna Wallach

Sarah Bird, Fernando Diaz, and Hanna Wallach, with D&S affiliates Solon Barocas and Kate Crawford, wrote this analysis of the social and ethical implications of autonomous experimentation in AI.

In the field of computer science, large-scale experimentation on users is not new. However, driven by advances in artificial intelligence, novel autonomous systems for experimentation are emerging that raise complex, unanswered questions for the field. Some of these questions are computational, while others relate to the social and ethical implications of these systems. We see these normative questions as urgent because they pertain to critical infrastructure upon which large populations depend, such as transportation and healthcare. Although experimentation on widely used online platforms like Facebook has stoked controversy in recent years, the unique risks posed by autonomous experimentation have not received sufficient attention, even though such techniques are being trialled on a massive scale. In this paper, we identify several questions about the social and ethical implications of autonomous experimentation systems. These questions concern the design of such systems, their effects on users, and their resistance to some common mitigations.

 


D&S advisor Alondra Nelson was interviewed for PBS NewsHour about her book, “The Social Life of DNA: Race, Reparations, and Reconciliation after the Genome.”

That’s the critical piece, because we know for communities of color, that genetics has not always been a rosy piece of research. I mean, that there have been historical tragedies in the past that would lead particularly African-Americans to be suspicious of genetic testing.

And so, the ability to opt-in, the ability to now in the 21st century use genetics to do something powerful, to tell a powerful story about your identity and your life, and to choose how you want to take that story up. So, sometimes people get information that they find useful or interesting, and sometimes they don’t. But because you have opted in as a consumer, you get to choose, you get to adjudicate whether or not you think that information is useful for your story.

 

 


D&S advisor Anil Dash discusses how to make change with tech.

On Monday, I’ll be leading a conversation at the White House’s South by South Lawn festival. We’ll be talking with three people who are both extraordinary and strikingly ordinary: Brittany Packnett, the Vice President of National Community Alliances at Teach For America and a co-founder of Campaign Zero; Carmen Rojas, PhD, the CEO of The Workers Lab; and Evan Wolfson, founder and president of Freedom to Marry.

They’re three people from wildly different backgrounds, each working on a spate of complex issues. But what they have in common is that they rejected cynicism, they saw that they could have a role in leading change, and they’ve worked to make it happen. In our discussion, we’ll find out how they took that leap, and maybe learn how each of us can focus on the issues we care about and maybe even think about how we can “change the world”.


D&S affiliate Karen Levy is interviewed about her scholarly work.


book | 09.29.16

Object-Oriented Feminism

Katherine Behar, Josh Scannell

D&S researcher R. Joshua Scannell contributes to Object-Oriented Feminism.


book | 09.29.16

An AI Pattern Language

Madeleine Clare Elish, Tim Hwang

D&S researchers Madeleine Clare Elish and Tim Hwang discuss the social challenges of AI in a new collection of essays, An AI Pattern Language.

In A Pattern Language, the central problem is the built environment. While our goal here is not as grand as the city planner’s, we took inspiration from the values of equity and mutual responsibility, as well as the accessible form, found in A Pattern Language. Like those patterns, this document attempts to develop a common language of problems and potential solutions that appear in different contexts and at different scales of intervention.

 


ProPublica | 09.28.16

Breaking the Black Box: What Facebook Knows About You

Julia Angwin, Terry Parris Jr., Surya Mattu

Julia Angwin, Terry Parris Jr., and D&S affiliate Surya Mattu explore what Facebook knows about its users.

We built a tool that works with the Chrome Web browser that lets you see what Facebook says it knows about you — you can rate the data for accuracy and you can send it to us, if you like. We will, of course, protect your privacy. We won’t collect any identifying details about you. And we won’t share your personal data with anyone.


D&S fellow Zara Rahman explores the need for access to information and open data, i.e. the right to know.

Given these growing threats, combined with our increased knowledge of government secrecy and surveillance, and new possibilities through widespread technologies, it feels like we should be focusing more than ever on strengthening our right to information. This means directing funding towards it, supporting the established RTI community, and directing resources towards exercising our right to information when we can.


D&S affiliate Anthony Townsend writes about city charter reform.

The civic tech gang for some reason — probably because it is their target and they are nibbling off what they think they can actually achieve in the short run — hasn’t really articulated the fault lines in its interactions with city governments. When I listen to those exchanges it seems a little too cozy, as if the civic tech players are just waiting to be brought into government to drive the change from within.


In this blogpost, Jade E. Davis argues against the myth of bootstrapping in education equality.

“Those who are imagined as less American might see a single generational gain with successive generations seeing some benefit. But overall, educational attainment does not free a person from the larger cultural forces that shape and limit their experiences as potential. Additionally, the assumptions of identity color who we imagine being capable of success globally.”


D&S artist-in-residence Ingrid Burrington was interviewed for The Intercept about her book and took a walking tour with her interviewer, Cora Currier.

I asked Burrington what she hoped people would do with her guide. It is empowering to know what you’re looking at, but also overwhelming to consider the scale of the apparatus around you. Burrington described a public records battle she lost to get the locations of NYPD cameras; the city said the data could help criminals. In the process, Burrington realized that the data she was seeking wouldn’t account for unmarked cameras and privately owned cameras that could be turned over to police. To map the entire surveillance network of a city would require a huge effort and become quickly outdated.


D&S affiliate Surya Mattu and Julia Angwin examine how Amazon’s shopping algorithm directs customers to merchandise from Amazon or Amazon-affiliated sellers, even when other sellers on the platform offer the same products for much less.

Through its rankings and algorithm, Amazon is quietly reshaping online commerce almost as dramatically as it reshaped offline commerce when it burst onto the scene more than 20 years ago. Just as the company’s cheap prices and fast shipping caused a seismic shift in retailing that shuttered stores selling books, electronics and music, now Amazon’s pay-to-play culture is forcing online sellers to choose between paying hefty fees or leaving the platform altogether.


D&S founder danah boyd critiques traditional media’s response to the September 17th bombings in NYC.

Traditional news media has a lot of say in what it publishes. This is one of the major things that distinguishes it from social media, which propagates the fears and anxieties of the public. And yet, time and time again, news media shows itself to be irresponsible, motivated more by the attention and money that it can obtain by stoking people’s fears than by a moral responsibility to help ground an anxious public.


In this blogpost, Audrey Watters challenges accountability processes in the current public school system.

“So when we think about “what counts” and who’s held to account under public education’s accountability regime, it’s still worth asking if accountability can co-exist with “response-ability” — accountable to whom, how and to what ends; responsible to whom, how, and to what ends.”


teaching | 09.19.16

Supporting Ethics in Data Research

Emily Keller, Bonnie Tijerina, danah boyd
Background:

University campuses provide an ecosystem of support to technical researchers, including computer scientists, as they navigate emerging issues of privacy, ethics, security, and consent in big data research. These support systems have varying levels of coordination and may be implicit or explicit.

As part of the Supporting Ethics in Data Research project at Data & Society, we held workshops with twelve to sixteen student researchers, professors, information technology leaders, repository managers, and research librarians at a handful of universities. The goal was to tease out the individual components of ethical, technical, and legal support that were available or absent on each campus, and to better understand the interactions between different actors as they encounter common ethical quandaries.

Materials: sticky notes, scratch paper, pens, and markers
Downloads: Supporting_Ethics_Materials_Sept2016.zip
Exercises:

Case Study of a Technical Researcher: provides a fictional scenario involving a researcher who needs assistance navigating a number of obstacles during her technical research.

Data Clinic Model: facilitates a brainstorming session about the components needed for a drop-in clinic to offer peer and professional support.

Ethics Conversation: asks participants to link words, feelings, and thoughts to the word “ethics,” followed by a discussion.

Read More: For the results of this project, please see the final report, Supporting Ethical Data Research: An Exploratory Study of Emerging Issues in Big Data and Technical Research, which provides detailed findings.

In the era of big data, how do researchers ethically collect, analyze, and store data? danah boyd, Emily F. Keller, and Bonnie Tijerina explore this question and examine issues ranging from how to achieve informed consent from research subjects in big data research to how to store data securely in case of breaches. The primer closes with a discussion of how libraries can collaborate with computer scientists to examine ethical big data research issues.


The Engine Room | 09.19.16

Responsible Data in Agriculture

Lindsay Ferris, Zara Rahman

D&S fellow Zara Rahman and Lindsay Ferris wrote an analysis of how data is used in agriculture, concluding with guidance on how to use that data responsibly.

The responsibility for addressing this does not lie solely with the smaller players in the sector, though. Practising responsible data approaches should be a key concern and policy of the larger actors, from Ministries of Agriculture to companies gathering and dealing with large amounts of data on the sector. Developing policies to proactively identify and address these issues will be an important step to making sure data-driven insights can benefit everyone in the sector.


paper | 09.16.16

The Wisdom of the Captured

Alex Rosenblat, Tim Hwang

D&S researchers Alex Rosenblat and Tim Hwang analyze how the data that networked technologies capture from users, which enables those technologies to make intelligent decisions, may negatively impact the users themselves.

More broadly, how might the power dynamics of user and platform interact with the marketing surrounding these technologies to produce outcomes which are perceived as deceptive or unfair? This provocation paper assembles a set of questions on the capacity for machine learning practices to create undisclosed violations of the expectations of users – expectations often created by the platform itself — when applied to public-facing network services. It draws on examples from consumer-facing services, namely GPS navigation services like Google Maps or Waze, and on the experiences of Uber drivers, in an employment context, to explore user assumptions about personalization in crowd-sourced, networked services.

