

“Privacy, Security, and Digital Inequality” by Mary Madden is the first in-depth analysis of the privacy and security experiences of low-socioeconomic-status populations in the United States.

Supported by the Digital Trust Foundation, the report finds that most of those living in U.S. households with annual incomes of less than $20,000 are acutely aware of a range of digital privacy harms, but many say it would be difficult to access the tools and strategies that could help them protect their personal information online. The report provides additional insights about mobile device use and demand for digital privacy and security training.

In light of the September 18th announcement by the U.S. Department of Homeland Security[1] about federal agencies’ intent to collect social media information and search history from a variety of immigrant groups, “Privacy, Security, and Digital Inequality” is especially relevant: In particular, the report finds that foreign-born Hispanic adults stand out for both their privacy sensitivities, and for their desire to learn more about safeguarding their personal information.

“Privacy, Security, and Digital Inequality” includes detailed comparisons across different racial, ethnic, and nativity groups, finding that there are substantial gaps across these groups when looking at reliance on mobile connectivity.[2]

“This study highlights the disconnect between the one-size-fits-all conversations about privacy-related risk that happen in Washington and the concerns that are most salient to the communities who have long experienced a disproportionate level of surveillance and injustice in their daily lives,” said Madden, Researcher at Data & Society and lead author of the report. “When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and need.”

In light of new pressures surrounding immigration policy and status in the United States, the report is a highly relevant snapshot of the demand for privacy- and security-related training among some of the most vulnerable of these low-socioeconomic-status groups. The report also finds a disproportionate reliance on mobile devices, offering a potential starting point for those looking to provide educational resources.

“This report illustrates the many ways in which smartphones have become an indispensable source of internet access for those who may lack other technology resources in their homes and communities,” said Michele Gilman, Venable Professor of Law at the University of Baltimore and Director of the Saul Ewing Civil Advocacy Clinic. “Far from being a luxury, smartphones—with their many benefits and vulnerabilities—offer a critical source of connection to jobs, family, education and government services.”

Gilman, a poverty law expert, also served on the Research Advisory Board for the two-year research project, and co-authored a related law review article with Madden titled, “Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans.”

“Privacy, Security, and Digital Inequality” is based on newly released data from a nationally representative telephone survey of 3,000 American adults. The survey, which included interviews in both English and Spanish, was made possible by a grant from the Digital Trust Foundation and fielded in November and December of 2015.



[1] Full text here.
[2] The analysis of racial and ethnic minority groups in this report is limited by the survey sample size, and does not include detailed comparisons of Asians, Native Americans, and other subgroups. For instance, in this survey, out of 3,000 respondents, just 3% identified as Asian or Asian American.


Additional Resources

For more information about groups working on these issues and in these spaces, we invite you to take a look at resources provided by the following organizations. We welcome additional suggestions:

Center for Media Justice – Resource Library
Equality Labs
Freedom of the Press Foundation – Resources
American Civil Liberties Union – Privacy and Technology, Free Speech
Berkman Klein Center
Color of Change
EPIC – Electronic Privacy Information Center
Future of Privacy Forum
Georgetown Center on Privacy & Technology – Resources
National Hispanic Media Coalition
Our Data Bodies – Resources
Pew Research Center
Public Knowledge
Rad.Cat – Resources
Southern Poverty Law Center


D&S Media Manipulation Research Lead Joan Donovan talks about the role of large tech companies in curbing extremist activity online.

Joan Donovan, a media manipulation research lead at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.

‘Richard Spencer might have a megaphone and his own website to communicate his messages of hate,’ Donovan said in a phone interview Wednesday. ‘Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.’

Movements like the so-called ‘alt-right’ aren’t just built on charisma, Donovan added — they’re built on infrastructure. The internet and all of its possibilities have now become a major part of that infrastructure.


D&S Researcher Alex Rosenblat was interviewed about Uber for Klint Finley’s article in Wired

Tuesday’s agreement may not be the end of Uber’s problems with the FTC either. Hartzog says a recent paper by University of Washington law professor Ryan Calo and multidisciplinary researcher Alex Rosenblat of the research institute Data & Society points to other potential privacy concerns, such as monitoring how much battery power remains on a user’s device, because users with little juice might be willing to pay more for a ride.

‘When a company can design an environment from scratch, track consumer behavior in that environment, and change the conditions throughout that environment based on what the firm observes, the possibilities to manipulate are legion,’ Calo and Rosenblat write. ‘Companies can reach consumers at their most vulnerable, nudge them into overconsumption, and charge each consumer the most he or she may be willing to pay.’


Quartz cites D&S Postdoctoral Scholar Caroline Jack in its guide to Lexicon of Lies:

Problematic information comes in various forms, each uniquely irksome. Yet people are quick to blast all inaccuracies as “fake news,” reinforcing the sense that facts are a thing of the past.

That’s dangerous and it needn’t be the case, according to the Lexicon of Lies, a recent report from the New York-based Data and Society research institute. “The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it,” writes Caroline Jack, a media historian and postdoctoral fellow at Data and Society. On a cultural level, “these assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible,” she writes.


Teaching Tolerance | 08.17.14

What is the Alt-Right?

Becca Lewis

D&S Researcher Becca Lewis discusses the recruiting methodologies of the Alt-Right in Teaching Tolerance

‘Social media can be very powerful in shaping outlooks, but it doesn’t operate in a vacuum,’ explains Data & Society researcher Becca Lewis. ‘The shaping is coming from the other people using the platforms.’

The alt-right has a massive presence on social media and other channels where young people congregate. A Washington Post analysis identified 27,000 influential Twitter accounts associated with the alt-right, 13 percent of which are considered radical. Later, a George Washington University study found that white nationalist accounts in the United States have seen their follower counts grow by 600 percent since 2012.


testimony | 08.11.17

Data & Society, Fifteen Scholars File Amicus Brief in Pending SCOTUS Case

Marcia Hofmann, Kendra Albert, Andrew D. Selbst

On August 11, 2017, Data & Society and fifteen individual scholars—including danah boyd, Julia Ticona, and Amanda Lenhart—filed an amicus brief in a pending U.S. Supreme Court case, Carpenter v. United States. The parties were represented by Andrew Selbst of Data & Society, and Marcia Hofmann and Kendra Albert of Zeitgeist Law.

The case implicates the Fourth Amendment’s “third party doctrine,” which holds that people who “voluntarily convey” information to third parties do not have a reasonable expectation of privacy. As a result, when police obtain records from a third party, it does not currently implicate Fourth Amendment rights.

Timothy Carpenter was convicted of a string of armed robberies based on cell site location data that placed him in proximity to the crimes. The case concerns the legality under the Fourth Amendment of the warrantless search and seizure of Carpenter’s historical cellphone records, which revealed his location and movements over the course of 127 days.

In the brief, we argue that the “third party doctrine” should not apply to cell site location information because cell phones are not meaningfully voluntary in modern society. Cell site location information contains abundant information about people’s lives, and unfettered police access to it poses a threat to privacy rights.

Aided by scholarship and statistics from the Data & Society research team, we provide evidence that the 95% of Americans who have cell phones cannot reasonably be expected to opt out of owning a cell phone to avoid police searches. The research shows that cell phones are:

  1. Necessary to participate in the most basic aspects of social and family life;
  2. Essential public safety infrastructure and personal safety equipment;
  3. Both necessary to find employment, and an important part of workplace infrastructure;
  4. Widely used for commerce and banking;
  5. Key for civic participation;
  6. Key for enabling better health outcomes;
  7. Critical to vulnerable populations; and
  8. Recognized as a necessity by the U.S. government in the past.

The case is expected to be heard in the fall of 2017.



points | 07.12.17

Who Cares in the Gig Economy?

Alexandra Mateescu

Data & Society Researcher Alexandra Mateescu maps out the inequalities and power dynamics within the gig economy.

“As on-demand companies like Handy and online marketplaces like Care.com enter the space of domestic work, a range of questions emerge: what are the risks and challenges of signing up for platform-based work as an immigrant? As a non-native English speaker? How are experiences of work different for individuals with strong professional identities as caregivers or housekeepers, versus more casual workers who may also be finding other kinds of work via Postmates or Uber?”


D&S Researcher Madeleine Clare Elish discusses the implications of biased AI in different contexts.

She said when AI is applied to areas like targeted marketing or customer service, this kind of bias is essentially an inconvenience. Models won’t deliver good results, but at the end of the day, no one gets hurt.

The second type of bias, though, can be more impactful to people. Elish talked about how AI is increasingly seeping into areas like insurance, credit scoring and criminal justice. Here, biases, whether they result from unrepresentative data samples or from unconscious partialities of developers, can have much more severe effects.


D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.

“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”


D&S lawyer-in-residence Rebecca Wexler describes the intersection of automated technologies, trade secrets, and the criminal justice system.

For-profit companies dominate the criminal justice technologies industry and produce computer programs that are widespread throughout the justice system. These automated programs deploy cops, analyze forensic evidence, and assess the risk levels of inmates. But these technological advances may be making the system less fair, and without access to the source code, it’s impossible to hold computers to account.


Washington Monthly | 06.13.17

Code of Silence

Rebecca Wexler

D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.

What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.


D&S resident Rebecca Wexler describes the flaws of an increasingly automated criminal justice system

The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.


Ford Foundation blog | 05.30.17

Why you Should Care about Bots if you Care about Social Justice

Wilneida Negrón, Morgan Hargrave

D&S affiliate Wilneida Negrón details the role of bots and automation in activism today.

As everyone from advertisers to political adversaries jockey for attention, they are increasingly using automated technologies and processes to raise their own voices or drown out others. In fact, 62 percent of all Internet traffic is made up of programs acting on their own to analyze information, find vulnerabilities, or spread messages. Up to 48 million of Twitter’s 320 million users are bots, or applications that perform automated tasks. Some bots post beautiful art from museum collections, while some spread abuse and misinformation instead. Automation itself isn’t cutting edge, but the prevalence and sophistication of how automated tools interact with users is.


How do young people of low socio-economic status (SES) view online privacy? D&S fellow Alice Marwick, researcher Claire Fontaine, and president and founder danah boyd examine this question in their study.

“Framing online privacy violations as inevitable and widespread may not only help foster activist anger and strategic resistance but also avoid the victim-blaming narratives of some media literacy efforts. By examining the experiences of these young people, who are often left out of mainstream discussions about privacy, we hope to show how approaches to managing the interplay of on- and offline information flows are related to marginalized social and economic positions.”


D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.

Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.


book | 05.19.17

Protecting Patron Privacy

Bonnie Tijerina, Bobbi Newman

Protecting Patron Privacy, edited by Bobbi Newman and Data & Society Researcher Bonnie Tijerina, suggests strategies for data privacy in libraries.

Although privacy is one of the core tenets of librarianship, technological change has made it increasingly difficult for libraries to ensure the privacy of their patrons in the 21st century.

This authoritative LITA Guide offers readers guidance on a wide range of topics, including:
• Foundations of privacy in libraries
• Data collection, retention, use, and protection
• Laws and regulations
• Privacy instruction for patrons and staff
• Contracts with third parties
• Use of in-house and internet tools including social network sites, surveillance video, and RFID


D&S researcher Mark Latonero provides an overview of the role of large tech companies in refugee crises.

While the 40-page brief is filled with arguments in support of immigration, it hardly speaks about refugees, except to note that those seeking protection should be welcomed. Any multinational company with a diverse workforce would be concerned about limits to international hiring and employee travel. But tech companies should also be concerned about the refugee populations that depend on their digital services for safety and survival.


report | 05.15.17

Media Manipulation and Disinformation Online

Alice Marwick and Rebecca Lewis

New Report Reveals Why Media Was Vulnerable to Radicalized Groups Online


D&S affiliate Seeta Peña Gangadharan writes about defending digital rights of library patrons.

If this sounds complicated and scary, that’s because it is. But confronted with this matrix of vulnerabilities, the library—with its longstanding commitment to patron privacy—also offers an impressive plan of action.


D&S affiliate Mimi Onuoha states that discarded and sold hardware often has data still on it.

It’s not just individuals who are lax about removing data; companies around the world are at fault as well. In a 2007 study, researchers in Canada obtained 60 secondhand drives that had previously belonged to health care facilities. They were able to recover personal information from 65% of the drives. The data included, in the words of the researchers, “very sensitive mental health information on a large number of people.”


D&S affiliate Desmond Patton breaks down how social media can lead to gun violence in this piece in The Trace.

Social media doesn’t allow for the opportunity to physically de-escalate an argument. Instead, it offers myriad ways to exacerbate a brewing conflict as opposing gangs or crews and friends and family take turns weighing in.


Philip Napoli and D&S researcher Robyn Caplan write for First Monday on why companies like Google and Facebook insist that they are merely tech companies with no media impact, and why they are wrong. The abstract is below:

A common position amongst social media platforms and online content aggregators is their resistance to being characterized as media companies. Rather, companies such as Google, Facebook, and Twitter have regularly insisted that they should be thought of purely as technology companies. This paper critiques the position that these platforms are technology companies rather than media companies, explores the underlying rationales, and considers the political, legal, and policy implications associated with accepting or rejecting this position. As this paper illustrates, this is no mere semantic distinction, given the history of the precise classification of communications technologies and services having profound ramifications for how these technologies and services are considered by policy-makers and the courts.


points | 04.20.17

Driving School Choice

Claire Fontaine

In this Points piece, D&S researcher Claire Fontaine argues that educational marketplaces disproportionately advantage those with surplus time, energy, social capital, and institutional knowledge.

“Schooling is not a service, and it is not a commodity. It is — or should be — the means through which a society as diverse as our own coheres and develops a functional social fabric.”


Harvard Business Review | 04.19.17

Creating Simple Rules for Complex Decisions

Jongbin Jung, Connor Concannon, Ravi Shroff, Sharad Goel, Daniel G. Goldstein

Jongbin Jung, Connor Concannon, D&S fellow Ravi Shroff, Sharad Goel, and Daniel G. Goldstein explore new methods for machine learning in criminal justice.

Simple rules certainly have their advantages, but one might reasonably wonder whether favoring simplicity means sacrificing performance. In many cases the answer, surprisingly, is no. We compared our simple rules to complex machine learning algorithms. In the case of judicial decisions, the risk chart above performed nearly identically to the best statistical risk assessment techniques. Replicating our analysis in 22 varied domains, we found that this phenomenon holds: Simple, transparent decision rules often perform on par with complex, opaque machine learning methods.
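The kind of simple, transparent rule the researchers describe — a handful of integer weights and a cutoff — can be sketched in a few lines of Python. The features, point values, and threshold below are invented for illustration only; they are not taken from the paper’s actual risk chart.

```python
# A toy "simple rule" risk score: small integer weights on a few
# features, summed and compared to a threshold. All features, weights,
# and the threshold here are hypothetical.

def simple_risk_score(age: int, prior_fta: int) -> int:
    """Point system: younger age and more prior failures-to-appear
    (FTA) each add points to the score."""
    score = 0
    if age < 25:
        score += 2
    elif age < 40:
        score += 1
    score += min(prior_fta, 3)  # cap the contribution of priors
    return score

def flag_high_risk(age: int, prior_fta: int, threshold: int = 3) -> bool:
    """A defendant is flagged when the point total meets the cutoff."""
    return simple_risk_score(age, prior_fta) >= threshold

cases = [(22, 2), (45, 0), (30, 4)]
flags = [flag_high_risk(a, p) for a, p in cases]
print(flags)  # [True, False, True]
```

The appeal the article describes is visible even in this sketch: a judge or defendant can audit the entire rule at a glance, something that is not possible with an opaque statistical model.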


D&S researcher Mary Madden was interviewed by the American Press Institute about her recent Knight Foundation-supported report, “How Youth Navigate the News Landscape.”

However, one of my favorite quotes was from a participant who described a future where news would be delivered by hologram: “I think like it’s going to be little holograms. You’re going to open this thing and a little guy’s going to come out and tell you about stuff.”

Given that some participants said they already found notifications annoying, I’m not sure how successful the little hologram guy would be, but it was clear that the participants fully expected that the news industry would continue to evolve and innovate in creative ways moving forward.


D&S fellow Anne L. Washington discusses her previous research as a digital government scholar and her upcoming work examining US open data policy, funded through a five-year National Science Foundation Early Faculty Research Career grant.

“We use a secret language in academia sometimes,” [Washington] says, laughing. “‘Technology management’ is about how organisations leverage digital assets for strategic business goals. My doctorate is in management information systems.  On the other side of that is ‘informatics’, which comes from the library science tradition. Over centuries, librarians have refined how to store and retrieve knowledge so people can find what they need and walk away smarter. Informatics takes this basic idea and scales it up for massive digital collections.”


Julia Angwin, Jeff Larson, Lauren Kirchner, and Surya Mattu complete the Black Box series with an analysis of premiums and payouts in California, Illinois, Texas and Missouri that shows that some major insurers charge minority neighborhoods as much as 30 percent more than other areas with similar accident costs.

But a first-of-its-kind analysis by ProPublica and Consumer Reports, which examined auto insurance premiums and payouts in California, Illinois, Texas and Missouri, has found that many of the disparities in auto insurance prices between minority and white neighborhoods are wider than differences in risk can explain. In some cases, insurers such as Allstate, Geico and Liberty Mutual were charging premiums that were on average 30 percent higher in zip codes where most residents are minorities than in whiter neighborhoods with similar accident costs.


Rhizome | 04.05.17

To Serve the National Interest

Ingrid Burrington, Josh Begley, Seth Freed Wessler

D&S artist-in-residence Ingrid Burrington, with Josh Begley and Seth Freed Wessler, created an installation on display at the Ace Hotel.

Building on Wessler’s journalistic investigation into privately run immigrant-only federal prisons, Burrington and Begley present seventy-five individual lenticular prints of satellite imagery capturing these sites and government documents pertaining to them. Together the images, arranged into three distinct grids, explore and abstract “the terrain of U.S. immigration and carceral policy and the human stories usually conspicuously absent in the aerial perspective.”


Other | 04.03.17

Narcissism, Social Media and Power

Alice Marwick, Miranda Giacomin

D&S fellow Alice Marwick discusses narcissism and the attention economy in this interview with Kinfolk.


paper | 04.02.17

Combatting Police Discrimination in the Age of Big Data

Sharad Goel, Maya Perelman, Ravi Shroff, David Alan Sklansky

Sharad Goel, Maya Perelman, D&S fellow Ravi Shroff, and David Alan Sklansky examine a method to “reduce the racially disparate impact of pedestrian searches and to increase their effectiveness.” The abstract is below:

The exponential growth of available information about routine police activities offers new opportunities to improve the fairness and effectiveness of police practices. We illustrate the point by showing how a particular kind of calculation made possible by modern, large-scale datasets — determining the likelihood that stopping and frisking a particular pedestrian will result in the discovery of contraband or other evidence of criminal activity — could be used to reduce the racially disparate impact of pedestrian searches and to increase their effectiveness. For tools of this kind to achieve their full potential in improving policing, though, the legal system will need to adapt. One important change would be to understand police tactics such as investigatory stops of pedestrians or motorists as programs, not as isolated occurrences. Beyond that, the judiciary will need to grow more comfortable with statistical proof of discriminatory policing, and the police will need to be more receptive to the assistance that algorithms can provide in reducing bias.
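The calculation at the heart of the abstract — the likelihood that stopping and searching a pedestrian yields contraband, often called a "hit rate" — can be sketched on toy data. The stop records and group labels below are synthetic and purely illustrative.

```python
# Minimal sketch of a search "hit rate" calculation: the fraction of
# stops that recovered contraband, computed overall by group.
# All records below are synthetic.
from collections import defaultdict

stops = [
    {"group": "A", "contraband_found": True},
    {"group": "A", "contraband_found": False},
    {"group": "A", "contraband_found": False},
    {"group": "B", "contraband_found": True},
    {"group": "B", "contraband_found": True},
    {"group": "B", "contraband_found": False},
]

def hit_rates(records):
    """Return the per-group fraction of stops where contraband was found."""
    found = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        found[r["group"]] += r["contraband_found"]
    return {g: found[g] / total[g] for g in total}

rates = hit_rates(stops)
print(rates)  # {'A': 0.3333333333333333, 'B': 0.6666666666666666}
```

In analyses like the one the abstract describes, a persistently lower hit rate for one group is one statistical signal that stops of that group are being made on weaker evidence.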


PLOS Computational Biology | 03.30.17

Ten simple rules for responsible big data research

Matthew Zook, Solon Barocas, danah boyd, Kate Crawford, Emily Keller, Seeta Peña Gangadharan, Alyssa Goodman, Rachelle Hollander, Barbara A. Koenig, Jacob Metcalf, Arvind Narayanan, Alondra Nelson, Frank Pasquale

Matthew Zook, D&S affiliate Solon Barocas, D&S founder danah boyd, D&S affiliate Kate Crawford, Emily Keller, D&S affiliate Seeta Peña Gangadharan, Alyssa Goodman, Rachelle Hollander, Barbara A. Koenig, D&S researcher Jacob Metcalf, Arvind Narayanan, D&S advisor Alondra Nelson, and Frank Pasquale wrote a paper detailing ten rules for responsible big data research. The introduction is below:

The use of big data research methods has grown tremendously over the past five years in both academia and industry. As the size and complexity of available datasets has grown, so too have the ethical questions raised by big data research. These questions become increasingly urgent as data and research agendas move well beyond those typical of the computational and natural sciences, to more directly address sensitive aspects of human behavior, interaction, and health. The tools of big data research are increasingly woven into our daily lives, including mining digital medical records for scientific and economic insights, mapping relationships via social media, capturing individuals’ speech and action via sensors, tracking movement across space, shaping police and security policy via “predictive policing,” and much more.

The beneficial possibilities for big data in science and industry are tempered by new challenges facing researchers that often lie outside their training and comfort zone. Social scientists now grapple with data structures and cloud computing, while computer scientists must contend with human subject protocols and institutional review boards (IRBs). While the connection between individual datum and actual human beings can appear quite abstract, the scope, scale, and complexity of many forms of big data creates a rich ecosystem in which human participants and their communities are deeply embedded and susceptible to harm. This complexity challenges any normative set of rules and makes devising universal guidelines difficult.

Nevertheless, the need for direction in responsible big data research is evident, and this article provides a set of “ten simple rules” for addressing the complex ethical issues that will inevitably arise. Modeled on PLOS Computational Biology’s ongoing collection of rules, the recommendations we outline involve more nuance than the words “simple” and “rules” suggest. This nuance is inevitably tied to our paper’s starting premise: all big data research on social, medical, psychological, and economic phenomena engages with human subjects, and researchers have the ethical responsibility to minimize potential harm.

The variety in data sources, research topics, and methodological approaches in big data belies a one-size-fits-all checklist; as a result, these rules are less specific than some might hope. Rather, we exhort researchers to recognize the human participants and complex systems contained within their data and make grappling with ethical questions part of their standard workflow. Towards this end, we structure the first five rules around how to reduce the chance of harm resulting from big data research practices; the second five rules focus on ways researchers can contribute to building best practices that fit their disciplinary and methodological approaches. At the core of these rules, we challenge big data researchers who consider their data disentangled from the ability to harm to reexamine their assumptions. The examples in this paper show how often even seemingly innocuous and anonymized data have produced unanticipated ethical questions and detrimental impacts.

This paper is a result of a two-year National Science Foundation (NSF)-funded project that established the Council for Big Data, Ethics, and Society, a group of 20 scholars from a wide range of social, natural, and computational sciences (http://bdes.datasociety.net/). The Council was charged with providing guidance to the NSF on how to best encourage ethical practices in scientific and engineering research, utilizing big data research methods and infrastructures.


D&S founder danah boyd discusses the problem with asking companies like Facebook and Google to ‘solve’ fake news, arguing that this kind of solutionism strips complex social problems of their context.

Although a lot of the emphasis in the “fake news” discussion focuses on content that is widely spread and downright insane, much of the most insidious content out there isn’t in your face. It’s not spread widely, and certainly not by people who are forwarding it to object. It’s subtle content that is factually accurate, biased in presentation and framing, and encouraging folks to make dangerous conclusions that are not explicitly spelled out in the content itself.


Washington University Law Review | 03.11.17

Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans

Mary Madden, Michele E. Gilman, Karen Levy, Alice E. Marwick

D&S researcher Mary Madden, Michele Gilman, D&S affiliate Karen Levy, and D&S fellow Alice Marwick examine how poor Americans are impacted by privacy violations and discuss how to protect digital privacy for the vulnerable. Abstract is as follows:

This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On the one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases that get entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults and highlights how these patterns make particular groups of low-status internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies may contribute to various privacy and security-related harms. The article then discusses three scenarios in which big data – including data gathered from social media inputs – is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. As policymakers consider reforms, the article urges greater attention to impacts on low-income persons and communities.


NickGrossman.is | 03.08.17

Complicity

Nick Grossman

D&S advisor Nick Grossman discusses trust in the sharing economy.

So, fast forward to our refund situation: now I no longer feel like I have any moral high ground to demand a formal close-out — in my mind, I was complicit in the shadiness when I was cool with fooling the apartment building. How is that any different than agreeing to sidestep the Airbnb platform rules?


D&S advisor Anil Dash discusses how interviews can exclude people from the tech industry.

When we mimic patterns from tech culture without knowing why we do them, we often take good ideas and turn them into terrible barriers.


Columbia Law Review | 03.07.17

The Taking Economy: Uber, Information, and Power

Ryan Calo, Alex Rosenblat

Ryan Calo and D&S researcher Alex Rosenblat analyze what they term the ‘taking economy’ of Uber.

Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value and raises a set of concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power. This Article, coauthored by a law professor and a technology ethnographer who studies the ride-hailing community, furnishes such a critique and indicates a path toward a meaningful response.

Commercial firms have long used what they know about consumers to shape their behavior and maximize profits. By virtue of sitting between consumers and providers of services, however, sharing economy firms have a unique capacity to monitor and nudge all participants—including people whose livelihood may depend on the platform. Much activity is hidden away from view, but preliminary evidence suggests that sharing economy firms may already be leveraging their access to information about users and their control over the user experience to mislead, coerce, or otherwise disadvantage sharing economy participants.

This Article argues that consumer protection law, with its longtime emphasis on asymmetries of information and power, is relatively well positioned to address this under-examined aspect of the sharing economy. But the regulatory response to date seems outdated and superficial. To be effective, legal interventions must (1) reflect a deeper understanding of the acts and practices of digital platforms and (2) interrupt the incentives of sharing economy firms to abuse their position.


NickGrossman.is | 03.06.17

Flexing the platform for good

Nick Grossman

D&S advisor Nick Grossman discusses constructive approaches for tech companies to support important issues.

As the White House continues to issue executive orders on issues like immigration that hit tech companies directly, and as issues like transgender rights — that are outside the pocketbook interests but may intersect with a company or community’s values — come up, it feels as though companies are going to continue to be under pressure to take public stands.


D&S advisor Anil Dash discusses Fake Markets that are dominated by few tech companies.

Worse, we’ve lost the ability to discern that a short-term benefit for some users that’s subsidized by an unsustainable investment model will lead to terrible long-term consequences for society. We’re hooked on the temporary infusion of venture capital dollars into vulnerable markets that we know are about to be remade by technological transformation and automation. The only social force empowered to anticipate or prevent these disruptions is policymakers, who are often too illiterate to understand how these technologies work, and who too desperately want the halo of appearing to be associated with “high tech”, the secular religion of America.

 


report | 03.01.17

How Youth Navigate the News Landscape

Mary Madden, Amanda Lenhart, Claire Fontaine

D&S researchers Mary Madden, Amanda Lenhart, and Claire Fontaine explore youth news consumption behavior on mobile and social media. The report reveals how young people are adapting to a changing media environment to access news they trust. Executive summary is below:

In 2017, what it means to “know what’s going on in the world” has become a hotly contested issue. Years of change and innovations in the journalism industry have radically transformed the way Americans consume, share and even produce their own forms of news. At a deeper level, the public’s eroding trust in journalistic institutions and the rise of a highly politicized networked digital media environment have underscored the urgent need to understand how these disruptions might evolve in the future.

As is often the case with technological revolutions, young people are on the front lines of change. They are deeply immersed in social media and mobile technologies in their daily lives, and are tasked with navigating an increasingly malleable media environment. And as researchers seek to understand the shifting behaviors and attitudes of today’s young news consumers, it has become increasingly important to reexamine the shifting boundaries of what counts as “news.” If we want to understand the place that news holds in young people’s lives, it is imperative that we understand their language, their conceptual models, and their frames of reference. These are the kinds of insights that interpretive qualitative research has the potential to surface.

In June and July of 2016, Knight Foundation commissioned a series of focus groups with 52 teenagers and young adults from across the United States to learn more about how young people conceptualize and consume news in digital spaces—with a focus on understanding the growing influence of mobile devices, social media and messaging apps. The research team conducted six exploratory focus groups of about 90 minutes each in three cities in the United States: Philadelphia, Chicago and Charlotte, North Carolina. Participants were between the ages of 14 and 24 and included an even mix of young men and women.


D&S lawyer-in-residence Rebecca Wexler analyzes the unreliability of video authentication in Slate.

When forensic scientists refuse to reveal details about how their experimental methods work, they erode trust in the ideal of scientific objectivity, and in the legitimacy of their results. There is already a dearth of trust surrounding forensic sciences. Just last fall, President Obama’s Council of Advisors on Science and Technology reported that even some long-practiced forensic disciplines, like bite-mark analysis and some methods for analyzing complex mixtures of DNA, are not foundationally valid.


D&S lawyer-in-residence Rebecca Wexler provides an analysis on trade secrecy in the criminal justice system. Abstract is below:

From policing to evidence to parole, data-driven algorithmic systems and other automated software programs are being adopted throughout the criminal justice system. The developers of these technologies often claim that the details about how the programs work are trade secrets and, as a result, cannot be disclosed in criminal cases. This Article turns to evidence law to examine the conflict between transparency and trade secrecy in the criminal justice system. It is the first comprehensive account of trade secret evidence in criminal cases. I argue that recognizing a trade secrets evidentiary privilege in criminal proceedings is harmful, ahistorical, and unnecessary. Withholding information from the accused because it is a trade secret mischaracterizes due process as a business competition.


Radio NZ | 02.15.17

Uber & Alex Rosenblat

Alex Rosenblat

D&S researcher Alex Rosenblat was interviewed by Radio NZ about Uber and the promises it makes to its drivers, such as flexible hours and freedom.


D&S artist-in-residence Ingrid Burrington explores the importance of domain names at NamesCon, an annual conference for the domain-names industry.

In addition to being crucial to making the web work, domain names are also a highly political pocket of the web, particularly shaped by the legacy of colonialism. Most of the underlying protocols that make the internet work—including DNS—are encoded in ASCII, which translates bits into letterforms, numbers, and punctuation marks. But ASCII’s letterforms only represent the Latin alphabet, limiting expression in domain names to Western languages (while arguing that a character encoding is an instrument of imperialism sounds bold, so does assuming that “text” is synonymous only with “English”).


D&S lawyer-in-residence Rebecca Wexler testifies about government oversight of forensic science laboratories in the State of New York.

I submit these comments to the Assembly Standing Committee on Codes; the Assembly Standing Committee on Judiciary and the Assembly Standing Committee on Oversight, Analysis and Investigation. Thank you for inviting my testimony on government oversight of forensic science laboratories in the State of New York. As a Resident at The Data and Society Research Institute, my work focuses on issues arising from data and technology in the criminal justice system. I want to draw your attention to trade secrets claims in forensic technologies that threaten criminal defendants’ rights to confront and cross-examine the evidence against them; to compulsory process to obtain evidence in their favor; and to due process.


D&S affiliate Keith Hiatt, Michael Kleinman, and D&S researcher Mark Latonero think critically about the use of technology as an all-encompassing solution in human rights spaces.

It’s important to acknowledge that, most of the time, the underlying problem human rights organisations are trying to solve isn’t technical. It’s often a bureaucratic, institutional, process or workflow problem, and technology won’t solve it (and might exacerbate it).


paper | 02.02.17

The Legacy of inBloom

Monica Bulger, Patrick McCormick, Mikaela Pitcan

D&S researcher Monica Bulger, with Patrick McCormick and D&S research analyst Mikaela Pitcan, writes this working paper detailing the “Legacy of inBloom”.

Although inBloom closed in 2014, it ignited a public discussion of student data privacy that resulted in the introduction of over 400 pieces of state-level legislation. The fervor over inBloom showed that policies and procedures were not yet where they needed to be for schools to engage in data-informed instruction. Industry members responded with a student data privacy pledge that detailed responsible practice. A strengthened awareness of the need for transparent data practices among nearly all of the involved actors is one of inBloom’s most obvious legacies.

Instead of a large-scale, open source platform that was a multi-state collaboration, the trend in data-driven educational technologies since inBloom’s closure has been toward closed, proprietary systems, adopted piecemeal. To date, no large-scale educational technology initiative has succeeded in American K-12 schools. This study explores several factors that contributed to the demise of inBloom and a number of important questions: What were the values and plans that drove inBloom to be designed the way it was? What were the concerns and movements that caused inBloom to run into resistance? How has the entire inBloom development impacted the future of edtech and student data?


D&S advisor John Palfrey speaks to his students about the President’s executive orders on immigration.

There has been much talk of universities and schools committing to be “sanctuaries” for students. There is merit in this idea but there is also a lot of debate as to what it means, in a legal sense. I would simplify how I see it: I aspire for our school to be a home for our students–a home away from home to be sure–one where our youth from every quarter and from every religion know that they will have every protection we can manage, just as we would offer our own children at home.


D&S affiliate Mimi Onuoha details the process of completely deleting data.

This overwriting process is a bit like painting a wall: If you start with a white wall and paint it red, there’s no way to erase the red. If you want the red gone or the wall returned to how it was, you either destroy the wall or you paint it over, several times, so that it’s white again.


D&S founder danah boyd responds to remarks by the Trump administration stating that their opposition is the media.

And now many of the actors most set on undermining institutionalized information intermediaries are in the most powerful office in the land. They are waging war on the media and the media doesn’t know what to do other than to report on it.


Other | 01.26.17

The Public Data Layer

Nick Grossman

D&S advisor Nick Grossman ponders the importance of and vision for the public data layer.

If we do this right, we can get smarter at policymaking, and design regulatory systems that have both greater effectiveness and lower costs of implementation and compliance.


Real Life Magazine | 01.26.17

Close Calls

Zara Rahman

D&S fellow Zara Rahman writes about how immigrant families use social media and digital technologies.

The consequence is that the home of our deeply personal information has gone from treasured letters stored in a box at our houses, to servers owned by corporate companies that we’ll never see. Those personal notes, the ways of showing our family that we’re happy and content in our new lives, despite what we’ve lost — they live online now. The more you share with that corporation, the stronger those family ties get. There is a third party in these relationships.


D&S researcher Claire Fontaine’s debut on Points, “Doing Screen Time,” resourcefully unwinds the apparent contradiction between anxiety around screen time at home and support for screen time at school: “Each produces and enables the other.” Looking into this dynamic is an occasion for asking, collectively, how we want to live.


Deutsche Welle | 01.25.17

Fake news is a red herring

Ethan Zuckerman

D&S advisor Ethan Zuckerman writes about fake news and the bigger problem behind fake news.

The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news makes that world slightly easier to navigate, but it doesn’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.


Harper’s Bazaar | 01.23.17

Trump: A Resister’s Guide

Kate Crawford

D&S affiliate Kate Crawford writes a letter to Silicon Valley on how to resist Trump.

You, the software engineers and leaders of technology companies, face an enormous responsibility. You know better than anyone how best to protect the millions who have entrusted you with their data, and your knowledge gives you real power as civic actors. If you want to transform the world for the better, here is your moment.

 


The New Inquiry | 01.23.17

Data Streams

Hito Steyerl, Kate Crawford

D&S affiliate Kate Crawford and Hito Steyerl converse about “NSA bros, dataveillance, apex predators, AI, and empathy machines.”


Rhizome | 01.19.17

The Valley and the Predator

Ingrid Burrington

D&S artist-in-residence Ingrid Burrington discusses associating weaponry, like drones, with art.

The impulse to pair a technology associated with automated extralegal killing of American citizens alongside “culture and the arts” is weird, but not entirely surprising—the vantage point of drones affords a particular aesthetic in addition to plausible deniability. The aerial perspective has appealed to artists for as long as it has appealed to generals and kings. That distant, presumed-objective view from nowhere, whether achieved via hot air balloon or low-orbit satellite, suggests a totality, a kind of coherence in defiance of the often-incoherent groundtruth of everyday life. For generals, coherence offers the possibility of tactical advantage. For artists (or at least good artists), it’s something to interrogate and take apart.


D&S advisor Ethan Zuckerman provides a transcript of his recent speech about journalism and civics.

One final thing: we have this tendency in journalism right now to feel very sorry for ourselves. This is a field that we are all enormously proud to be part of. This is a field that is harder and harder to make a living in, and I see more and more news organizations essentially saying, “You’re going to miss us. We’re going away. I just want to warn you.”


report | 01.18.17

Intimate Partner Digital Abuse

Michele Ybarra, Myeshia Price-Feeney, Amanda Lenhart, Kathryn Zickuhr

12% of U.S. internet users who have been in romantic relationships have experienced intimate partner digital abuse…

Digital tools are often an integral part of healthy romantic relationships. Romantic partners frequently use digital tools to connect with each other through text messages, photo-sharing, social media posts, and other online activities. These same digital tools can be used in unhealthy ways, facilitating negative behaviors such as monitoring, unwanted picture sharing, and abusive messages — both within the romantic relationship and after the relationship is over. Better understanding how often intimate partner digital abuse happens, to whom, and in what ways is critical to understanding the scope of the problem.

This report, part of a series of research reports on digital harassment and abuse, examines the prevalence and impact of intimate partner digital abuse. Findings are based upon the results of a nationally representative survey of 3,002 Americans 15 years of age and older conducted from May 17th through July 31st, 2016. Respondents were surveyed on either their landline or cell phone. Interviews were conducted in either English or Spanish. Findings in this report refer to the 2,810 respondents who have ever been in a romantic relationship.

This report, “Intimate Partner Digital Abuse” (press release), complements earlier reports covering the prevalence of online harassment and abuse more broadly, as well as nonconsensual image sharing.


writeup | 01.17.17

Suppressed Images

Heather Dewey-Hagborg, Chelsea Manning, Shoili Kanungo

D&S artist-in-residence Heather Dewey-Hagborg partnered with Chelsea Manning and Shoili Kanungo to create Suppressed Images, an illustrated series about Dewey-Hagborg and Manning’s collaboration.


D&S advisor Susan Crawford writes about communication infrastructure in rural parts of America.

This year, I’ll be traveling the US talking to people in scrappy communities who are building fiber on their own. They’re fed up with waiting for enormous incumbent communications companies to decide it’s in their corporate interests to invest in 21st-century communications capacity for Americans. These communities have run the numbers and looked at their economic development needs — as well as the possibilities for advanced healthcare, world-class educations, effective governance, energy management, and public safety that publicly controlled wholesale “street grids” of fiber make real — and they’ve come to the conclusion that if they hang back, they’ll become irrelevant.


D&S researcher Monica Bulger, with Patrick Burton, Brian O’Neill, and Elisabeth Staksrud, writes “Where policy and practice collide: Comparing United States, South African and European Union approaches to protecting children online”.

That children have a right to protection when they go online is an internationally well-established principle, upheld in laws that seek to safeguard children from online abuse and exploitation. However, children’s own transgressive behaviour can test the boundaries of this protection regime, creating new dilemmas for lawmakers the world over. This article examines the policy response from both the Global North and South to young people’s online behaviour that may challenge adult conceptions of what is acceptable, within existing legal and policy frameworks. It asks whether the ‘childhood innocence’ implied in much protection discourse is a helpful basis for promoting children’s rights in the digital age. Based on a comparative analysis of the emerging policy trends in Europe, South Africa and the United States, the article assesses the implications for policymakers and child welfare specialists as they attempt to redraw the balance between protecting children’s online safety and supporting their agency as digital citizens.


D&S advisor Anil Dash is featured in the newest episode of On Being.

Back in November, I got to sit down with the amazing Krista Tippett for a lengthy interview in front of an incredibly warm crowd in Easton, MD. Now, that interview has been edited down and is available as the latest episode of Krista’s hugely popular show, On Being.


D&S affiliate Gideon Litchfield writes a short story called ‘Democracy 3.0’.


D&S founder danah boyd’s Points piece was re-published for The Guardian. boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends towards content personalization on social media continue to fragment Americans along ideological lines.


D&S affiliate Wilneida Negrón details ten tech issues that will shape this year.

As we begin a new year and a new political administration takes office in the US, let’s take some time to consider some pressing issues that exist at the nexus of technology and social justice—and think about how we as social justice advocates can address them most effectively. Even amid so many unknowns, we can be certain that these issues are among those that will shape 2017 and the years and decades beyond it. And they will be central to the work of building a free, open, and transparent future.


D&S advisor Claudia Perlich discusses modeling, transparency, and machine learning in a new episode of the Partially Derivative podcast.

“One pitfall I see is that it’s easy from a social science perspective to condemn all data science as evil…but that ultimately doesn’t help advance the situation.”


D&S fellow Zara Rahman details her thoughts on owning one’s success and hard work.

It’s unrealistic and unfair to ignore all that work – to myself, and others. Citing luck and serendipity gives the impression that people in positions of influence will somehow magically find out about you and your interests and reach out to you – they (probably) won’t. It implies that if you’re doing this right, opportunities to work on things you want to be working on will just pop up out of the blue.


Medium | 01.07.17

The Four Freedoms, in 2017

Ethan Zuckerman

D&S advisor Ethan Zuckerman reflects on today’s political atmosphere and FDR’s speech on the four freedoms.

This is a scary moment, a time where it looks like the progress we’ve made around the world might reverse, where we go from a world that’s gotten much bigger to one that shrinks. The good news is that we get to decide how big a world we want to live in. We get to decide how to speak, how to listen and how to stand together against fear.


D&S founder danah boyd begins to sketch out how hacking culture evolved from playful efforts to game the media ecosystem to more complex and politicized projects of social engineering, propaganda, and activism in “Hacking the attention economy”.


In “How do you deal with a problem like ‘fake news?’” D&S researcher Robyn Caplan weighs in on the potential — and pitfalls — of efforts to curb Facebook’s fake news problem.


In “Why America is Self-Segregating,” D&S founder danah boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends towards content personalization on social media continue to fragment Americans along ideological lines.


In “Are There Limits to Online Free Speech?” D&S fellow Alice Marwick argues against simplistic binaries pitting free speech against censorship, looking at how the tech industry’s historic commitment to freedom of speech falls short in the face of organized harassment.


In “Did Media Literacy Backfire?” D&S founder danah boyd argues that the thorny problems of fake news and the spread of conspiracy theories have, in part, origins in efforts to educate people against misinformation. At the heart of the problem are deeper cultural divides that we must learn how to confront.


In “What’s Propaganda Got To Do With It?” Caroline Jack notes a resurgence in the popularity of “propaganda” as a judgment-laden label for a vast array of media ranging from fringe conspiracy theories to establishment news institutions. What work is this concept doing in efforts to conceptually navigate the contemporary media environment?


video | 01.04.17

Encrypt the Web

Baratunde Thurston

D&S advisor Baratunde Thurston discusses EFF’s work to encrypt the web and switch every site to https://.


Zararah.net | 01.04.17

2016 in books

Zara Rahman

D&S fellow Zara Rahman details the books she has read and enjoyed in 2016.


D&S artist-in-residence Heather Dewey-Hagborg’s work was profiled in this piece by Do Savannah.

In a related piece, artist Heather Dewey-Hagborg has created facial portraits based on the DNA profile of leftover genetic material found in discarded trash, like chewing gum and cigarette butts. The results are both fascinating and deeply creepy.


D&S advisor Charlton D. McIlwain writes an op-ed about the new white political movement.

We must see the new white narrative for what it really is, an attempt to refocus public attention and political capital away from people of color. Trump and many of those he’s chosen to lead his administration are the new white’s principal ambassadors. They take stock of the last few years as blacks fought against police brutality, Muslims battled religious persecution, and Hispanics defended themselves and their families from mass deportations. These representatives of the new white respond: Your concerns don’t matter as much as working-class folk (white people) for whom America’s promise was designed but has been denied.


D&S fellow Zara Rahman donates part of her Shuttleworth Flash Grant to the Human Rights Data Analysis Group and Global Voices.

So here’s the thing: I don’t think we need only innovative ideas or world-changing projects. We also need trust, communities, and skills. We need to strengthen and support existing infrastructure and communities. I worry that we’ve become far too fixated upon quickly implemented innovation and disruption, and that we’re taking a lot of important things for granted—things we rely upon that, unlike “innovative ideas”, take a lot of time and effort to build.


D&S affiliate Kate Crawford co-wrote, with Mike Ananny, this piece discussing transparency and algorithmic accountability.

Models for understanding and holding systems accountable have long rested upon ideals and logics of transparency. Being able to see a system is sometimes equated with being able to know how it works and govern it—a pattern that recurs in recent work about transparency and computational systems. But can “black boxes” ever be opened, and if so, would that ever be sufficient? In this article, we critically interrogate the ideal of transparency, trace some of its roots in scientific and sociotechnical epistemological cultures, and present 10 limitations to its application. We specifically focus on the inadequacy of transparency for understanding and governing algorithmic systems and sketch an alternative typology of algorithmic accountability grounded in constructive engagements with the limitations of transparency ideals.


D&S advisor Anil Dash asserts that business leaders have to stand up against proposed abuses and violations from the Trump administration.


report | 12.13.16

Nonconsensual Image Sharing

Amanda Lenhart, Michele Ybarra, Myeshia Price-Feeney

Media coverage of revenge porn largely focuses on instances when celebrities have had private nude or explicit photos or videos made public without their consent, but this experience is not limited to the famous and newsworthy. Roughly 3% of all online Americans have had someone threaten to post nude or nearly nude photos or videos of them online to hurt or embarrass them, and 2% of online Americans have had someone actually post a photo of them online without their permission. Taken together, 4% of internet users—one in 25 online Americans—have either had sensitive images posted without their permission or had someone threaten to post photos of them.

This report, “Nonconsensual Image Sharing” (press release), complements an earlier report covering the prevalence of online harassment and abuse more broadly, as well as a subsequent report on intimate partner digital abuse.


D&S fellow Zara Rahman details the year’s ‘data-driven confusion’ and argues for a responsible data approach to both practice and comprehension.

We must take a responsible data approach to advocacy – address gaps in literacy proactively, be rigorous in our methods, and maintain credibility, especially on important issues. Nowadays, thanks to the speed and amplification of sources afforded to us via technology, analyses and “facts” will spread faster than before. Understanding the critical limitations of data and information is going to become ever more important in years to come.


D&S advisor Ethan Zuckerman discusses his long-time friendship with a Trump supporter.


D&S advisor Hilary Mason presents her process for discovering changes in markets.


Data & Society Research Institute 36 West 20th Street, 11th Floor
New York, NY 10011, Tel: 646.832.2038

Reporters and media:
[email protected]

General inquiries:
[email protected]

Unless otherwise noted this site and its contents are licensed under a Creative Commons Attribution 3.0 Unported license.