magazine article | 05.24.18

Effortless Slippage

Ingrid Burrington

In e-flux, Data & Society INFRA Lead Ingrid Burrington contemplates the maps of the internet.

“The historical maps made of the internet—and, later, the maps of the world made by the internet—are both reflection and instrument of the ideologies and entanglements of the networked world. They are one way we might navigate the premise of the networked citizen and her obligations to her fellow travelers in the networked landscape.”


report | 05.22.18

The Oxygen of Amplification

Whitney Phillips

The Oxygen of Amplification draws on in-depth interviews by scholar Whitney Phillips to show how news media were hijacked between 2016 and 2018 to amplify the messages of hate groups.

Offering extremely candid comments from mainstream journalists, the report provides a snapshot of an industry caught between the pressure to deliver page views, the impulse to cover manipulators and “trolls,” and the disgust (expressed in interviewees’ own words) at accidentally propagating extremist ideology.


report | 05.16.18

Searching for Alternative Facts

Francesca Tripodi

Searching for Alternative Facts is an ethnographic account drawn directly from Dr. Francesca Tripodi’s research within upper-middle class conservative Christian communities in Virginia in 2017. Dr. Tripodi uses Christian practices of Biblical interpretation as a lens for understanding the relationship between so-called “alternative” or “fake news” sources and contemporary conservative political thought.


Data & Society Postdoctoral Scholar Julia Ticona and Research Analyst Alexandra Mateescu investigate the consequences of “visibility” in carework apps.

“Based on a discourse analysis of carework platforms and interviews with workers using them, we illustrate that these platforms seek to formalize employment relationships through technologies that increase visibility. We argue that carework platforms are “cultural entrepreneurs” that create and maintain cultural distinctions between populations of workers, and institutionalize those distinctions into platform features. Ultimately, the visibility created by platforms does not realize the formalization of employment relationships, but does serve the interests of platform companies and clients and exacerbate existing inequalities for workers.”


On April 26-27, Data & Society hosted a multidisciplinary workshop on AI and Human Rights. In this Points piece, Data + Human Rights Research Lead Mark Latonero and Research Analyst Melanie Penagos summarize discussions from the workshop.

“Can the international human rights framework effectively inform, shape, and govern AI research, development, and deployment?”


Search plays a unique role in modern online information systems.

Unlike social media, where users primarily consume algorithmically curated feeds of information, interaction with a search engine typically begins with a query or question in an effort to seek new information.

However, not all search queries are equal. There are many search terms for which the available relevant data is limited, non-existent, or deeply problematic.

We call these “data voids.”

Data Voids: Where Missing Data Can Easily Be Exploited explores different types of data voids; the challenges that search engines face when they encounter queries over spaces where data voids exist; and the ways data voids can be exploited by those with ideological, economic, or political agendas.

Authors

Michael Golebiewski, Microsoft Bing

danah boyd, Microsoft Research and Data & Society


3sat TV interviewed Data & Society Founder and President danah boyd at re:publica 2018 about gaining control of our data privacy. The video is in German.



points | 04.25.18

Proceed With Caution

Kadija Ferryman, Elaine O. Nsoesie

After the Cambridge Analytica scandal, can internet data be used ethically for research? Data & Society Postdoctoral Scholar Kadija Ferryman and Elaine O. Nsoesie, PhD, of the Institute for Health Metrics and Evaluation, recommend “proceeding with caution” when it comes to internet data and precision medicine.

“Despite the public attention and backlash stemming from the Cambridge Analytica scandal — which began with an academic inquiry and resulted in at least 87 million Facebook profiles being disclosed — researchers argue that Facebook and other social media data can be used to advance knowledge, as long as these data are accessed and used in a responsible way. We argue that data from internet-based applications can be a relevant resource for precision medicine studies, provided that these data are accessed and used with care and caution.”


For the book “New Technologies for Human Rights Law and Practice,” Data & Society Researcher Mark Latonero raises privacy concerns when big data analytics are used in a human rights context.

“This chapter argues that the use of big data analytics in human rights work creates inherent risks and tensions around privacy. The techniques that comprise big data collection and analysis can be applied without the knowledge, consent, or understanding of data subjects. Thus, the use of big data analytics to advance or protect human rights risks violating privacy rights and norms and may lead to individual harms. Indeed, data analytics in the human rights monitoring context has the potential to produce the same ethical dilemmas and anxieties as inappropriate state or corporate surveillance. Therefore, its use may be difficult to justify without sufficient safeguards. The chapter concludes with a call to develop guidelines for the use of big data analytics in human rights that can help preserve the integrity of human rights monitoring and advocacy.”


primer | 04.18.18

Algorithmic Accountability: A Primer

Robyn Caplan, Joan Donovan, Lauren Hanson, and Jeanna Matthews

Algorithmic Accountability examines the process of assigning responsibility for harm when algorithmic decision-making results in discriminatory and inequitable outcomes.

The primer–originally prepared for the Congressional Progressive Caucus’ Tech Algorithm Briefing–explores the trade-offs in debates about algorithms and accountability across several key ethical dimensions, including fairness and bias; opacity and transparency; and lack of standards for auditing.


For Data & Society Points, Visiting Scholar Anne Washington breaks down the numbers behind Facebook and Cambridge Analytica.

“How did the choices made by only 270,000 Facebook users affect millions of people? How is it possible that the estimate of those affected changed from 50 million to 87 million so quickly? As a professor of data policy, I am interested in how information flows within organizations. In the case of Facebook and Cambridge Analytica, I was curious why this number was so inexact.”


For Slate, Data & Society Researcher Jacob Metcalf argues that we should be more concerned about behavioral models developed by entities like Cambridge Analytica, which can be traded between political entities, rather than the voter data itself.

“In other words, the one thing we can be sure of about psychographic profiling is that it provided one more way to transfer knowledge and economic value between campaigns and organizations.”


In the wake of Cambridge Analytica, Data & Society Researcher Jacob Metcalf argues that the real risk is the behavioral models that have been developed from Facebook users’ data.

“But focusing solely on the purloined data is a mistake. Much more important are the behavioral models Cambridge Analytica built from the data. Even though the company claims to have deleted the data sets in 2015 in response to Facebook’s demands, those models live on, and can still be used to target highly specific groups of voters with messages designed to leverage their psychological traits. Although the stolen data sets represent a massive collection of individual privacy harms, the models are a collective harm, and far more pernicious.”


In this study, Data & Society Founder and President danah boyd, Affiliate Alice Marwick, and Researcher Mikaela Pitcan ask: how do young people of low socio-economic status in NYC manage their impressions online using tactics of respectability politics?

“This paper analyzes how upwardly mobile young people of low socio-economic status in New York City manage impressions online by adhering to normative notions of respectability. Our participants described how they present themselves on social media by self-censoring, curating a neutral image, segmenting content by platform, and avoiding content and contacts coded as lower class. Peers who post sexual images, primarily women, were considered unrespectable and subject to sexual shaming. These strategies reinforce racist and sexist notions of appropriate behavior, simultaneously enabling and limiting participants’ ability to succeed. We extend the impression management literature to examine how digital media mediates the intersection of class, gender, and race.”


report | 04.03.18

Refugee Connectivity

Mark Latonero, Danielle Poole, and Jos Berens

Data & Society and the Harvard Humanitarian Initiative’s “Refugee Connectivity: A Survey of Mobile Phones, Mental Health, and Privacy at a Syrian Refugee Camp in Greece” provides new evidence of the critical role internet connectivity and mobile devices play in the lives and wellbeing of this population. Findings are based on a survey of 135 adults amongst the 750 residents at Ritsona Refugee Camp in Greece.


Data & Society Postdoctoral Scholar Julia Ticona and Data & Society Research Analyst Alexandra Mateescu co-authored an op-ed for Fast Company about the safety of workers who rely on digital platforms to stay employed.

“For the past year, we’ve been interviewing nannies, babysitters, elder care workers, and housecleaners across the U.S. who use platforms like Handy, TaskRabbit, and the in-home care provider platform Care.com to do care and cleaning work, in an effort to better understand how platforms are shaping domestic work. Along the way, we have found that, in many cases, the aggregation of individual data leads not to more accountability and justice, but rather forces workers to make trade-offs between visibility and vulnerability.”


Data & Human Rights Research Lead Mark Latonero investigates the impact of digitally networked technologies on the safe passage of refugees and migrants.

“…in making their way to safe spaces, refugees rely not only on a physical but increasingly also digital infrastructure of movement. Social media, mobile devices, and similar digitally networked technologies comprise this infrastructure of ‘digital passages’—sociotechnical spaces of flows in which refugees, smugglers, governments, and corporations interact with each other and with new technologies.”


Data & Society Researcher Jacob Metcalf co-authored an op-ed in Slate discussing how giving researchers more access to Facebook users’ data could prevent unethical data mining.

“This case raises numerous complicated ethical and political issues, but as data ethicists, one issue stands out to us: Both Facebook and its users are exposed to the downstream consequences of unethical research practices precisely because like other major platforms, the social network does not proactively facilitate ethical research practices in exchange for access to data that users have consented to share.”


book | 03.01.18

Trump and the Media

Edited by Pablo J. Boczkowski and Zizi Papacharissi

D&S Founder danah boyd and Researcher Robyn Caplan contributed to the book “Trump and the Media,” which examines the role the media played in the election of Donald Trump.

Other contributors include: Mike Ananny, Chris W. Anderson, Rodney Benson, Pablo J. Boczkowski, Michael X. Delli Carpini, Josh Cowls, Susan J. Douglas, Keith N. Hampton, Dave Karpf, Daniel Kreiss, Seth C. Lewis, Zoey Lichtenheld, Andrew L. Mendelson, Gina Neff, Zizi Papacharissi, Katy E. Pearce, Victor Pickard, Sue Robinson, Adrienne Russell, Ralph Schroeder, Michael Schudson, Julia Sonnevend, Keren Tenenboim-Weinblatt, Tina Tucker, Fred Turner, Nikki Usher, Karin Wahl-Jorgensen, Silvio Waisbord, Barbie Zelizer.


Read and/or watch Data & Society Founder and President danah boyd’s keynote talk at SXSW EDU 2018.

“I get that many progressive communities are panicked about conservative media, but we live in a polarized society and I worry about how people judge those they don’t understand or respect. It also seems to me that the narrow version of media literacy that I hear as the “solution” is supposed to magically solve our political divide. It won’t. More importantly, as I’m watching social media and news media get weaponized, I’m deeply concerned that the well-intended interventions I hear people propose will backfire, because I’m fairly certain that the crass versions of critical thinking already have.”


report | 03.06.18

Spectrum of Trust in Data

Claire Fontaine and Kinjal Dave

New report finds providing school choice data to parents does not equalize educational opportunity, but rather replicates and perpetuates existing inequalities.

Data & Society Researcher Dr. Claire Fontaine and Research Assistant Kinjal Dave performed a qualitative, semi-structured, interview-based study with a socio-economically, racially, and geographically diverse group of 30 New York City parents and guardians between May and November 2017. Interviews focused on experiences of school choice, and on data and information sources.


Data & Society Researcher Alex Rosenblat unveils the impact of Uber’s new driving limit policy.

“These moves from Uber and Lyft seem to align with their gig-economy model of employment, which structures work as an individual pursuit and individual liability. But even this sell is misleading. While, for many drivers, the idea of being independent at work is very appealing, their ability to make entrepreneurial decisions is consistently constrained by the ride-hail apps’ nudges and other algorithmic management, rules, external costs, and wage cuts.”


Data & Society INFRA Lead Ingrid Burrington reflects on her visit to Spaceport America.

“It’s a quintessential American desert trope: the future as rehearsal rather than reality. Many promises for technologies of future urbanism start as desert prototypes.”


working paper | 03.02.18

The Intuitive Appeal of Explainable Machines

Andrew Selbst, Solon Barocas

This paper is a response to calls for explainable machines by Data & Society Postdoctoral Scholar Andrew Selbst and Affiliate Solon Barocas.

“We argue that calls for explainable machines have failed to recognize the connection between intuition and evaluation and the limitations of such an approach. A belief in the value of explanation for justification assumes that if only a model is explained, problems will reveal themselves intuitively. Machine learning, however, can uncover relationships that are both non-intuitive and legitimate, frustrating this mode of normative assessment. If justification requires understanding why the model’s rules are what they are, we should seek explanations of the process behind a model’s development and use, not just explanations of the model itself.”


The rollout of Electronic Visit Verification (EVV) for Medicaid recipients has serious privacy implications, argues Data & Society Researcher Jacob Metcalf.

“So why should we be worried about rules that require caregivers to provide an electronic verification of the labor provided to clients? Because without careful controls and ethical design thinking, surveillance of caregiver labor is also functionally surveillance of care recipients, especially when family members are employed as caregivers.”


report | 02.26.18

Fairness in Precision Medicine

Kadija Ferryman and Mikaela Pitcan

Fairness in Precision Medicine is the first report to deeply examine the potential for biased and discriminatory outcomes in the emerging field of precision medicine; “the effort to collect, integrate and analyze multiple sources of data in order to develop individualized insights about health and disease.”


visualization | 02.26.18

Precision Medicine National Actor Map

Kadija Ferryman, Mikaela Pitcan

The Precision Medicine National Actor Map is the first visualization of the three major national precision medicine projects–All of Us Research Program, My Research Legacy, Project Baseline–and the network of institutions connected to them as grantees and sub-grantees.

The map was developed for the Fairness in Precision Medicine initiative at Data & Society.


report | 02.26.18

What is Precision Medicine?

Kadija Ferryman and Mikaela Pitcan

In this Points essay, Data & Society Postdoctoral Scholar Francesca Tripodi explains how communities might define media literacy differently from one another.

“In that moment, I realized that this community of Evangelical Christians were engaged in media literacy, but used a set of reading practices secular thinkers might be unfamiliar with. I’ve seen hundreds of Conservative Evangelicals apply the same critique they use for the Bible, arguably a postmodern method of unpacking a text, to mainstream media — favoring their own research on topics rather than trusting media authorities.”


Education Week interviewed Data & Society Media Manipulation Lead Joan Donovan about misinformation spread after the Parkland shooting.

“The problem with amplified speech online is that something like this crisis-actor narrative gets a lot of reach and attention, then the story becomes about that, and not the shooting or what these students are doing. I would suggest that media only mentions these narratives to say that this is wrong and that students need to be believed.”


report | 02.21.18

The Promises, Challenges, and Futures of Media Literacy

Monica Bulger and Patrick Davison

This report responds to the “fake news” problem by evaluating the successes and failures of recent media literacy efforts while pointing towards next steps for educators, legislators, technologists, and philanthropists.


report | 02.21.18

Dead Reckoning

Robyn Caplan, Lauren Hanson, and Joan Donovan

New Data & Society report clarifies current uses of “fake news” and analyzes four specific strategies for intervention.


Jacobin | 02.20.18

The New Taylorism

Richard Salame

Data & Society Operations Assistant Richard Salame applies Taylorism to Amazon’s new wristbands that track workers’ movements.

“Amazon’s peculiar culture notwithstanding, the wristbands in many ways don’t offer anything new, technologically or conceptually. What has changed is workers’ ability to challenge this kind of surveillance.”


Data & Society Postdoctoral Scholar Andrew Selbst argues for regulations in big data policing.

“The way police are adopting and using these technologies means more people of color are arrested, jailed, or physically harmed by police, while the needs of communities being policed are ignored.”


How do algorithms & data-driven tech induce similarity across an industry? Data & Society Researcher Robyn Caplan and Founder & President danah boyd trace Facebook’s impact on news media organizations and journalists.

“This type of analysis sheds light on how organizational contexts are embedded into algorithms, which can then become embedded within other organizational and individual practices. By investigating technical practices as organizational and bureaucratic, discussions about accountability and decision-making can be reframed.”


points | 02.14.18

Health Data Rush

Kadija Ferryman

As data becomes more prevalent in the health world, Data & Society Postdoctoral Scholar Kadija Ferryman urges us to consider how we will regulate its collection and usage.

“As precision medicine rushes on in the US, how can we understand where there might be tensions between fast-paced technological advancement and regulation and oversight? What regulatory problems might emerge? Are our policies and institutions ready to meet these challenges?”


What is the impact of knowing your genetic risk information? Data & Society Researcher Mikaela Pitcan explores the effects for Points.

“As genetic risk and other health data become more widely available, insights from research and early clinical adoption will expand the growing and data-centric field of precision medicine. However, just like previous forms of medical intervention, precision medicine aims to enhance life, decrease risk of disease, improve treatment, and though data plays a big role, the success of the field depends heavily upon clinician and patient interactions.”


The New Inquiry | 02.09.18

Artificial Advancements

Taeyoon Choi

In this essay, D&S Fellow Taeyoon Choi interrogates technology designed for those with disabilities.

“Even with the most advanced technology, disability can not and—sometimes should not—disappear from people. There are disabled people whose relationship with their own bodily functions and psychological capabilities cannot be considered in a linear movement from causation to result, where narratives of technology as cure override the real varieties in people’s needs and conditions and falsely construct binary states—one or the other, abled or disabled—shadowing everything between or outside of those options.”


Increment | 02.02.18

A rare and toxic age

Ingrid Burrington

In this essay, Data & Society INFRA Lead Ingrid Burrington grounds technological development in the environment.

“While the aforementioned narratives are strategic in their own worlds, they tend to maintain the premise that the environmental cost of technology is still orthogonal or an externality to the more diffuse, less obviously material societal implications of living in an Information Age. The politics of a modern world increasingly defined by data mining may only exist because of literal open-pit mining, but the open pit is more often treated as a plot pivot than a natural through-line: Sure, you feel bad about a social media site being creepy, but behold, the hidden environmental devastation wrought by your iPhone – doesn’t that make you feel even worse?”


Subscribe to the Data & Society newsletter

Support us

Donate
Data & Society Research Institute 36 West 20th Street, 11th Floor
New York, NY 10011, Tel: 646.832.2038

Reporters and media:
[email protected]

General inquiries:
[email protected]

Unless otherwise noted this site and its contents are licensed under a Creative Commons Attribution 3.0 Unported license.