Featured




report | 10.17.18

Weaponizing the Digital Influence Machine

Anthony Nadler, Matthew Crain, and Joan Donovan

Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech identifies the technologies, conditions, and tactics that enable today’s digital advertising infrastructure to be weaponized by political and anti-democratic actors.


The New York Times | 10.12.18

When Your Boss Is an Algorithm

Alex Rosenblat

Building on the research for her book Uberland: How Algorithms Are Rewriting the Rules of Work, Data & Society Researcher Alex Rosenblat explains algorithmic management in the gig economy.

“Data and algorithms are presented as objective, neutral, even benevolent: Algorithms gave us super-convenient food delivery services and personalized movie recommendations. But Uber and other ride-hailing apps have taken the way Silicon Valley uses algorithms and applied it to work, and that’s not always a good thing.”


In Governing Artificial Intelligence: Upholding Human Rights & Dignity, Mark Latonero shows how human rights can serve as a “North Star” to guide the development and governance of artificial intelligence.

The report draws the connections between AI and human rights; reframes recent AI-related controversies through a human rights lens; and reviews current stakeholder efforts at the intersection of AI and human rights.


report | 09.18.18

Alternative Influence

Rebecca Lewis

Alternative Influence: Broadcasting the Reactionary Right on YouTube presents data from approximately 65 political influencers across 81 channels to identify the “Alternative Influence Network” (AIN): an alternative media system that adopts the techniques of brand influencers to build audiences and “sell” them political ideology.


On September 13th, Data & Society Founder and President danah boyd gave the keynote speech at the Online News Association Conference. Read the transcript of her talk on Points.

“Now, more than ever, we need a press driven by ideals determined to amplify what is most important for enabling an informed citizenry.”


points | 09.10.18

Technology’s Impact on Infrastructure is a Health Concern

Mikaela Pitcan, Alex Rosenblat, Mary Madden, Kadija Ferryman

Increasingly, technology’s impact on infrastructure is becoming a health concern. In this Points piece, Data & Society Researchers Mikaela Pitcan, Alex Rosenblat, Mary Madden, and Kadija Ferryman tease out why this intersection warrants further research.

“However, there is an urgent need to understand the interdependencies between technology, infrastructure and health, and how these relationships affect Americans’ ability to live the healthiest lives possible. How can we support design, decision-making, and governance of our infrastructures in order to ensure more equitable health outcomes for all Americans?”


2017-2018 Fellow Jeanna Matthews and Research Analyst Kinjal Dave respond to Deji Olukotun’s story about an algorithmic tennis match.

 “The answer can’t be derived from the past alone: It depends on what we collectively decide about the future, about what justice looks like, about leveling the playing field in sports and in life. As in Olukotun’s story, humans and computers will be working together to pick winners and losers. We need to collectively decide on and enforce the rules they will follow. We need the ability to understand, challenge, and audit the decisions. A level playing field won’t be the future unless we insist on it.”


Drawing on conclusions from the Data & Society report Beyond Disruption, Researcher Alexandra Mateescu discusses surveillance of domestic care workers online.

“Online marketplaces may not be the root cause of individual employers’ biases, but their design is not neutral. They are built with a particular archetype of what an “entrepreneurial” domestic worker looks like—one who feels at home in the world of apps, social media, and online self-branding—and ultimately replicates and can even exacerbate many of the divisions that came with our predigital workplaces. As platform companies gain growing power over the hiring processes of a whole industry, they will need to actively work against the embedded inequalities in the markets they now mediate.”


This report by Data & Society Researcher Bonnie Tijerina and Michael Zimmer is the culmination of gatherings that brought together different privacy practitioners to discuss digital privacy for libraries.

“While the recent surge in privacy-related activities within the library community is welcome, we see a gap in the conversations we are having about privacy and our digital presence – a knowledge gap, a lack of shared vocabulary, disparate skill sets, and varied understanding. This gap prevents inclusion across the profession and lacks clarity for those responsible for building tools and licensing products.”


conference | 08.18.18

Future Perfect 2018

Curated by Ingrid Burrington

On Friday, June 8, the second annual Future Perfect gathering at Data & Society brought together individuals from a variety of world-building disciplines—from art and fiction to architecture and science—to explore the uses, abuses, and paradoxes of speculative futures.


How can we trace the spread of disinformation by tracking metadata? Data & Society Research Affiliate Amelia Acker explains.

“One way of more fully understanding the data craftwork of disinformation on social media platforms is by reading the metadata just as closely as the algorithms do.”


In an op-ed for The New York Times, Data & Society Researcher Alex Rosenblat shatters the narrative that Uber encapsulates the entire gig economy.

“But this industry has, until recently, operated largely informally, with jobs secured by word-of-mouth. That’s changing, as employers are increasingly turning to Uber-like services to find nannies, housecleaners and other care workers. These new gig economy companies, while making it easier for some people to find short-term work, have created hardships for others, and may leave many experienced care workers behind.”


points | 06.27.18

5 Star Service: A curated reading list

Alexandra Mateescu, Julia Ticona

In this reading list, Data & Society Researcher Alexandra Mateescu and Postdoctoral Scholar Julia Ticona provide a pathway for deeper investigations into themes such as gender inequality and algorithmic visibility in the gig economy.

“This list is meant for readers of Beyond Disruption who want to dig more deeply into some of the key areas explored in its pages. It isn’t meant to be exhaustive, but rather give readers a jumping off point for their own investigations.”


report | 06.27.18

Beyond Disruption

Julia Ticona, Alexandra Mateescu, Alex Rosenblat

Drawn from the experiences of U.S. ridehail, care, and cleaning platform workers, “Beyond Disruption” demonstrates how technology reshapes the future of labor.


Data & Society INFRA Lead Ingrid Burrington traces the history of Silicon Valley and its residents.

“Now San Jose has an opportunity to lift up these workers placed at the bottom of the tech industry as much as the wealthy heroes at its top. If Google makes good on the “deep listening” it has promised, and if San Jose residents continue to challenge the company’s vague promises, the Diridon project might stand a chance of putting forth a genuinely visionary alternative to the current way of life in the Santa Clara Valley and the founder-centric, organized-labor-allergic ideology of Silicon Valley. If it does, San Jose might yet justify its claim to be the center of Silicon Valley—if not as its capital, at least as its heart.”


For Points, Data & Society Postdoctoral Scholar Caroline Jack reviews the history of advertising imaginaries.

“The question of what protections ads themselves deserve, and to what degree people deserve to be protected from ads, is ripe for reconsideration.”


In this Medium post, Founder and President danah boyd reflects on the current state of journalism and offers next steps.

“Contemporary propaganda isn’t about convincing someone to believe something, but convincing them to doubt what they think they know.”


Data & Society Research Analyst Melanie Penagos summarizes three blog posts that grew out of Data & Society’s AI & Human Rights Workshop in April 2018.

“Following Data & Society’s AI & Human Rights Workshop in April, several participants continued to reflect on the convening and comment on the key issues that were discussed. The following is a summary of articles written by workshop attendees Bendert Zevenbergen, Elizabeth Eagen, and Aubra Anthony.”


How will the introduction of AI into the field of medicine affect the doctor-patient relationship? Data & Society Fellow Claudia Haupt identifies some legal questions we should be asking.

“I contend that AI will not entirely replace human doctors (for now) due to unresolved issues in transposing diagnostics to a non-human context, including both limits on the technical capability of existing AI and open questions regarding legal frameworks such as professional duty and informed consent.”


How do people decide what to trust? Data & Society Postdoctoral Scholar Francesca Tripodi shares insights from her research into conservative news practices.

“While not all Christians are conservative nor all conservatives religious, there is a clear connection between how the process of scriptural inference trickles down into conservative methods of inquiry. Favoring the original text of the Constitution is closely tied to the practices of ‘constitutional conservatism,’ and currently members in all three branches of the U.S. government rely on practices of scriptural inference to make important political decisions.”



The Guardian | 06.01.18

The Case for Quarantining Extremist Ideas

danah boyd, Joan Donovan

Data & Society President and Founder danah boyd and Media Manipulation Research Lead Joan Donovan challenge newsrooms to practice “strategic silence” to avoid amplifying extremist messaging.

“Editors used to engage in strategic silence – set agendas, omit extremist ideas and manage voices – without knowing they were doing so. Yet the online context has enhanced extremists’ abilities to create controversies, prompting newsrooms to justify covering their spectacles. Because competition for audience is increasingly fierce and financially consequential, longstanding newsroom norms have come undone. We believe that journalists do not rebuild reputation through a race to the bottom. Rather, we think that it’s imperative that newsrooms actively take the high ground and re-embrace strategic silence in order to defy extremists’ platforms for spreading hate.”


magazine article | 05.24.18

Effortless Slippage

Ingrid Burrington

In e-flux, Data & Society INFRA Lead Ingrid Burrington contemplates the maps of the internet.

“The historical maps made of the internet—and, later, the maps of the world made by the internet—are both reflection and instrument of the ideologies and entanglements of the networked world. They are one way we might navigate the premise of the networked citizen and her obligations to her fellow travelers in the networked landscape.”


report | 05.22.18

The Oxygen of Amplification

Whitney Phillips

The Oxygen of Amplification draws on in-depth interviews conducted by scholar Whitney Phillips to show how the news media was hijacked from 2016 to 2018 to amplify the messages of hate groups.

Offering extremely candid comments from mainstream journalists, the report provides a snapshot of an industry caught between the pressure to deliver page views, the impulse to cover manipulators and “trolls,” and the disgust (expressed in interviewees’ own words) at accidentally propagating extremist ideology.


report | 05.16.18

Searching for Alternative Facts

Francesca Tripodi

Searching for Alternative Facts is an ethnographic account drawn directly from Dr. Francesca Tripodi’s research within upper-middle-class conservative Christian communities in Virginia in 2017. Dr. Tripodi uses Christian practices of Biblical interpretation as a lens for understanding the relationship between so-called “alternative” or “fake news” sources and contemporary conservative political thought.


Data & Society Postdoctoral Scholar Julia Ticona and Research Analyst Alexandra Mateescu investigate the consequences of “visibility” in carework apps.

“Based on a discourse analysis of carework platforms and interviews with workers using them, we illustrate that these platforms seek to formalize employment relationships through technologies that increase visibility. We argue that carework platforms are “cultural entrepreneurs” that create and maintain cultural distinctions between populations of workers, and institutionalize those distinctions into platform features. Ultimately, the visibility created by platforms does not realize the formalization of employment relationships, but does serve the interests of platform companies and clients and exacerbate existing inequalities for workers.”


On April 26-27, Data & Society hosted a multidisciplinary workshop on AI and Human Rights. In this Points piece, Data + Human Rights Research Lead Mark Latonero and Research Analyst Melanie Penagos summarize discussions from the workshop.

“Can the international human rights framework effectively inform, shape, and govern AI research, development, and deployment?”


Search plays a unique role in modern online information systems.

Unlike social media, where users primarily consume algorithmically curated feeds of information, a search engine session typically begins with a query or question posed in an effort to find new information.

However, not all search queries are equal. There are many search terms for which the available relevant data is limited, non-existent, or deeply problematic.

We call these “data voids.”

Data Voids: Where Missing Data Can Easily Be Exploited explores different types of data voids; the challenges that search engines face when they encounter queries over spaces where data voids exist; and the ways data voids can be exploited by those with ideological, economic, or political agendas.

Authors

Michael Golebiewski, Microsoft Bing

danah boyd, Microsoft Research and Data & Society


3sat TV interviewed Data & Society Founder and President danah boyd at re:publica 2018 about gaining control of our data privacy. The video is in German.

 


points | 04.25.18

Proceed With Caution

Kadija Ferryman, Elaine O. Nsoesie

After the Cambridge Analytica scandal, can internet data be used ethically for research? Data & Society Postdoctoral Scholar Kadija Ferryman and Elaine O. Nsoesie of the Institute for Health Metrics and Evaluation recommend “proceeding with caution” when it comes to internet data and precision medicine.

“Despite the public attention and backlash stemming from the Cambridge Analytica scandal — which began with an academic inquiry and resulted in at least 87 million Facebook profiles being disclosed — researchers argue that Facebook and other social media data can be used to advance knowledge, as long as these data are accessed and used in a responsible way. We argue that data from internet-based applications can be a relevant resource for precision medicine studies, provided that these data are accessed and used with care and caution.”


For the book “New Technologies for Human Rights Law and Practice,” Data & Society Researcher Mark Latonero raises privacy concerns when big data analytics are used in a human rights context.

“This chapter argues that the use of big data analytics in human rights work creates inherent risks and tensions around privacy. The techniques that comprise big data collection and analysis can be applied without the knowledge, consent, or understanding of data subjects. Thus, the use of big data analytics to advance or protect human rights risks violating privacy rights and norms and may lead to individual harms. Indeed, data analytics in the human rights monitoring context has the potential to produce the same ethical dilemmas and anxieties as inappropriate state or corporate surveillance. Therefore, its use may be difficult to justify without sufficient safeguards. The chapter concludes with a call to develop guidelines for the use of big data analytics in human rights that can help preserve the integrity of human rights monitoring and advocacy.”


primer | 04.18.18

Algorithmic Accountability: A Primer

Robyn Caplan, Joan Donovan, Lauren Hanson, and Jeanna Matthews

Algorithmic Accountability examines the process of assigning responsibility for harm when algorithmic decision-making results in discriminatory and inequitable outcomes.

The primer, originally prepared for the Congressional Progressive Caucus’ Tech Algorithm Briefing, explores the trade-offs and debates about algorithms and accountability across several key ethical dimensions, including fairness and bias; opacity and transparency; and the lack of standards for auditing.


For Data & Society Points, Visiting Scholar Anne Washington breaks down the numbers behind Facebook and Cambridge Analytica.

“How did the choices made by only 270,000 Facebook users affect millions of people? How is it possible that the estimate of those affected changed from 50 million to 87 million so quickly? As a professor of data policy, I am interested in how information flows within organizations. In the case of Facebook and Cambridge Analytica, I was curious why this number was so inexact.”


For Slate, Data & Society Researcher Jacob Metcalf argues that we should be more concerned about behavioral models developed by entities like Cambridge Analytica, which can be traded between political entities, rather than the voter data itself.

“In other words, the one thing we can be sure of about psychographic profiling is that it provided one more way to transfer knowledge and economic value between campaigns and organizations.”


In the wake of Cambridge Analytica, Data & Society Researcher Jacob Metcalf argues that the real risk lies in the behavioral models that have been developed from Facebook users’ data.

“But focusing solely on the purloined data is a mistake. Much more important are the behavioral models Cambridge Analytica built from the data. Even though the company claims to have deleted the data sets in 2015 in response to Facebook’s demands, those models live on, and can still be used to target highly specific groups of voters with messages designed to leverage their psychological traits. Although the stolen data sets represent a massive collection of individual privacy harms, the models are a collective harm, and far more pernicious.”


In this study, Data & Society Founder and President danah boyd, Affiliate Alice Marwick, and Researcher Mikaela Pitcan ask: how do upwardly mobile young people of low socio-economic status in NYC manage their impressions online using tactics of respectability politics?

“This paper analyzes how upwardly mobile young people of low socio-economic status in New York City manage impressions online by adhering to normative notions of respectability. Our participants described how they present themselves on social media by self-censoring, curating a neutral image, segmenting content by platform, and avoiding content and contacts coded as lower class. Peers who post sexual images, primarily women, were considered unrespectable and subject to sexual shaming. These strategies reinforce racist and sexist notions of appropriate behavior, simultaneously enabling and limiting participants’ ability to succeed. We extend the impression management literature to examine how digital media mediates the intersection of class, gender, and race.”


report | 04.03.18

Refugee Connectivity

Mark Latonero, Danielle Poole, and Jos Berens

Data & Society and the Harvard Humanitarian Initiative’s “Refugee Connectivity: A Survey of Mobile Phones, Mental Health, and Privacy at a Syrian Refugee Camp in Greece” provides new evidence of the critical role internet connectivity and mobile devices play in the lives and wellbeing of this population. Findings are based on a survey of 135 adults among the 750 residents at Ritsona Refugee Camp in Greece.


Data & Society Postdoctoral Scholar Julia Ticona and Data & Society Research Analyst Alexandra Mateescu co-authored an op-ed for Fast Company about the safety of workers who rely on digital platforms to stay employed.

“For the past year, we’ve been interviewing nannies, babysitters, elder care workers, and housecleaners across the U.S. who use platforms like Handy, TaskRabbit, and the in-home care provider platform Care.com to do care and cleaning work, in an effort to better understand how platforms are shaping domestic work. Along the way, we have found that, in many cases, the aggregation of individual data leads not to more accountability and justice, but rather forces workers to make trade-offs between visibility and vulnerability.”


Data & Human Rights Research Lead Mark Latonero investigates the impact of digitally networked technologies on the safe passage of refugees and migrants.

“…in making their way to safe spaces, refugees rely not only on a physical but increasingly also digital infrastructure of movement. Social media, mobile devices, and similar digitally networked technologies comprise this infrastructure of ‘digital passages’—sociotechnical spaces of flows in which refugees, smugglers, governments, and corporations interact with each other and with new technologies.”


Data & Society Researcher Jacob Metcalf co-authored an op-ed in Slate discussing how giving researchers more access to Facebook users’ data could prevent unethical data mining.

“This case raises numerous complicated ethical and political issues, but as data ethicists, one issue stands out to us: Both Facebook and its users are exposed to the downstream consequences of unethical research practices precisely because like other major platforms, the social network does not proactively facilitate ethical research practices in exchange for access to data that users have consented to share.”


book | 03.01.18

Trump and the Media

Edited by Pablo J. Boczkowski and Zizi Papacharissi

D&S Founder danah boyd and Researcher Robyn Caplan contributed to the book “Trump and the Media,” which examines the role the media played in the election of Donald Trump.

Other contributors include: Mike Ananny, Chris W. Anderson, Rodney Benson, Pablo J. Boczkowski, Michael X. Delli Carpini, Josh Cowls, Susan J. Douglas, Keith N. Hampton, Dave Karpf, Daniel Kreiss, Seth C. Lewis, Zoey Lichtenheld, Andrew L. Mendelson, Gina Neff, Zizi Papacharissi, Katy E. Pearce, Victor Pickard, Sue Robinson, Adrienne Russell, Ralph Schroeder, Michael Schudson, Julia Sonnevend, Keren Tenenboim-Weinblatt, Tina Tucker, Fred Turner, Nikki Usher, Karin Wahl-Jorgensen, Silvio Waisbord, Barbie Zelizer.


Read and/or watch Data & Society Founder and President danah boyd’s keynote talk at SXSW EDU 2018.

“I get that many progressive communities are panicked about conservative media, but we live in a polarized society and I worry about how people judge those they don’t understand or respect. It also seems to me that the narrow version of media literacy that I hear as the “solution” is supposed to magically solve our political divide. It won’t. More importantly, as I’m watching social media and news media get weaponized, I’m deeply concerned that the well-intended interventions I hear people propose will backfire, because I’m fairly certain that the crass versions of critical thinking already have.”

