featured


report | 02.26.18

Fairness in Precision Medicine

Kadija Ferryman and Mikaela Pitcan

Fairness in Precision Medicine is the first report to deeply examine the potential for biased and discriminatory outcomes in the emerging field of precision medicine: “the effort to collect, integrate and analyze multiple sources of data in order to develop individualized insights about health and disease.”


visualization | 02.26.18

Precision Medicine National Actor Map

Kadija Ferryman, Mikaela Pitcan

The Precision Medicine National Actor Map is the first visualization of the three major national precision medicine projects (All of Us Research Program, My Research Legacy, and Project Baseline) and the network of institutions connected to them as grantees and sub-grantees.

The map was developed for the Fairness in Precision Medicine initiative at Data & Society.


report | 02.26.18

What is Precision Medicine?

Kadija Ferryman and Mikaela Pitcan

This report, released alongside Fairness in Precision Medicine, explains what precision medicine is and how the emerging field is taking shape.


In this Points essay, Data & Society Postdoctoral Scholar Francesca Tripodi explains how communities might define media literacy differently from one another.

“In that moment, I realized that this community of Evangelical Christians were engaged in media literacy, but used a set of reading practices secular thinkers might be unfamiliar with. I’ve seen hundreds of Conservative Evangelicals apply the same critique they use for the Bible, arguably a postmodern method of unpacking a text, to mainstream media — favoring their own research on topics rather than trusting media authorities.”


Education Week interviewed Data & Society Media Manipulation Lead Joan Donovan about misinformation spread after the Parkland shooting.

“The problem with amplified speech online is that something like this crisis-actor narrative gets a lot of reach and attention, then the story becomes about that, and not the shooting or what these students are doing. I would suggest that media only mentions these narratives to say that this is wrong and that students need to be believed.”


report | 02.21.18

The Promises, Challenges, and Futures of Media Literacy

Monica Bulger and Patrick Davison

This report responds to the “fake news” problem by evaluating the successes and failures of recent media literacy efforts while pointing towards next steps for educators, legislators, technologists, and philanthropists.


report | 02.21.18

Dead Reckoning

Robyn Caplan, Lauren Hanson, and Joan Donovan

This new Data & Society report clarifies current uses of “fake news” and analyzes four specific strategies for intervention.


Jacobin | 02.20.18

The New Taylorism

Richard Salame

Data & Society Operations Assistant Richard Salame applies the concept of Taylorism to Amazon’s new wristbands, which track workers’ movements.

“Amazon’s peculiar culture notwithstanding, the wristbands in many ways don’t offer anything new, technologically or conceptually. What has changed is workers’ ability to challenge this kind of surveillance.”


Data & Society Postdoctoral Scholar Andrew Selbst argues for the regulation of big data policing.

“The way police are adopting and using these technologies means more people of color are arrested, jailed, or physically harmed by police, while the needs of communities being policed are ignored.”


How do algorithms & data-driven tech induce similarity across an industry? Data & Society Researcher Robyn Caplan and Founder & President danah boyd trace Facebook’s impact on news media organizations and journalists.

“This type of analysis sheds light on how organizational contexts are embedded into algorithms, which can then become embedded within other organizational and individual practices. By investigating technical practices as organizational and bureaucratic, discussions about accountability and decision-making can be reframed.”


points | 02.14.18

Health Data Rush

Kadija Ferryman

As data becomes more prevalent in the health world, Data & Society Postdoctoral Scholar Kadija Ferryman urges us to consider how we will regulate its collection and usage.

“As precision medicine rushes on in the US, how can we understand where there might be tensions between fast-paced technological advancement and regulation and oversight? What regulatory problems might emerge? Are our policies and institutions ready to meet these challenges?”


What is the impact of knowing your genetic risk information? Data & Society Researcher Mikaela Pitcan explores the effects for Points.

“As genetic risk and other health data become more widely available, insights from research and early clinical adoption will expand the growing and data-centric field of precision medicine. However, just like previous forms of medical intervention, precision medicine aims to enhance life, decrease risk of disease, improve treatment, and though data plays a big role, the success of the field depends heavily upon clinician and patient interactions.”


The New Inquiry | 02.09.18

Artificial Advancements

Taeyoon Choi

In this essay, D&S Fellow Taeyoon Choi interrogates technology designed for those with disabilities.

“Even with the most advanced technology, disability can not and—sometimes should not—disappear from people. There are disabled people whose relationship with their own bodily functions and psychological capabilities cannot be considered in a linear movement from causation to result, where narratives of technology as cure override the real varieties in people’s needs and conditions and falsely construct binary states—one or the other, abled or disabled—shadowing everything between or outside of those options.”


Increment | 02.02.18

A rare and toxic age

Ingrid Burrington

In this essay, Data & Society INFRA Lead Ingrid Burrington grounds technological development in the environment.

“While the aforementioned narratives are strategic in their own worlds, they tend to maintain the premise that the environmental cost of technology is still orthogonal or an externality to the more diffuse, less obviously material societal implications of living in an Information Age. The politics of a modern world increasingly defined by data mining may only exist because of literal open-pit mining, but the open pit is more often treated as a plot pivot than a natural through-line: Sure, you feel bad about a social media site being creepy, but behold, the hidden environmental devastation wrought by your iPhone—doesn’t that make you feel even worse?”


Data & Society Media Manipulation Lead Joan Donovan investigates the development of InterOccupy, a virtual organization operated by participants in the Occupy Movement.

“InterOccupy took infrastructure building as a political strategy to ensure the movement endured beyond the police raids on the encampments. I conclude that NSMs create virtual organizations when there are routine and insurmountable failures in the communication milieu, where the future of the movement is at stake. My research follows the Occupy Movement ethnographically to understand what happens after the keyword.”


video | 01.24.18

Are Internet Trolls Born or Made?

Amanda Lenhart, Kathryn Zickuhr

Where do internet trolls come from? Above the Noise explores the question in this video and encourages viewers to read Data & Society’s report “Online Harassment, Digital Abuse, and Cyberstalking in America.”


points | 01.17.18

Don’t Call AI “Magic”

Madeleine Clare Elish

Artificial intelligence is increasingly being used across multiple sectors, and people often refer to its function as “magic.” In this blogpost, D&S researcher Madeleine Clare Elish points out that there’s nothing magical about AI and reminds us that the human labor involved in making AI systems work is often rendered invisible.

“From one perspective, this makes sense: Working like magic implies impressive and seamless functionality and the means by which the effect was achieved is hidden from view or even irrelevant. Yet, from another perspective, implying something works like magic focuses attention on the end result, denying an accounting of the means by which that end result was reached.”


D&S founder and president danah boyd praises Virginia Eubanks’s new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.

“This book should be mandatory for anyone who works in social services, government, or the technology sector because it forces you to really think about what algorithmic decision-making tools are doing to our public sector, and the costs that this has on the people that are supposedly being served. It’s also essential reading for taxpayers and voters who need to understand why technology is not the panacea that it’s often purported to be. Or rather, how capitalizing on the benefits of technology will require serious investment and a deep commitment to improving the quality of social services, rather than a tax cut.”


In the gig economy, management by algorithms means employment relationships grow more remote and distributed across the network. Alex Rosenblat explains how workers navigate this by creating their own forums.

“Online forums aren’t just helping drivers like Cole navigate the challenges of their work, and helping those of us who use and study these platforms grasp those challenges too. They show how as employment relationships grow more remote and distributed across the network, workers can adapt, using technology to forge their own workplace culture.”


At the Coalition for Networked Information (CNI) Fall 2017 Membership Meeting, Bonnie Tijerina spoke about the implications of using data science in libraries.

“Data science approaches may shed light on otherwise hard-to-see problems in the library.”  — D&S researcher Bonnie Tijerina



The informational, economic, and political influence of the dominant tech platforms — Google, Facebook, and Amazon in particular — has become a central topic of debate. In this talk, K. Sabeel Rahman argues that these firms are best understood as the core infrastructure of our 21st-century economy and public sphere. The infrastructural power of these firms raises a range of policy questions. What exactly about these firms (e.g., their accumulation of data, their gatekeeping functions, their control over vital public and economic functions like retail delivery or online speech) is “infrastructural”? How should these infrastructural functions be governed and regulated, in light of both their economic and political influence?

Professor Rahman sketches some tentative answers to these questions, drawing on the intellectual history of early 20th century “public utility regulation,” where reformers developed a compelling approach to diagnosing and remedying the problem of private power over the essential infrastructure of the industrial economy, from railroads to finance.

This history suggests some design principles and opens up some novel implications for addressing the problem of platform power in the digital economy. The talk explores more contemporary analogies and applications in the context of our current debates over informational platforms, big data, AI, and algorithms, in order to sketch out some principles for what a public utility-style regulatory approach to Internet platforms would look like.

K. Sabeel Rahman is a Visiting Professor of Law at Harvard Law School, an Assistant Professor of Law at Brooklyn Law School, and a Fellow at the Roosevelt Institute. Rahman earned his AB at Harvard College summa cum laude in Social Studies and returned to Harvard for his JD at Harvard Law School and his PhD in the Harvard Government Department. He also has degrees in Economics and Sociolegal Studies from Oxford, where he was a Rhodes Scholar.


Miranda Katz of WIRED interviews D&S founder and president danah boyd on the evolving public discourse around disinformation and how the tech industry can help rebuild American society.

“It’s actually really clear: How do you reknit society? Society is produced by the social connections that are knit together. The stronger those networks, the stronger the society. We have to make a concerted effort to create social ties, social relationships, social networks in the classic sense that allow for strategic bridges across the polis so that people can see themselves as one.”


Data & Society Researcher Jacob Metcalf argues for an ethical approach to data science and offers strategies for future research.

“On the one hand, it is banally predictable that the consequences of machine-learning-enabled surveillance will fall disproportionately on demographic minorities. On the other hand, queer folks hardly need data scientists scrutinizing their jawlines and hairstyles to warn them about this. They have always known this.”


On Wednesday, Nov. 29th, the Supreme Court heard Carpenter v. United States, a Fourth Amendment case about access to cell phone location data. Postdoctoral Scholars Julia Ticona and Andrew Selbst urged the court to recognize that cell phone use isn’t truly voluntary in this day and age.

“The justices will surely understand that without any alternatives for accessing online services, vulnerable (and over-policed) populations will be unable to make meaningful choices to protect their privacy, amplifying the disadvantages they already face.”


The Daily 360 interviews D&S INFRA Lead Ingrid Burrington about how we can see the internet in everyday life.

“First place to look if you’re looking for the internet on a city street is down because a lot of it runs through fiberoptic cables that are buried under the street.”


Is Facebook a platform or a media company? NBC News THINK asks D&S researcher Robyn Caplan to comment on the recent tech hearings.

“Facebook thinks of itself as a neutral platform where everyone can come and share ideas…They’re basically saying that they’re the neutral public sphere. That they are the marketplace of ideas, instead of being the marketers of ideas.”


This is a transcript of Data & Society founder and president danah boyd’s recent lightning talk at The People’s Disruption: Platform Co-Ops for Global Challenges.

“But as many of you know, power corrupts. And the same geek masculinities that were once rejuvenating have spiraled out of control. Today, we’re watching as diversity becomes a wedge issue that can be used to radicalize disaffected young men in tech. The gendered nature of tech is getting ugly.”


D&S researcher Claire Fontaine explores issues of accessibility in NYC schools.

“So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1700 or so schools…Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.

But for some families, the choice process begins and ends with the question: Is the building fully accessible?”


D&S researcher Monica Bulger co-authored an article on how children are engaging with technology across countries.

“Beyond revealing pressing and sizeable gaps in knowledge, this cross-national review also reveals the importance of understanding local values and practices regarding the use of technologies. This leads us to stress that future researchers must take into account local contexts and existing inequalities and must share best practices internationally so that children can navigate the balance between risks and opportunities.”



D&S founder and President danah boyd & affiliate Solon Barocas investigate the practice of ethics in data science.

“Critical commentary on data science has converged on a worrisome idea: that data scientists do not recognize their power and, thus, wield it carelessly. These criticisms channel legitimate concerns about data science into doubts about the ethical awareness of its practitioners. For these critics, carelessness and indifference explains much of the problem—to which only they can offer a solution.”


Columbia Law Review | 11.01.17

The Taking Economy: Uber, Information, And Power

Alex Rosenblat, Ryan Calo

D&S researcher Alex Rosenblat co-authored an article on power dynamics in the sharing economy.

“Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value but raises concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power.”


D&S INFRA Lead Ingrid Burrington investigates community network infrastructures in times of disaster.

“By design, resilient network infrastructure prioritizes interdependence and cooperation over self-sufficiency — without strong underlying social ties, there is no localized network infrastructure. The technical complexities of building a network are a lot easier to overcome than the political complexities of building community, political agency, and governance.”


D&S researcher Mary Madden shared her findings from her recent report “Privacy, Security and Digital Inequality.”

“Not only do Americans with lower levels of income and education have fewer technology resources, but they also express heightened sensitivities about a range of data privacy concerns.”


D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.

“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”


paper | 10.01.17

Calculated Risks

Ingrid Burrington

D&S INFRA Lead Ingrid Burrington reflects on the concept of the future in anticipation of the upcoming show Futureproof that she curated at Haverford College.

“Trying to build alternative futures is often a process of facing that haunting spectre: finding life or potential by invoking and living with the ghosts and weird spirits of a world that could have been. Often, the interface for visiting these particular ghosts isn’t the medium or Ouija board but the archive, which is partly why so many of the works in Futureproof take on an archivist, museological tone. The alternative archive is historical evidence of a shift in the timeline, its own kind of proof that another timeline is not only possible, but has already happened, is already happening and emergent before us.”


On September 27th, D&S Fellow Taeyoon Choi released the first two chapters of his online book “Poetic Computation: Reader,” which looks at code as a form of poetry as well as the ethics behind it. Because it is an online book, readers can customize the design elements of the text to their preferences as they read.

Choi is co-founder of The School for Poetic Computation, based in New York City, and the book is based on two of his lectures from its curriculum. The following chapters will be published later this year.



In late August, D&S Researcher Anne Washington talked about the real-world implications of AI with the Centre for Public Impact’s Joel Tito.

Who wants to talk about the end of the world? CPI’s Joel Tito and Data & Society’s Anne Washington certainly do – and it’s this discussion point which kicks off our latest podcast.

Washington, a digital government scholar whose work addresses emerging policy needs for data science, tells Joel about what she is most afraid of when it comes to artificial intelligence (AI) and its application to government. She also explains that while AI can make processes more efficient and better streamlined, it shouldn’t be used for “really complicated human decisions”. Find out why here, as well as if she thinks we should seek inspiration from the Romans or ancient Greeks when it comes to AI and government…


In this blogpost, Zachary Gold examines the implications of the Children’s Internet Protection Act (CIPA), enacted in 2000.

“Many parents certainly worry about their children getting access to inappropriate material online, and CIPA may have been a reasonable way to address that concern when it was passed. The devices we use, and the way we use the internet, have changed drastically since then. Updating CIPA, or replacing it to govern these new devices and connections being used by students could do more harm than good. Keeping pornography out of student’s schoolrooms is important, but filtering and monitoring student’s internet activity around town and at home blurs the role of school administrators.”


“Privacy, Security, and Digital Inequality” by Mary Madden is the first in-depth analysis of the privacy and security experiences of low-socioeconomic-status populations in the United States.

Supported by the Digital Trust Foundation, the report finds that most of those living in U.S. households with annual incomes below $20,000 are acutely aware of a range of digital privacy harms, but many say it would be difficult to access the tools and strategies that could help them protect their personal information online. The report provides additional insights about mobile device use and demand for digital privacy and security training.

In light of the September 18th announcement by the U.S. Department of Homeland Security[1] about federal agencies’ intent to collect social media information and search history from a variety of immigrant groups, “Privacy, Security, and Digital Inequality” is especially relevant: the report finds that foreign-born Hispanic adults stand out both for their privacy sensitivities and for their desire to learn more about safeguarding their personal information.

“Privacy, Security, and Digital Inequality” includes detailed comparisons across different racial, ethnic, and nativity groups, finding that there are substantial gaps across these groups when looking at reliance on mobile connectivity.[2]

“This study highlights the disconnect between the one-size-fits-all conversations about privacy-related risk that happen in Washington and the concerns that are most salient to the communities who have long experienced a disproportionate level of surveillance and injustice in their daily lives,” said Madden, Researcher at Data & Society and lead author of the report. “When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and need.”

In light of new pressures surrounding immigration policy and status in the United States, the report is a highly relevant snapshot of the demand for privacy- and security-related training among some of the most vulnerable of these low-socioeconomic-status groups. The report also finds a disproportionate reliance on mobile devices, offering a potential starting point for those looking to provide educational resources.

“This report illustrates the many ways in which smartphones have become an indispensable source of internet access for those who may lack other technology resources in their homes and communities,” said Michele Gilman, Venable Professor of Law at the University of Baltimore and Director of the Saul Ewing Civil Advocacy Clinic. “Far from being a luxury, smartphones—with their many benefits and vulnerabilities—offer a critical source of connection to jobs, family, education and government services.”

Gilman, a poverty law expert, also served on the Research Advisory Board for the two-year research project, and co-authored a related law review article with Madden titled, “Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans.”

“Privacy, Security, and Digital Inequality” is based on newly released data from a nationally representative telephone survey of 3,000 American adults. The survey, which included interviews in both English and Spanish, was made possible by a grant from the Digital Trust Foundation and fielded in November and December of 2015.



[1] Full text here.
[2] The analysis of racial and ethnic minority groups in this report is limited by the survey sample size, and does not include detailed comparisons of Asians, Native Americans, and other subgroups. For instance, in this survey, out of 3,000 respondents, just 3% identified as Asian or Asian American.


Additional Resources

For more information about groups working on these issues and in these spaces, we invite you to take a look at resources provided by the following organizations. We welcome additional suggestions:

Center for Media Justice – Resource Library
Equality Labs
Freedom of the Press Foundation (link goes to resources)
American Civil Liberties Union – Privacy and Technology, Free Speech
Berkman Klein Center
Color of Change
EPIC – Electronic Privacy Information Center
Future of Privacy Forum
Georgetown Center on Privacy & Technology (link goes to resources)
National Hispanic Media Coalition
Our Data Bodies (link goes to resources)
Pew Research Center
Public Knowledge
Rad.Cat (link goes to resources)
Southern Poverty Law Center

