What is the impact of knowing your genetic risk information? Data & Society Researcher Mikaela Pitcan explores the effects for Points.
“As genetic risk and other health data become more widely available, insights from research and early clinical adoption will expand the growing and data-centric field of precision medicine. However, just like previous forms of medical intervention, precision medicine aims to enhance life, decrease risk of disease, improve treatment, and though data plays a big role, the success of the field depends heavily upon clinician and patient interactions.”
In this essay, D&S Fellow Taeyoon Choi interrogates technology designed for those with disabilities.
“Even with the most advanced technology, disability can not and—sometimes should not—disappear from people. There are disabled people whose relationship with their own bodily functions and psychological capabilities cannot be considered in a linear movement from causation to result, where narratives of technology as cure override the real varieties in people’s needs and conditions and falsely construct binary states—one or the other, abled or disabled—shadowing everything between or outside of those options.”
In this essay, Data & Society INFRA Lead Ingrid Burrington grounds technological development in the environment.
“While the aforementioned narratives are strategic in their own worlds, they tend to maintain the premise that the environmental cost of technology is still orthogonal or an externality to the more diffuse, less obviously material societal implications of living in an Information Age. The politics of a modern world increasingly defined by data mining may only exist because of literal open-pit mining, but the open pit is more often treated as a plot pivot than a natural through-line: Sure, you feel bad about a social media site being creepy, but behold, the hidden environmental devastation wrought by your iPhone—doesn’t that make you feel even worse?”
Social Media + Society | 02.01.18
Data & Society Media Manipulation Lead Joan Donovan investigates the development of InterOccupy, a virtual organization operated by participants in the Occupy Movement.
“InterOccupy took infrastructure building as a political strategy to ensure the movement endured beyond the police raids on the encampments. I conclude that NSMs create virtual organizations when there are routine and insurmountable failures in the communication milieu, where the future of the movement is at stake. My research follows the Occupy Movement ethnographically to understand what happens after the keyword.”
What are internet trolls? Above the Noise explains where internet trolls come from in this video and encourages viewers to read Data & Society’s report “Online Harassment, Digital Abuse, and Cyberstalking in America.”
Artificial intelligence is increasingly being used across multiple sectors, and people often refer to its function as “magic.” In this blog post, D&S researcher Madeleine Clare Elish points out that there’s nothing magical about AI and reminds us that the human labor involved in making AI systems work is often rendered invisible.
“From one perspective, this makes sense: Working like magic implies impressive and seamless functionality and the means by which the effect was achieved is hidden from view or even irrelevant. Yet, from another perspective, implying something works like magic focuses attention on the end result, denying an accounting of the means by which that end result was reached.”
D&S founder and president danah boyd sings the praises of Virginia Eubanks’s new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.
“This book should be mandatory for anyone who works in social services, government, or the technology sector because it forces you to really think about what algorithmic decision-making tools are doing to our public sector, and the costs that this has on the people that are supposedly being served. It’s also essential reading for taxpayers and voters who need to understand why technology is not the panacea that it’s often purported to be. Or rather, how capitalizing on the benefits of technology will require serious investment and a deep commitment to improving the quality of social services, rather than a tax cut.”
In the gig economy, management by algorithm means employment relationships grow more remote and distributed across the network. Alex Rosenblat explains how workers navigate this by creating their own forums.
“Online forums aren’t just helping drivers like Cole navigate the challenges of their work, and helping those of us who use and study these platforms grasp those challenges too. They show how as employment relationships grow more remote and distributed across the network, workers can adapt, using technology to forge their own workplace culture.”
video | 01.08.18
At this year’s Coalition for Networked Information (CNI) Fall 2017 Membership Meeting, Bonnie Tijerina spoke about the implications of using data science in libraries.
“Data science approaches may shed light on otherwise hard-to-see problems in the library.” — D&S researcher Bonnie Tijerina
video | 12.06.17
The informational, economic, and political influence of the dominant tech platforms — Google, Facebook, and Amazon in particular — has become a central topic of debate. In this talk, K. Sabeel Rahman argues that these firms are best understood as the core infrastructure of our 21st century economy and public sphere. The infrastructural power of these firms raises a range of policy questions. What exactly about these firms (e.g., their accumulation of data, their gatekeeping functions, their control over vital public and economic functions like retail delivery or online speech) is “infrastructural”? How should these infrastructural functions be governed and regulated, in light of both their economic and political influence?
Professor Rahman sketches some tentative answers to these questions, drawing on the intellectual history of early 20th century “public utility regulation,” where reformers developed a compelling approach to diagnosing and remedying the problem of private power over the essential infrastructure of the industrial economy, from railroads to finance.
This history suggests some design principles and opens up some novel implications for addressing the problem of platform power in the digital economy. The talk explores more contemporary analogies and applications in the context of our current debates over informational platforms, big data, AI, and algorithms, in order to sketch out some principles for what a public utility-style regulatory approach to Internet platforms would look like.
K. Sabeel Rahman is a Visiting Professor of Law at Harvard Law School, an Assistant Professor of Law at Brooklyn Law School, and a Fellow at the Roosevelt Institute. Rahman earned his AB at Harvard College summa cum laude in Social Studies and returned to Harvard for his JD at Harvard Law School and his PhD in the Harvard Government Department. He also has degrees in Economics and Sociolegal Studies from Oxford, where he was a Rhodes Scholar.
Wired | 12.06.17
Miranda Katz of WIRED interviews D&S founder and president danah boyd on the evolving public discourse around disinformation and how the tech industry can help rebuild American society.
“It’s actually really clear: How do you reknit society? Society is produced by the social connections that are knit together. The stronger those networks, the stronger the society. We have to make a concerted effort to create social ties, social relationships, social networks in the classic sense that allow for strategic bridges across the polis so that people can see themselves as one.”
Medium | 11.30.17
Data & Society Researcher Jacob Metcalf argues for an ethical approach to data science and offers strategies for future research.
“On the one hand, it is banally predictable that the consequences of machine-learning-enabled surveillance will fall disproportionately on demographic minorities. On the other hand, queer folks hardly need data scientists scrutinizing their jawlines and hairstyles to warn them about this. They have always known this.”
Wired | 11.29.17
On Wednesday, November 29th, the Supreme Court heard Carpenter v. United States, a Fourth Amendment case on access to cell phone location data. Postdoctoral scholars Julia Ticona and Andrew Selbst urged the court to understand that cell phones aren’t voluntary in this day and age.
“The justices will surely understand that without any alternatives for accessing online services, vulnerable (and over-policed) populations will be unable to make meaningful choices to protect their privacy, amplifying the disadvantages they already face.”
The Daily 360 interviews D&S INFRA Lead Ingrid Burrington about how we can see the internet in everyday life.
“First place to look if you’re looking for the internet on a city street is down because a lot of it runs through fiberoptic cables that are buried under the street.”
Is Facebook a platform or a media company? NBC News THINK asks D&S researcher Robyn Caplan to comment on the recent tech hearings.
“Facebook thinks of itself as a neutral platform where everyone can come and share ideas…They’re basically saying that they’re the neutral public sphere. That they are the marketplace of ideas, instead of being the marketers of ideas.”
This is a transcript of Data & Society founder and president danah boyd’s recent lightning talk at The People’s Disruption: Platform Co-Ops for Global Challenges.
“But as many of you know, power corrupts. And the same geek masculinities that were once rejuvenating have spiraled out of control. Today, we’re watching as diversity becomes a wedge issue that can be used to radicalize disaffected young men in tech. The gendered nature of tech is getting ugly.”
Chalkbeat | 11.09.17
D&S researcher Claire Fontaine explores issues around accessibility at NYC schools.
“So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1700 or so schools…Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.
But for some families, the choice process begins and ends with the question: Is the building fully accessible?”
American Academy of Pediatrics Journal | 11.03.17
D&S researcher Monica Bulger co-authored an article on how children are engaging with technology across countries.
“Beyond revealing pressing and sizeable gaps in knowledge, this cross-national review also reveals the importance of understanding local values and practices regarding the use of technologies. This leads us to stress that future researchers must take into account local contexts and existing inequalities and must share best practices internationally so that children can navigate the balance between risks and opportunities.”
D&S founder and President danah boyd & affiliate Solon Barocas investigate the practice of ethics in data science.
“Critical commentary on data science has converged on a worrisome idea: that data scientists do not recognize their power and, thus, wield it carelessly. These criticisms channel legitimate concerns about data science into doubts about the ethical awareness of its practitioners. For these critics, carelessness and indifference explains much of the problem—to which only they can offer a solution.”
Columbia Law Review | 11.01.17
D&S researcher Alex Rosenblat co-authored an article on power dynamics in the sharing economy.
“Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value but raises concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power.”
New York Magazine | 10.31.17
D&S INFRA Lead Ingrid Burrington investigates community network infrastructures in times of disaster.
“By design, resilient network infrastructure prioritizes interdependence and cooperation over self-sufficiency — without strong underlying social ties, there is no localized network infrastructure. The technical complexities of building a network are a lot easier to overcome than the political complexities of building community, political agency, and governance.”
D&S researcher Mary Madden shared her findings from her recent report “Privacy, Security and Digital Inequality.”
“Not only do Americans with lower levels of income and education have fewer technology resources, but they also express heightened sensitivities about a range of data privacy concerns.”
D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.
“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”
D&S INFRA Lead Ingrid Burrington reflects on the concept of the future in anticipation of the upcoming show Futureproof that she curated at Haverford College.
“Trying to build alternative futures is often a process of facing that haunting spectre: finding life or potential by invoking and living with the ghosts and weird spirits of a world that could have been. Often, the interface for visiting these particular ghosts isn’t the medium or Ouija board but the archive, which is partly why so many of the works in Futureproof take on an archivist, museological tone. The alternative archive is historical evidence of a shift in the timeline, its own kind of proof that another timeline is not only possible, but has already happened, is already happening and emergent before us.”
On September 27th, D&S Fellow Taeyoon Choi released the first two chapters of his online book “Poetic Computation: Reader,” which looks at code as a form of poetry as well as the ethics behind it. As an online book, it offers readers the unique experience of customizing the design elements of the text to their preferences as they read.
Choi is co-founder of the School for Poetic Computation, based in New York City, and the book is based on two of his lectures from the curriculum. Subsequent chapters will be published later this year.
Centre for Public Impact | 08.22.17
In late August, D&S Researcher Anne Washington talked real-world implications of AI with the Centre for Public Impact’s Joel Tito.
Washington, a digital government scholar whose work addresses emerging policy needs for data science, tells Joel what she is most afraid of when it comes to artificial intelligence (AI) and its application to government. She also explains that while AI can make processes more efficient and streamlined, it shouldn’t be used for “really complicated human decisions”. Find out why, as well as whether she thinks we should seek inspiration from the Romans or ancient Greeks when it comes to AI and government.
In this blog post, Zachary Gold dives into the implications of the Children’s Internet Protection Act (CIPA), enacted in 2000.
“Many parents certainly worry about their children getting access to inappropriate material online, and CIPA may have been a reasonable way to address that concern when it was passed. The devices we use, and the way we use the internet, have changed drastically since then. Updating CIPA, or replacing it to govern these new devices and connections being used by students, could do more harm than good. Keeping pornography out of students’ schoolrooms is important, but filtering and monitoring students’ internet activity around town and at home blurs the role of school administrators.”
“Privacy, Security, and Digital Inequality” by Mary Madden is the first in-depth analysis of the privacy and security experiences of low-socioeconomic-status populations in the United States.
Supported by the Digital Trust Foundation, the report finds that most of those living in U.S. households with annual incomes of less than $20,000 per year are acutely aware of a range of digital privacy harms, but many say it would be difficult to access the tools and strategies that could help them protect their personal information online. The report provides additional insights about mobile device use and demand for digital privacy and security training.
In light of the September 18th announcement by the U.S. Department of Homeland Security about federal agencies’ intent to collect social media information and search history from a variety of immigrant groups, “Privacy, Security, and Digital Inequality” is especially relevant: In particular, the report finds that foreign-born Hispanic adults stand out both for their privacy sensitivities and for their desire to learn more about safeguarding their personal information.
“Privacy, Security, and Digital Inequality” includes detailed comparisons across different racial, ethnic, and nativity groups, finding that there are substantial gaps across these groups when looking at reliance on mobile connectivity.
“This study highlights the disconnect between the one-size-fits-all conversations about privacy-related risk that happen in Washington and the concerns that are most salient to the communities who have long experienced a disproportionate level of surveillance and injustice in their daily lives,” said Madden, Researcher at Data & Society and lead author of the report. “When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and need.”
In light of new pressures surrounding immigration policy and status in the United States, the report is a highly relevant snapshot of the demand for privacy- and security-related training among some of the most vulnerable of these low-socioeconomic-status groups. The report also finds a disproportionate reliance on mobile devices, offering a potential starting point for those looking to provide educational resources.
“This report illustrates the many ways in which smartphones have become an indispensable source of internet access for those who may lack other technology resources in their homes and communities,” said Michele Gilman, Venable Professor of Law at the University of Baltimore and Director of the Saul Ewing Civil Advocacy Clinic. “Far from being a luxury, smartphones—with their many benefits and vulnerabilities—offer a critical source of connection to jobs, family, education and government services.”
Gilman, a poverty law expert, also served on the Research Advisory Board for the two-year research project, and co-authored a related law review article with Madden titled, “Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans.”
“Privacy, Security, and Digital Inequality” is based on newly released data from a nationally representative telephone survey of 3,000 American adults. The survey, which included interviews in both English and Spanish, was made possible by a grant from the Digital Trust Foundation and was fielded in November and December of 2015.
The analysis of racial and ethnic minority groups in this report is limited by the survey sample size and does not include detailed comparisons of Asians, Native Americans, and other subgroups. For instance, in this survey, out of 3,000 respondents, just 3% identified as Asian or Asian American.
For more information about groups working on these issues and in these spaces, we invite you to take a look at resources provided by the following organizations. We welcome additional suggestions:
Center for Media Justice – Resource Library
Freedom of the Press Foundation – Resources
American Civil Liberties Union – Privacy and Technology, Free Speech
Berkman Klein Center
Color of Change
EPIC – Electronic Privacy Information Center
Future of Privacy Forum
Georgetown Center on Privacy & Technology – Resources
National Hispanic Media Coalition
Our Data Bodies – Resources
Pew Research Center
Rad.Cat – Resources
Southern Poverty Law Center
Mic | 08.16.17
D&S Media Manipulation Research Lead Joan Donovan talks about the role of large tech companies in curbing extremist activity online.
Joan Donovan, a media manipulation research lead at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.
‘Richard Spencer might have a megaphone and his own website to communicate his messages of hate,’ Donovan said in a phone interview Wednesday. ‘Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.’
Movements like the so-called ‘alt-right’ aren’t just built on charisma, Donovan added — they’re built on infrastructure. The internet and all of its possibilities has now become a major part of that infrastructure.
Wired | 08.15.17
D&S Researcher Alex Rosenblat was interviewed about Uber for Klint Finley’s article in Wired.
Tuesday’s agreement may not be the end of Uber’s problems with the FTC either. Hartzog says a recent paper by University of Washington law professor Ryan Calo and multidisciplinary researcher Alex Rosenblat of the research institute Data & Society points to other potential privacy concerns, such as monitoring how much battery power remains on a user’s device, because users with little juice might be willing to pay more for a ride.
‘When a company can design an environment from scratch, track consumer behavior in that environment, and change the conditions throughout that environment based on what the firm observes, the possibilities to manipulate are legion,’ Calo and Rosenblat write. ‘Companies can reach consumers at their most vulnerable, nudge them into overconsumption, and charge each consumer the most he or she may be willing to pay.’
Quartz | 08.14.17
Quartz cites D&S Postdoctoral Scholar Caroline Jack in its guide to Lexicon of Lies:
Problematic information comes in various forms, each uniquely irksome. Yet people are quick to blast all inaccuracies as “fake news,” reinforcing the sense that facts are a thing of the past.
That’s dangerous and it needn’t be the case, according to the Lexicon of Lies, a recent report from the New York-based Data and Society research institute. “The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it,” writes Caroline Jack, a media historian and postdoctoral fellow at Data and Society. On a cultural level, “these assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible,” she writes.
D&S Researcher Becca Lewis discusses the recruiting methodologies of the alt-right in Teaching Tolerance.
‘Social media can be very powerful in shaping outlooks, but it doesn’t operate in a vacuum,’ explains Data & Society researcher Becca Lewis. ‘The shaping is coming from the other people using the platforms.’
The alt-right has a massive presence on social media and other channels where young people congregate. A Washington Post analysis identified 27,000 influential Twitter accounts associated with the alt-right, 13 percent of which are considered radical. Later, a George Washington University study found that white nationalist accounts in the United States have seen their follower counts grow by 600 percent since 2012.
testimony | 08.11.17
On August 11, 2017, Data & Society and fifteen individual scholars—including danah boyd, Julia Ticona, and Amanda Lenhart—filed an amicus brief in a pending U.S. Supreme Court case, Carpenter v. United States. The parties were represented by Andrew Selbst of Data & Society, and Marcia Hofmann and Kendra Albert of Zeitgeist Law.
The case implicates the Fourth Amendment’s “third party doctrine,” which states that people who “voluntarily convey” information to third parties do not have a reasonable expectation of privacy in it. As a result, when police obtain records from a third party, it does not currently implicate Fourth Amendment rights.
Timothy Carpenter was convicted of a string of armed robberies based in part on cell site location data that placed him in proximity to the robberies. The case concerns the legality under the Fourth Amendment of the warrantless search and seizure of Carpenter’s historical cellphone records, which reveal his location and movements over the course of 127 days.
In the brief, we argue that the “third party doctrine” should not apply to cell site location information because cell phones are not meaningfully voluntary in modern society. Cell site location information contains abundant information about people’s lives, and unfettered police access to it poses a threat to privacy rights.
Aided by scholarship and statistics from the Data & Society research team, we provide evidence that the 95% of Americans who have cell phones cannot reasonably be expected to forgo owning one in order to avoid police searches.
The case is expected to be heard in the fall of 2017.
paper | 08.09.17
Data & Society Researcher Alexandra Mateescu maps out the inequalities and power dynamics within the gig economy.
“As on-demand companies like Handy and online marketplaces like Care.com enter the space of domestic work, a range of questions emerge: what are the risks and challenges of signing up for platform-based work as an immigrant? As a non-native English speaker? How are experiences of work different for individuals with strong professional identities as caregivers or housekeepers, versus more casual workers who may also be finding other kinds of work via Postmates or Uber?”
TechTarget | 07.07.18
D&S Researcher Madeleine Clare Elish discusses the implications of biased AI in different contexts.
She said when AI is applied to areas like targeted marketing or customer service, this kind of bias is essentially an inconvenience. Models won’t deliver good results, but at the end of the day, no one gets hurt.
The second type of bias, though, can be far more consequential. Elish talked about how AI is increasingly seeping into areas like insurance, credit scoring, and criminal justice. Here, biases, whether they result from unrepresentative data samples or from the unconscious partialities of developers, can have much more severe effects.
D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.
“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”
WNYC The Takeaway | 08.17.16
D&S lawyer-in-residence Rebecca Wexler describes the intersection of automated technologies, trade secrets, and the criminal justice system.
For-profit companies dominate the criminal justice technologies industry and produce computer programs that are widespread throughout the justice system. These automated programs deploy cops, analyze forensic evidence, and assess the risk levels of inmates. But these technological advances may be making the system less fair, and without access to the source code, it’s impossible to hold computers to account.
D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.
What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.
D&S lawyer-in-residence Rebecca Wexler describes the flaws of an increasingly automated criminal justice system.
The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.
Ford Foundation blog | 05.30.17
D&S affiliate Wilneida Negrón details the role of bots and automation in activism today.
As everyone from advertisers to political adversaries jockey for attention, they are increasingly using automated technologies and processes to raise their own voices or drown out others. In fact, 62 percent of all Internet traffic is made up of programs acting on their own to analyze information, find vulnerabilities, or spread messages. Up to 48 million of Twitter’s 320 million users are bots, or applications that perform automated tasks. Some bots post beautiful art from museum collections, while some spread abuse and misinformation instead. Automation itself isn’t cutting edge, but the prevalence and sophistication of how automated tools interact with users is.
Sage Journals | 05.30.17
D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.
Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.
Protecting Patron Privacy, edited by Bobbi Newman and Data & Society Researcher Bonnie Tijerina, suggests strategies for data privacy in libraries.
Although privacy is one of the core tenets of librarianship, technology changes have made it increasingly difficult for libraries to ensure the privacy of their patrons in the 21st century library.
This authoritative LITA Guide offers readers guidance on a wide range of topics, including:
• Foundations of privacy in libraries
• Data collection, retention, use, and protection
• Laws and regulations
• Privacy instruction for patrons and staff
• Contracts with third parties
• Use of in-house and internet tools including social network sites, surveillance video, and RFID