D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.
“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”
D&S INFRA Lead Ingrid Burrington reflects on the concept of the future in anticipation of Futureproof, the upcoming show she curated at Haverford College.
“Trying to build alternative futures is often a process of facing that haunting spectre: finding life or potential by invoking and living with the ghosts and weird spirits of a world that could have been. Often, the interface for visiting these particular ghosts isn’t the medium or Ouija board but the archive, which is partly why so many of the works in Futureproof take on an archivist, museological tone. The alternative archive is historical evidence of a shift in the timeline, its own kind of proof that another timeline is not only possible, but has already happened, is already happening and emergent before us.”
On September 27th, D&S Fellow Taeyoon Choi released the first two chapters of his online book “Poetic Computation: Reader,” which looks at code as a form of poetry as well as the ethics behind it. Because it is published online, readers have the unique experience of customizing the design elements of the text to their preferences as they read.
Choi is co-founder of The School for Poetic Computation in New York City, and the book is based on two of his lectures from its curriculum. Subsequent chapters will be published later this year.
Centre for Public Impact | 08.22.17
In late August, D&S Researcher Anne Washington talked about the real-world implications of AI with the Centre for Public Impact’s Joel Tito.
Washington, a digital government scholar whose work addresses emerging policy needs for data science, tells Joel what she is most afraid of when it comes to artificial intelligence (AI) and its application to government. She also explains that while AI can make processes more efficient and streamlined, it shouldn’t be used for “really complicated human decisions”. Find out why here, as well as whether she thinks we should seek inspiration from the Romans or ancient Greeks when it comes to AI and government…
In this blog post, Zachary Gold dives into the implications of the Children’s Internet Protection Act (CIPA), enacted in 2000.
“Many parents certainly worry about their children getting access to inappropriate material online, and CIPA may have been a reasonable way to address that concern when it was passed. The devices we use, and the way we use the internet, have changed drastically since then. Updating CIPA, or replacing it to govern these new devices and connections being used by students, could do more harm than good. Keeping pornography out of students’ schoolrooms is important, but filtering and monitoring students’ internet activity around town and at home blurs the role of school administrators.”
“Privacy, Security, and Digital Inequality” by Mary Madden is the first in-depth analysis of the privacy and security experiences of low-socioeconomic-status populations in the United States.
Supported by the Digital Trust Foundation, the report finds that most of those living in U.S. households with annual incomes of less than $20,000 are acutely aware of a range of digital privacy harms, but many say it would be difficult to access the tools and strategies that could help them protect their personal information online. The report provides additional insights about mobile device use and demand for digital privacy and security training.
In light of the September 18th announcement by the U.S. Department of Homeland Security about federal agencies’ intent to collect social media information and search history from a variety of immigrant groups, “Privacy, Security, and Digital Inequality” is especially relevant: In particular, the report finds that foreign-born Hispanic adults stand out for both their privacy sensitivities, and for their desire to learn more about safeguarding their personal information.
“Privacy, Security, and Digital Inequality” includes detailed comparisons across different racial, ethnic, and nativity groups, finding that there are substantial gaps across these groups when looking at reliance on mobile connectivity.
“This study highlights the disconnect between the one-size-fits-all conversations about privacy-related risk that happen in Washington and the concerns that are most salient to the communities who have long experienced a disproportionate level of surveillance and injustice in their daily lives,” said Madden, Researcher at Data & Society and lead author of the report. “When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and need.”
In light of new pressures surrounding immigration policy and status in the United States, the report is a highly relevant snapshot of the demand for privacy- and security-related training among some of the most vulnerable of these low-socioeconomic-status groups. The report also finds a disproportionate reliance on mobile devices, offering a potential starting point for those looking to provide educational resources.
“This report illustrates the many ways in which smartphones have become an indispensable source of internet access for those who may lack other technology resources in their homes and communities,” said Michele Gilman, Venable Professor of Law at the University of Baltimore and Director of the Saul Ewing Civil Advocacy Clinic. “Far from being a luxury, smartphones—with their many benefits and vulnerabilities—offer a critical source of connection to jobs, family, education and government services.”
Gilman, a poverty law expert, also served on the Research Advisory Board for the two-year research project, and co-authored a related law review article with Madden titled, “Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans.”
“Privacy, Security, and Digital Inequality” is based on newly released data from a nationally representative telephone survey of 3,000 American adults. The survey, which included interviews in both English and Spanish, was made possible by a grant from the Digital Trust Foundation and fielded in November and December of 2015.
Full text here.
The analysis of racial and ethnic minority groups in this report is limited by the survey sample size, and does not include detailed comparisons of Asians, Native Americans, and other subgroups. For instance, in this survey, out of 3,000 respondents, just 3% identified as Asian or Asian American.
For more information about groups working on these issues and in these spaces, we invite you to take a look at resources provided by the following organizations. We welcome additional suggestions:
Center for Media Justice – Resource Library
Freedom of the Press Foundation (link goes to resources)
American Civil Liberties Union – Privacy and Technology, Free Speech
Berkman Klein Center
Color of Change
EPIC – Electronic Privacy Information Center
Future of Privacy Forum
Georgetown Center on Privacy & Technology (link goes to resources)
National Hispanic Media Coalition
Our Data Bodies (link goes to resources)
Pew Research Center
Rad.Cat (link goes to resources)
Southern Poverty Law Center
Mic | 08.16.17
D&S Media Manipulation Research Lead Joan Donovan talks about the role of large tech companies in curbing extremist activity online.
Joan Donovan, a media manipulation research lead at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.
‘Richard Spencer might have a megaphone and his own website to communicate his messages of hate,’ Donovan said in a phone interview Wednesday. ‘Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.’
Movements like the so-called ‘alt-right’ aren’t just built on charisma, Donovan added — they’re built on infrastructure. The internet and all of its possibilities have now become a major part of that infrastructure.
Wired | 08.15.17
D&S Researcher Alex Rosenblat was interviewed about Uber for Klint Finley’s article in Wired.
Tuesday’s agreement may not be the end of Uber’s problems with the FTC either. Hartzog says a recent paper by University of Washington law professor Ryan Calo and multidisciplinary researcher Alex Rosenblat of the research institute Data & Society points to other potential privacy concerns, such as monitoring how much battery power remains on a user’s device, because users with little juice might be willing to pay more for a ride.
‘When a company can design an environment from scratch, track consumer behavior in that environment, and change the conditions throughout that environment based on what the firm observes, the possibilities to manipulate are legion,’ Calo and Rosenblat write. ‘Companies can reach consumers at their most vulnerable, nudge them into overconsumption, and charge each consumer the most he or she may be willing to pay.’
Quartz | 08.14.17
Quartz cites D&S Postdoctoral Scholar Caroline Jack in its guide to Lexicon of Lies:
Problematic information comes in various forms, each uniquely irksome. Yet people are quick to blast all inaccuracies as “fake news,” reinforcing the sense that facts are a thing of the past.
That’s dangerous and it needn’t be the case, according to the Lexicon of Lies, a recent report from the New York-based Data and Society research institute. “The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it,” writes Caroline Jack, a media historian and postdoctoral fellow at Data and Society. On a cultural level, “these assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible,” she writes.
D&S Researcher Becca Lewis discusses the recruiting methodologies of the alt-right in Teaching Tolerance.
‘Social media can be very powerful in shaping outlooks, but it doesn’t operate in a vacuum,’ explains Data & Society researcher Becca Lewis. ‘The shaping is coming from the other people using the platforms.’
The alt-right has a massive presence on social media and other channels where young people congregate. A Washington Post analysis identified 27,000 influential Twitter accounts associated with the alt-right, 13 percent of which are considered radical. Later, a George Washington University study found that white nationalist accounts in the United States have seen their follower counts grow by 600 percent since 2012.
testimony | 08.11.17
On August 11, 2017, Data & Society and fifteen individual scholars—including danah boyd, Julia Ticona, and Amanda Lenhart—filed an amicus brief in a pending U.S. Supreme Court case, Carpenter v. United States. The parties were represented by Andrew Selbst of Data & Society, and Marcia Hofmann and Kendra Albert of Zeitgeist Law.
The case implicates the Fourth Amendment’s “third party doctrine,” which states that people who “voluntarily convey” information to third parties do not have a reasonable expectation of privacy. As a result, when police obtain records from a third party, Fourth Amendment rights are not currently implicated.
Timothy Carpenter was convicted of a string of armed robberies based on cell site location data that placed him in proximity to the robberies. The case concerns the legality under the Fourth Amendment of the warrantless search and seizure of Carpenter’s historical cellphone records, which reveal his location and movements over the course of 127 days.
In the brief, we argue that the “third party doctrine” should not apply to cell site location information because cell phone ownership is not meaningfully voluntary in modern society. Cell site location information contains abundant information about people’s lives, and unfettered police access to it poses a threat to privacy rights.
Aided by scholarship and statistics from the Data & Society research team, we provide evidence that the 95% of Americans who have cell phones cannot reasonably be expected to give them up in order to avoid police searches. The research shows that cell phones are:
The case is expected to be heard in the fall of 2017.
paper | 08.09.17
Data & Society Researcher Alexandra Mateescu maps out the inequalities and power dynamics within the gig economy.
“As on-demand companies like Handy and online marketplaces like Care.com enter the space of domestic work, a range of questions emerge: what are the risks and challenges of signing up for platform-based work as an immigrant? As a non-native English speaker? How are experiences of work different for individuals with strong professional identities as caregivers or housekeepers, versus more casual workers who may also be finding other kinds of work via Postmates or Uber?”
TechTarget | 07.07.18
D&S Researcher Madeleine Clare Elish discusses the implications of biased AI in different contexts.
She said when AI is applied to areas like targeted marketing or customer service, this kind of bias is essentially an inconvenience. Models won’t deliver good results, but at the end of the day, no one gets hurt.
The second type of bias, though, can have a greater impact on people. Elish talked about how AI is increasingly seeping into areas like insurance, credit scoring, and criminal justice. Here, biases, whether they result from unrepresentative data samples or from the unconscious partialities of developers, can have much more severe effects.
D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.
“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”
WNYC The Takeaway | 08.17.16
D&S lawyer-in-residence Rebecca Wexler describes the intersection of automated technologies, trade secrets, and the criminal justice system.
For-profit companies dominate the criminal justice technologies industry and produce computer programs that are widespread throughout the justice system. These automated programs deploy cops, analyze forensic evidence, and assess the risk levels of inmates. But these technological advances may be making the system less fair, and without access to the source code, it’s impossible to hold computers to account.
D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.
What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.
D&S lawyer-in-residence Rebecca Wexler describes the flaws of an increasingly automated criminal justice system.
The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.
Ford Foundation blog | 05.30.17
D&S affiliate Wilneida Negrón details the role of bots and automation in activism today.
As everyone from advertisers to political adversaries jockey for attention, they are increasingly using automated technologies and processes to raise their own voices or drown out others. In fact, 62 percent of all Internet traffic is made up of programs acting on their own to analyze information, find vulnerabilities, or spread messages. Up to 48 million of Twitter’s 320 million users are bots, or applications that perform automated tasks. Some bots post beautiful art from museum collections, while some spread abuse and misinformation instead. Automation itself isn’t cutting edge, but the prevalence and sophistication of how automated tools interact with users is.
Sage Journals | 05.30.17
D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.
Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.
Protecting Patron Privacy, edited by Bobbi Newman and Data & Society Researcher Bonnie Tijerina, suggests strategies for data privacy in libraries.
Although privacy is one of the core tenets of librarianship, technology changes have made it increasingly difficult for libraries to ensure the privacy of their patrons in the 21st century library.
This authoritative LITA Guide offers readers guidance on a wide range of topics, including:
• Foundations of privacy in libraries
• Data collection, retention, use, and protection
• Laws and regulations
• Privacy instruction for patrons and staff
• Contracts with third parties
• Use of in-house and internet tools including social network sites, surveillance video, and RFID
Harvard Business Review | 05.16.17
D&S researcher Mark Latonero provides an overview of the role of large tech companies in refugee crises.
While the 40-page brief is filled with arguments in support of immigration, it hardly speaks about refugees, except to note that those seeking protection should be welcomed. Any multinational company with a diverse workforce would be concerned about limits to international hiring and employee travel. But tech companies should also be concerned about the refugee populations that depend on their digital services for safety and survival.
report | 05.15.17
New Report Reveals Why Media Was Vulnerable to Radicalized Groups Online
New America | 05.11.17
D&S affiliate Seeta Peña Gangadharan writes about defending digital rights of library patrons.
If this sounds complicated and scary, that’s because it is. But confronted with this matrix of vulnerabilities, the library—with its longstanding commitment to patron privacy—also offers an impressive plan of action.
Quartz | 05.10.17
D&S affiliate Mimi Onuoha states that discarded and sold hardware often still has data on it.
It’s not just individuals who are lax about removing data; companies around the world are at fault as well. In a 2007 study, researchers in Canada obtained 60 secondhand drives that had previously belonged to health care facilities. They were able to recover personal information from 65% of the drives. The data included, in the words of the researchers, “very sensitive mental health information on a large number of people.”
The Trace | 05.03.17
First Monday | 05.01.17
Philip Napoli and D&S researcher Robyn Caplan write in First Monday on why companies like Google and Facebook insist that they are merely tech companies with no media impact, and why they are wrong. Abstract is below:
A common position amongst social media platforms and online content aggregators is their resistance to being characterized as media companies. Rather, companies such as Google, Facebook, and Twitter have regularly insisted that they should be thought of purely as technology companies. This paper critiques the position that these platforms are technology companies rather than media companies, explores the underlying rationales, and considers the political, legal, and policy implications associated with accepting or rejecting this position. As this paper illustrates, this is no mere semantic distinction, given the history of the precise classification of communications technologies and services having profound ramifications for how these technologies and services are considered by policy-makers and the courts.
In this Points piece, D&S researcher Claire Fontaine argues that educational marketplaces disproportionately advantage those with surplus time, energy, social capital, and institutional knowledge.
“Schooling is not a service, and it is not a commodity. It is — or should be — the means through which a society as diverse as our own coheres and develops a functional social fabric.”
Harvard Business Review | 04.19.17
Jongbin Jung, Connor Concannon, D&S fellow Ravi Shroff, Sharad Goel, and Daniel G. Goldstein explore new methods for machine learning in criminal justice.
Simple rules certainly have their advantages, but one might reasonably wonder whether favoring simplicity means sacrificing performance. In many cases the answer, surprisingly, is no. We compared our simple rules to complex machine learning algorithms. In the case of judicial decisions, the risk chart above performed nearly identically to the best statistical risk assessment techniques. Replicating our analysis in 22 varied domains, we found that this phenomenon holds: Simple, transparent decision rules often perform on par with complex, opaque machine learning methods.
American Press Institute | 04.12.17
D&S researcher Mary Madden was interviewed by the American Press Institute about her recent Knight Foundation-supported report, “How Youth Navigate the News Landscape.”
However, one of my favorite quotes was from a participant who described a future where news would be delivered by hologram: “I think like it’s going to be little holograms. You’re going to open this thing and a little guy’s going to come out and tell you about stuff.”
Given that some participants said they already found notifications annoying, I’m not sure how successful the little hologram guy would be, but it was clear that the participants fully expected that the news industry would continue to evolve and innovate in creative ways moving forward.
Centre for Public Impact | 04.12.17
D&S fellow Anne L. Washington discusses her previous research as a digital government scholar and her upcoming work examining US open data policy, funded through a five-year National Science Foundation Early Faculty Research Career grant.
“We use a secret language in academia sometimes,” [Washington] says, laughing. “‘Technology management’ is about how organisations leverage digital assets for strategic business goals. My doctorate is in management information systems. On the other side of that is ‘informatics’, which comes from the library science tradition. Over centuries, librarians have refined how to store and retrieve knowledge so people can find what they need and walk away smarter. Informatics takes this basic idea and scales it up for massive digital collections.”
ProPublica | 04.05.17
Rhizome | 04.05.17
D&S artist-in-residence Ingrid Burrington, with Josh Begley and Seth Freed Wessler, created an installation on display at the Ace Hotel.
Building on Wessler’s journalistic investigation into privately run immigrant-only federal prisons, Burrington and Begley present seventy-five individual lenticular prints of satellite imagery capturing these sites and government documents pertaining to them. Together the images, arranged into three distinct grids, explore and abstract “the terrain of U.S. immigration and carceral policy and the human stories usually conspicuously absent in the aerial perspective.”
paper | 04.02.17
Sharad Goel, Maya Perelman, D&S fellow Ravi Shroff, and David Alan Sklansky examine a method that could be used to “reduce the racially disparate impact of pedestrian searches and to increase their effectiveness.” Abstract is below:
The exponential growth of available information about routine police activities offers new opportunities to improve the fairness and effectiveness of police practices. We illustrate the point by showing how a particular kind of calculation made possible by modern, large-scale datasets — determining the likelihood that stopping and frisking a particular pedestrian will result in the discovery of contraband or other evidence of criminal activity — could be used to reduce the racially disparate impact of pedestrian searches and to increase their effectiveness. For tools of this kind to achieve their full potential in improving policing, though, the legal system will need to adapt. One important change would be to understand police tactics such as investigatory stops of pedestrians or motorists as programs, not as isolated occurrences. Beyond that, the judiciary will need to grow more comfortable with statistical proof of discriminatory policing, and the police will need to be more receptive to the assistance that algorithms can provide in reducing bias.
PLOS Computational Biology | 03.30.17
Matthew Zook, D&S affiliate Solon Barocas, D&S founder danah boyd, D&S affiliate Kate Crawford, Emily Keller, D&S affiliate Seeta Peña Gangadharan, Alyssa Goldman, Rachelle Hollander, Barbara A. Koenig, D&S researcher Jacob Metcalf, Arvind Narayanan, D&S advisor Alondra Nelson, and Frank Pasquale wrote a paper detailing ten rules for responsible big data research. Introduction is below:
The use of big data research methods has grown tremendously over the past five years in both academia and industry. As the size and complexity of available datasets has grown, so too have the ethical questions raised by big data research. These questions become increasingly urgent as data and research agendas move well beyond those typical of the computational and natural sciences, to more directly address sensitive aspects of human behavior, interaction, and health. The tools of big data research are increasingly woven into our daily lives, including mining digital medical records for scientific and economic insights, mapping relationships via social media, capturing individuals’ speech and action via sensors, tracking movement across space, shaping police and security policy via “predictive policing,” and much more.
The beneficial possibilities for big data in science and industry are tempered by new challenges facing researchers that often lie outside their training and comfort zone. Social scientists now grapple with data structures and cloud computing, while computer scientists must contend with human subject protocols and institutional review boards (IRBs). While the connection between individual datum and actual human beings can appear quite abstract, the scope, scale, and complexity of many forms of big data creates a rich ecosystem in which human participants and their communities are deeply embedded and susceptible to harm. This complexity challenges any normative set of rules and makes devising universal guidelines difficult.
Nevertheless, the need for direction in responsible big data research is evident, and this article provides a set of “ten simple rules” for addressing the complex ethical issues that will inevitably arise. Modeled on PLOS Computational Biology’s ongoing collection of rules, the recommendations we outline involve more nuance than the words “simple” and “rules” suggest. This nuance is inevitably tied to our paper’s starting premise: all big data research on social, medical, psychological, and economic phenomena engages with human subjects, and researchers have the ethical responsibility to minimize potential harm.
The variety in data sources, research topics, and methodological approaches in big data belies a one-size-fits-all checklist; as a result, these rules are less specific than some might hope. Rather, we exhort researchers to recognize the human participants and complex systems contained within their data and make grappling with ethical questions part of their standard workflow. Towards this end, we structure the first five rules around how to reduce the chance of harm resulting from big data research practices; the second five rules focus on ways researchers can contribute to building best practices that fit their disciplinary and methodological approaches. At the core of these rules, we challenge big data researchers who consider their data disentangled from the ability to harm to reexamine their assumptions. The examples in this paper show how often even seemingly innocuous and anonymized data have produced unanticipated ethical questions and detrimental impacts.
This paper is a result of a two-year National Science Foundation (NSF)-funded project that established the Council for Big Data, Ethics, and Society, a group of 20 scholars from a wide range of social, natural, and computational sciences (http://bdes.datasociety.net/). The Council was charged with providing guidance to the NSF on how to best encourage ethical practices in scientific and engineering research, utilizing big data research methods and infrastructures.
Backchannel | 03.27.17
D&S founder danah boyd discusses the problem with asking companies like Facebook and Google to ‘solve’ fake news, insisting that such solutionism ignores the context of the complex social problems at play.
Although a lot of the emphasis in the “fake news” discussion focuses on content that is widely spread and downright insane, much of the most insidious content out there isn’t in your face. It’s not spread widely, and certainly not by people who are forwarding it to object. It’s subtle content that is factually accurate, biased in presentation and framing, and encouraging folks to make dangerous conclusions that are not explicitly spelled out in the content itself.
Washington University Law Review | 03.11.17
D&S researcher Mary Madden, Michele Gilman, D&S affiliate Karen Levy, and D&S fellow Alice Marwick examine how poor Americans are impacted by privacy violations and discuss how to protect digital privacy for the vulnerable. Abstract is as follows:
This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On the one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases that get entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults and highlights how these patterns make particular groups of low-status internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies may contribute to various privacy and security-related harms. The article then discusses three scenarios in which big data – including data gathered from social media inputs – is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. 
As policymakers consider reforms, the Article urges greater attention to impacts on low-income persons and communities.
D&S advisor Nick Grossman discusses trust in the sharing economy.
So, fast forward to our refund situation: now I no longer feel like I have any moral high ground to demand a formal close out — in my mind, I was complicit in the shadiness when I was cool with fooling the apartment building. How is that any different than agreeing to sidestep the Airbnb platform rules?
D&S advisor Anil Dash discusses how interviews can exclude people from the tech industry.
When we mimic patterns from tech culture without knowing why we do them, we often take good ideas and turn them into terrible barriers.
Columbia Law Review | 03.07.17
Ryan Calo and D&S researcher Alex Rosenblat write this analysis of the newly termed ‘taking economy’ of Uber.
Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value and raises a set of concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power. This Article, coauthored by a law professor and a technology ethnographer who studies the ride-hailing community, furnishes such a critique and indicates a path toward a meaningful response.
Commercial firms have long used what they know about consumers to shape their behavior and maximize profits. By virtue of sitting between consumers and providers of services, however, sharing economy firms have a unique capacity to monitor and nudge all participants—including people whose livelihood may depend on the platform. Much activity is hidden away from view, but preliminary evidence suggests that sharing economy firms may already be leveraging their access to information about users and their control over the user experience to mislead, coerce, or otherwise disadvantage sharing economy participants.
This Article argues that consumer protection law, with its longtime emphasis on asymmetries of information and power, is relatively well positioned to address this under-examined aspect of the sharing economy. But the regulatory response to date seems outdated and superficial. To be effective, legal interventions must (1) reflect a deeper understanding of the acts and practices of digital platforms and (2) interrupt the incentives of sharing economy firms to abuse their position.
D&S advisor Nick Grossman discusses constructive approaches for tech companies to support important issues.
As the White House continues to issue executive orders on issues like immigration that hit tech companies directly, and as issues like transgender rights — that are outside the pocketbook interests but may intersect with a company or community’s values — come up, it feels as though companies are going to continue to be under pressure to take public stands.
D&S advisor Anil Dash discusses Fake Markets that are dominated by few tech companies.
Worse, we’ve lost the ability to discern that a short-term benefit for some users that’s subsidized by an unsustainable investment model will lead to terrible long-term consequences for society. We’re hooked on the temporary infusion of venture capital dollars into vulnerable markets that we know are about to be remade by technological transformation and automation. The only social force empowered to anticipate or prevent these disruptions are policymakers who are often too illiterate to understand how these technologies work, and who too desperately want the halo of appearing to be associated with “high tech”, the secular religion of America.
report | 03.01.17
Slate | 02.28.17
D&S lawyer-in-residence Rebecca Wexler analyzes the unreliability of video authentication in Slate.
When forensic scientists refuse to reveal details about how their experimental methods work, they erode trust in the ideal of scientific objectivity, and in the legitimacy of their results. There is already a dearth of trust surrounding forensic sciences. Just last fall, President Obama’s Council of Advisors on Science and Technology reported that even some long-practiced forensic disciplines, like bite-mark analysis and some methods for analyzing complex mixtures of DNA, are not foundationally valid.
paper | 02.21.17
D&S lawyer-in-residence Rebecca Wexler provides an analysis on trade secrecy in the criminal justice system. Abstract is below:
From policing to evidence to parole, data-driven algorithmic systems and other automated software programs are being adopted throughout the criminal justice system. The developers of these technologies often claim that the details about how the programs work are trade secrets and, as a result, cannot be disclosed in criminal cases. This Article turns to evidence law to examine the conflict between transparency and trade secrecy in the criminal justice system. It is the first comprehensive account of trade secret evidence in criminal cases. I argue that recognizing a trade secrets evidentiary privilege in criminal proceedings is harmful, ahistorical, and unnecessary. Withholding information from the accused because it is a trade secret mischaracterizes due process as a business competition.
D&S researcher Alex Rosenblat was interviewed by Radio NZ about Uber and the promises it makes its drivers, i.e. flexible hours and freedom.
D&S artist-in-residence Ingrid Burrington explores the importance of domain names at NamesCon, an annual conference for the domain-names industry.
In addition to being crucial to making the web work, domain names are also a highly political pocket of the web, particularly shaped by the legacy of colonialism. Most of the underlying protocols that make the internet work—including DNS—are encoded in ASCII, which translates bits into letterforms, numbers, and punctuation marks. But ASCII’s letterforms only represent the Latin alphabet, limiting expression in domain names to Western languages (while arguing that a character encoding is an instrument of imperialism sounds bold, so does assuming that “text” is synonymous only with “English”).
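One way non-Latin scripts are shoehorned into this ASCII-only system is Punycode, the encoding behind internationalized domain names (IDNs): a Unicode label is rewritten as an ASCII string with the `xn--` prefix. A minimal sketch using Python’s built-in `idna` codec (which implements the older IDNA 2003 rules; modern registries use IDNA 2008, so edge cases differ):

```python
# Internationalized domain names must be squeezed into ASCII before DNS
# can resolve them. Python's stdlib "idna" codec shows the round trip.

# Encode a non-Latin label into its ASCII-compatible Punycode form.
ascii_label = "münchen".encode("idna")
print(ascii_label)  # b'xn--mnchen-3ya'

# Decode the ASCII-compatible form back into the Unicode label.
unicode_label = b"xn--mnchen-3ya".decode("idna")
print(unicode_label)  # münchen
```

The `xn--` form is what actually travels over the wire in DNS queries; the Unicode rendering exists only at the application layer, which is precisely the asymmetry Burrington is pointing at.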
D&S lawyer-in-residence Rebecca Wexler testifies about government oversight of forensic science laboratories in the State of New York.
I submit these comments to the Assembly Standing Committee on Codes; the Assembly Standing Committee on Judiciary; and the Assembly Standing Committee on Oversight, Analysis and Investigation. Thank you for inviting my testimony on government oversight of forensic science laboratories in the State of New York. As a Resident at The Data and Society Research Institute, my work focuses on issues arising from data and technology in the criminal justice system. I want to draw your attention to trade secrets claims in forensic technologies that threaten criminal defendants’ rights to confront and cross-examine the evidence against them; to compulsory process to obtain evidence in their favor; and to due process.
The Guardian | 02.02.17
D&S affiliate Keith Hiatt, Michael Kleinman, and D&S researcher Mark Latonero think critically about the use of technology as an all-encompassing solution in human rights spaces.
It’s important to acknowledge that, most of the time, the underlying problem human rights organisations are trying to solve isn’t technical. It’s often a bureaucratic, institutional, process or workflow problem, and technology won’t solve it (and might exacerbate it).
D&S researcher Monica Bulger, with Patrick McCormick and D&S research analyst Mikaela Pitcan, writes this working paper detailing the “Legacy of inBloom”.
Although inBloom closed in 2014, it ignited a public discussion of student data privacy that resulted in the introduction of over 400 pieces of state-level legislation. The fervor over inBloom showed that policies and procedures were not yet where they needed to be for schools to engage in data-informed instruction. Industry members responded with a student data privacy pledge that detailed responsible practice. A strengthened awareness of the need for transparent data practices among nearly all of the involved actors is one of inBloom’s most obvious legacies.
Instead of a large-scale, open source platform that was a multi-state collaboration, the trend in data-driven educational technologies since inBloom’s closure has been toward closed, proprietary systems, adopted piecemeal. To date, no large-scale educational technology initiative has succeeded in American K-12 schools. This study explores several factors that contributed to the demise of inBloom and a number of important questions: What were the values and plans that drove inBloom to be designed the way it was? What were the concerns and movements that caused inBloom to run into resistance? How has the entire inBloom development impacted the future of edtech and student data?
D&S advisor John Palfrey speaks to his students about the President’s executive orders on immigration.
There has been much talk of universities and schools committing to be “sanctuaries” for students. There is merit in this idea but there is also a lot of debate as to what it means, in a legal sense. I would simplify how I see it: I aspire for our school to be a home for our students–a home away from home to be sure–one where our youth from every quarter and from every religion know that they will have every protection we can manage, just as we would offer our own children at home.
D&S affiliate Mimi Onuoha details the process of completely deleting data.
This overwriting process is a bit like painting a wall: If you start with a white wall and paint it red, there’s no way to erase the red. If you want the red gone or the wall returned to how it was, you either destroy the wall or you paint it over, several times, so that it’s white again.
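That repainting metaphor can be sketched in code. The following is an illustrative overwrite-before-delete routine, not a guaranteed secure-erase tool (on SSDs and copy-on-write filesystems, in-place overwrites may leave the old blocks untouched); the function name and pass count are this sketch’s own choices:

```python
import os

def overwrite_and_delete(path, passes=3):
    """Overwrite a file's bytes with random data several times, then unlink it.

    Illustrative only: like painting over a wall, each pass writes a new
    layer over the old contents. Wear-leveling SSDs and journaling
    filesystems can keep the original "layer" elsewhere on disk.
    """
    length = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(length))  # a fresh coat of random "paint"
            f.flush()
            os.fsync(f.fileno())  # push the pass to disk before the next one
    os.remove(path)
```

The multiple passes mirror Onuoha’s point: a single delete merely removes the label on the data, while true erasure means writing over it until nothing of the original shows through.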
D&S founder danah boyd responds to remarks by the Trump administration stating that their opposition is the media.
And now many of the actors most set on undermining institutionalized information intermediaries are in the most powerful office in the land. They are waging war on the media and the media doesn’t know what to do other than to report on it.
D&S advisor Nick Grossman ponders the importance of and vision for the public data layer.
If we do this right, we can get smarter at policymaking, and design regulatory systems that have both greater effectiveness and lower costs of implementation and compliance.
D&S fellow Zara Rahman writes about how immigrant families use social media and digital technologies.
The consequence is that the home of our deeply personal information has gone from treasured letters stored in a box at our houses, to servers owned by corporate companies that we’ll never see. Those personal notes, the ways of showing our family that we’re happy and content in our new lives, despite what we’ve lost — they live online now. The more you share with that corporation, the stronger those family ties get. There is a third party in these relationships.
points | 01.25.17
D&S researcher Claire Fontaine’s debut on Points, “Doing Screen Time,” resourcefully unwinds the apparent contradiction between anxiety around screen time at home and support for screen time at school: “Each produces and enables the other.” Looking into this dynamic is an occasion for asking, collectively, how we want to live.
D&S advisor Ethan Zuckerman writes about fake news and the bigger problem behind fake news.
The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news makes that world slightly easier to navigate, but it doesn’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.
D&S affiliate Kate Crawford writes a letter to Silicon Valley on how to resist Trump.
You, the software engineers and leaders of technology companies, face an enormous responsibility. You know better than anyone how best to protect the millions who have entrusted you with their data, and your knowledge gives you real power as civic actors. If you want to transform the world for the better, here is your moment.
D&S affiliate Kate Crawford and Hito Steyerl converse about “NSA bros, dataveillance, apex predators, AI, and empathy machines.”
D&S artist-in-residence Ingrid Burrington discusses associating weaponry, like drones, with art.
The impulse to pair a technology associated with automated extralegal killing of American citizens alongside “culture and the arts” is weird, but not entirely surprising—the vantage point of drones affords a particular aesthetic in addition to plausible deniability. The aerial perspective has appealed to artists for as long as it has appealed to generals and kings. That distant, presumed-objective view from nowhere, whether achieved via hot air balloon or low-orbit satellite, suggests a totality, a kind of coherence in defiance of the often-incoherent groundtruth of everyday life. For generals, coherence offers the possibility of tactical advantage. For artists (or at least good artists), it’s something to interrogate and take apart.
D&S advisor Ethan Zuckerman provides a transcript on his recent speech about journalism and civics.
One final thing: we have this tendency in journalism right now to feel very sorry for ourselves. This is a field that we are all enormously proud to be part of. This is a field that is harder and harder to make a living in, and I see more and more news organizations essentially saying, “You’re going to miss us. We’re going away. I just want to warn you.”
report | 01.18.17
12% of U.S. internet users who have been in romantic relationships have experienced intimate partner digital abuse…
Digital tools are often an integral part of healthy romantic relationships. Romantic partners frequently use digital tools to connect with each other through text messages, photo-sharing, social media posts, and other online activities. These same digital tools can be used in unhealthy ways, facilitating negative behaviors such as monitoring, unwanted picture sharing, and abusive messages — both within the romantic relationship and after the relationship is over. Better understanding how often intimate partner digital abuse is happening, to whom, and in what ways is a critical piece of understanding the scope of the problem.
This report, part of a series of research reports on digital harassment and abuse, examines the prevalence and impact of intimate partner digital abuse. Findings are based upon the results of a nationally representative survey of 3,002 Americans 15 years of age and older conducted from May 17th through July 31st, 2016. Respondents were surveyed on either their landline or cell phone. Interviews were conducted in either English or Spanish. Findings in this report refer to the 2,810 respondents who have ever been in a romantic relationship.
D&S artist-in-residence Heather Dewey-Hagborg partnered with Chelsea Manning and Shoili Kanungo to create Suppressed Images, an illustrated series about Dewey-Hagborg and Manning’s collaboration.
blog post | 01.17.17
D&S advisor Susan Crawford writes about communication infrastructure in rural parts of America.
This year, I’ll be traveling the US talking to people in scrappy communities who are building fiber on their own. They’re fed up with waiting for enormous incumbent communications companies to decide it’s in their corporate interests to invest in 21st-century communications capacity for Americans. These communities have run the numbers and looked at their economic development needs — as well as the possibilities for advanced healthcare, world-class educations, effective governance, energy management, and public safety that publicly controlled wholesale “street grids” of fiber make real — and they’ve come to the conclusion that if they hang back, they’ll become irrelevant.
New Media & Society | 01.16.17
D&S researcher Monica Bulger, with Patrick Burton, Brian O’Neill, and Elisabeth Staksrud, writes “Where policy and practice collide: Comparing United States, South African and European Union approaches to protecting children online”.
That children have a right to protection when they go online is an internationally well-established principle, upheld in laws that seek to safeguard children from online abuse and exploitation. However, children’s own transgressive behaviour can test the boundaries of this protection regime, creating new dilemmas for lawmakers the world over. This article examines the policy response from both the Global North and South to young people’s online behaviour that may challenge adult conceptions of what is acceptable, within existing legal and policy frameworks. It asks whether the ‘childhood innocence’ implied in much protection discourse is a helpful basis for promoting children’s rights in the digital age. Based on a comparative analysis of the emerging policy trends in Europe, South Africa and the United States, the article assesses the implications for policymakers and child welfare specialists as they attempt to redraw the balance between children’s online safety and support for their agency as digital citizens.
D&S advisor Anil Dash presents the newest episode of On Being.
Back in November, I got to sit down with the amazing Krista Tippett for a lengthy interview in front of an incredibly warm crowd in Easton, MD. Now, that interview has been edited down and is available as the latest episode of Krista’s hugely popular show, On Being.
Quartz | 01.15.17
D&S affiliate Gideon Litchfield writes a short story called ‘Democracy 3.0’.
The Guardian | 01.13.17
D&S founder danah boyd’s Points piece was re-published for The Guardian. boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends towards content personalization on social media continue to fragment Americans along ideological lines.
Ford Foundation blog | 01.12.17
D&S advisor Claudia Perlich discusses modeling, transparency, and machine learning in a new episode of the Partially Derivative podcast.
“One pitfall I see is that it’s easy from a social science perspective to condemn all data science as evil…but that ultimately doesn’t help advance the situation.”
D&S fellow Zara Rahman details her thoughts on owning one’s success and hard work.
It’s unrealistic and unfair to ignore all that work – to myself, and others. Citing luck and serendipity gives the impression that people in positions of influence will somehow magically find out about you and your interests and reach out to you – they (probably) won’t. It implies that if you’re doing this right, opportunities to work on things you want to be working on will just pop up out of the blue.
D&S advisor Ethan Zuckerman reflects on today’s political atmosphere and FDR’s speech on the four freedoms.
This is a scary moment, a time where it looks like the progress we’ve made around the world might reverse, where we go from a world that’s gotten much bigger to one that shrinks. The good news is that we get to decide how big a world we want to live in. We get to decide how to speak, how to listen and how to stand together against fear.