Technology can often do more harm than good in humanitarian situations. In an op-ed for The New York Times, Research Lead Mark Latonero argues against surveillance humanitarianism.
“Despite the best intentions, the decision to deploy technology like biometrics is built on a number of unproven assumptions, such as, technology solutions can fix deeply embedded political problems. And that auditing for fraud requires entire populations to be tracked using their personal data. And that experimental technologies will work as planned in a chaotic conflict setting. And last, that the ethics of consent don’t apply for people who are starving.”
The New York Times | 05.07.19
The Algorithmic Accountability Act is a step forward, but there’s still room for improvement. Postdoctoral Scholar Andrew Selbst and Margot Kaminski explain.
“The bill is a meaningful first step in addressing the problems with algorithmic decision-making. Companies must be pushed to consider and document what goes into algorithm design. They should be pushed, too, to come up with solutions. But the bill is lacking in three main areas.”
The New York Times | 04.25.19
In this op-ed for The New York Times, Data & Society Research Lead Mary Madden argues that there is no “one size fits all” solution to privacy concerns in the digital age.
“When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and urgently need.”
Fast Company | 11.20.18
Data & Society 2015-2016 Fellow Wilneida Negron connects her past social work to her current work as a political scientist and technologist.
“We are at the cusp of a new wave of technological thinking, one defined by a new mantra that is the opposite of Zuckerberg’s: ‘Move carefully and purposely, and embrace complexity.’ As part of this wave, a new, inclusive, and intersectional generation of people are using technology for the public interest. This new wave will help us prepare for a future where technical expertise coexists with empathy, humility, and perseverance.”
Building on the research for her book Uberland: How Algorithms are Rewriting the Rules of Work, Data & Society Researcher Alex Rosenblat explains algorithmic management in the gig economy.
“Data and algorithms are presented as objective, neutral, even benevolent: Algorithms gave us super-convenient food delivery services and personalized movie recommendations. But Uber and other ride-hailing apps have taken the way Silicon Valley uses algorithms and applied it to work, and that’s not always a good thing.”
Slate | 08.27.18
2017-2018 Fellow Jeanna Matthews and Research Analyst Kinjal Dave respond to Deji Olukotun’s story about an algorithmic tennis match.
“The answer can’t be derived from the past alone: It depends on what we collectively decide about the future, about what justice looks like, about leveling the playing field in sports and in life. As in Olukotun’s story, humans and computers will be working together to pick winners and losers. We need to collectively decide on and enforce the rules they will follow. We need the ability to understand, challenge, and audit the decisions. A level playing field won’t be the future unless we insist on it.”
Slate | 08.13.18
Drawing on conclusions from the Data & Society report Beyond Disruption, Researcher Alexandra Mateescu discusses surveillance of domestic care workers online.
“Online marketplaces may not be the root cause of individual employers’ biases, but their design is not neutral. They are built with a particular archetype of what an “entrepreneurial” domestic worker looks like—one who feels at home in the world of apps, social media, and online self-branding—and ultimately replicates and can even exacerbate many of the divisions that came with our predigital workplaces. As platform companies gain growing power over the hiring processes of a whole industry, they will need to actively work against the embedded inequalities in the markets they now mediate.”
In an op-ed for The New York Times, Data & Society Researcher Alex Rosenblat shatters the narrative that Uber encapsulates the entire gig economy.
“But this industry has, until recently, operated largely informally, with jobs secured by word-of-mouth. That’s changing, as employers are increasingly turning to Uber-like services to find nannies, housecleaners and other care workers. These new gig economy companies, while making it easier for some people to find short-term work, have created hardships for others, and may leave many experienced care workers behind.”
Data & Society President and Founder danah boyd and Media Manipulation Research Lead Joan Donovan challenge newsrooms to practice “strategic silence” to avoid amplifying extremist messaging.
“Editors used to engage in strategic silence – set agendas, omit extremist ideas and manage voices – without knowing they were doing so. Yet the online context has enhanced extremists’ abilities to create controversies, prompting newsrooms to justify covering their spectacles. Because competition for audience is increasingly fierce and financially consequential, longstanding newsroom norms have come undone. We believe that journalists do not rebuild reputation through a race to the bottom. Rather, we think that it’s imperative that newsrooms actively take the high ground and re-embrace strategic silence in order to defy extremists’ platforms for spreading hate.”
For Slate, Data & Society Researcher Jacob Metcalf argues that we should be more concerned about the behavioral models developed by entities like Cambridge Analytica, which can be traded between political entities, than about the voter data itself.
“In other words, the one thing we can be sure of psychographic profiling is that it provided one more way to transfer knowledge and economic value between campaigns and organizations.”
MIT Technology Review | 04.09.18
In the wake of Cambridge Analytica, Data & Society Researcher Jacob Metcalf argues that the real risk is the behavioral models that have been developed from Facebook users’ data.
“But focusing solely on the purloined data is a mistake. Much more important are the behavioral models Cambridge Analytica built from the data. Even though the company claims to have deleted the data sets in 2015 in response to Facebook’s demands, those models live on, and can still be used to target highly specific groups of voters with messages designed to leverage their psychological traits. Although the stolen data sets represent a massive collection of individual privacy harms, the models are a collective harm, and far more pernicious.”
Fast Company | 03.29.18
Data & Society Postdoctoral Scholar Julia Ticona and Data & Society Research Analyst Alexandra Mateescu co-authored an op-ed for Fast Company about the safety of workers who rely on digital platforms to stay employed.
“For the past year, we’ve been interviewing nannies, babysitters, elder care workers, and housecleaners across the U.S. who use platforms like Handy, TaskRabbit, and the in-home care provider platform Care.com to do care and cleaning work, in an effort to better understand how platforms are shaping domestic work. Along the way, we have found that, in many cases, the aggregation of individual data leads not to more accountability and justice, but rather forces workers to make trade-offs between visibility and vulnerability.”
Slate | 03.18.18
Data & Society Researcher Jacob Metcalf co-authored an op-ed in Slate discussing how giving researchers more access to Facebook users’ data could prevent unethical data mining.
“This case raises numerous complicated ethical and political issues, but as data ethicists, one issue stands out to us: Both Facebook and its users are exposed to the downstream consequences of unethical research practices precisely because like other major platforms, the social network does not proactively facilitate ethical research practices in exchange for access to data that users have consented to share.”
Chalkbeat | 11.09.17
D&S researcher Claire Fontaine explores issues of accessibility at NYC schools.
“So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1700 or so schools…Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.
But for some families, the choice process begins and ends with the question: Is the building fully accessible?”
New York Magazine | 10.31.17
D&S INFRA Lead Ingrid Burrington investigates community network infrastructures in times of disaster.
“By design, resilient network infrastructure prioritizes interdependence and cooperation over self-sufficiency — without strong underlying social ties, there is no localized network infrastructure. The technical complexities of building a network are a lot easier to overcome than the political complexities of building community, political agency, and governance.”
D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.
What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.
Slate | 02.28.17
D&S lawyer-in-residence Rebecca Wexler analyzes the unreliability of video authentication in Slate.
When forensic scientists refuse to reveal details about how their experimental methods work, they erode trust in the ideal of scientific objectivity, and in the legitimacy of their results. There is already a dearth of trust surrounding forensic sciences. Just last fall, President Obama’s Council of Advisors on Science and Technology reported that even some long-practiced forensic disciplines, like bite-mark analysis and some methods for analyzing complex mixtures of DNA, are not foundationally valid.
The Guardian | 02.02.17
D&S affiliate Keith Hiatt, Michael Kleinman, and D&S researcher Mark Latonero think critically about the use of technology as an all-encompassing solution in human rights spaces.
It’s important to acknowledge that, most of the time, the underlying problem human rights organisations are trying to solve isn’t technical. It’s often a bureaucratic, institutional, process or workflow problem, and technology won’t solve it (and might exacerbate it).
D&S affiliate Mimi Onuoha details the process of completely deleting data.
This overwriting process is a bit like painting a wall: If you start with a white wall and paint it red, there’s no way to erase the red. If you want the red gone or the wall returned to how it was, you either destroy the wall or you paint it over, several times, so that it’s white again.
D&S advisor Ethan Zuckerman writes about fake news and the bigger problem behind it.
The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news make that world slightly easier to navigate, but they don’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.
D&S affiliate Kate Crawford writes a letter to Silicon Valley on how to resist Trump.
You, the software engineers and leaders of technology companies, face an enormous responsibility. You know better than anyone how best to protect the millions who have entrusted you with their data, and your knowledge gives you real power as civic actors. If you want to transform the world for the better, here is your moment.
The Guardian | 01.13.17
D&S founder danah boyd’s Points piece was re-published for The Guardian. boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends towards content personalization on social media continue to fragment Americans along ideological lines.
D&S advisor Charlton D. McIlwain writes an op-ed about the new white political movement.
We must see the new white narrative for what it really is, an attempt to refocus public attention and political capital away from people of color. Trump and many of those he’s chosen to lead his administration are the new white’s principal ambassadors. They take stock of the last few years as blacks fought against police brutality, Muslims battled religious persecution, and Hispanics defended themselves and their families from mass deportations. These representatives of the new white respond: Your concerns don’t matter as much as working-class folk (white people) for whom America’s promise was designed but has been denied.
Globalvoices.org | 12.19.16
D&S fellow Zara Rahman donates part of her Shuttleworth Flash Grant to the Human Rights Data Analysis Group and Global Voices.
So here’s the thing: I don’t think we need only innovative ideas or world-changing projects. We also need trust, communities, and skills. We need to strengthen and support existing infrastructure and communities. I worry that we’ve become far too fixated upon quickly implemented innovation and disruption, and that we’re taking a lot of important things for granted—things we rely upon that, unlike “innovative ideas”, take a lot of time and effort to build.
Quartz | 12.04.16
D&S affiliate Mimi Onuoha profiles the Asian American Performers Action Coalition (AAPAC) of Broadway and off-Broadway and their efforts “to track racial demographic data in the industry.”
D&S advisor Ethan Zuckerman wrote an op-ed detailing the issue of normalizing the abnormal.
My deep fear is that there’s no single set of Hallin’s spheres anymore. What’s consensus to a Trump supporter may be deviant to a Clinton supporter and vice versa. We now face an online media landscape so diverse and fragmented that each of us finds big enough spheres of legitimate controversy that we think we’re seeing a real debate at work.
Quartz | 11.24.16
D&S fellow Alice E. Marwick wrote this op-ed discussing how online harassment disproportionately impacts women and minorities.
In a divisive time for American society, it’s crucial that everyone is heard. Social media companies need to take a stand and ensure that destructive online behavior doesn’t turn people away from sharing their voices.
The New York Times | 11.22.16
D&S researcher Robyn Caplan writes about fake news and Facebook in the NYT.
Media organizations were especially left behind when Facebook changed its algorithm in June to privilege friends and family over major publishers, which happens to have occurred just prior to the spike in fake news before the election. Facebook must create institutionalized pathways for journalists and policymakers to help shape any further changes to the algorithm.
Vox | 11.17.16
D&S advisor Baratunde Thurston jumps into the debate of the place of empathy in a post-Trump election.
MIT Technology Review | 11.17.16
D&S affiliate Sorelle Friedler, with Nicholas Diakopoulos, discusses five principles for holding algorithmic systems accountable.
Recent investigations show that risk assessment algorithms can be racially biased, generating scores that, when wrong, more often incorrectly classify black defendants as high risk. These results have generated considerable controversy. Given the literally life-altering nature of these algorithmic decisions, they should receive careful attention and be held accountable for negative consequences.
Quartz | 11.12.16
D&S affiliate Gideon Lichfield wrote this op-ed about the recent US presidential election.
MIT Technology Review | 07.11.16
D&S advisor Ethan Zuckerman defends the use of video recordings of police officers.
If video doesn’t lead to the indictment of officers who shoot civilians, are we wrong to expect justice from sousveillance? The police who shot Castile and Sterling knew they were likely to be captured on camera—from their police cars, surveillance cameras, and cameras held by bystanders—but still used deadly force in situations that don’t appear to have merited it. Is Mann’s hope for sousveillance simply wrong?
Not quite. While these videos rarely lead to grand jury indictments, they have become powerful fuel for social movements demanding racial justice and fairer policing. In the wake of Sterling and Castile’s deaths, protests brought thousands into the streets in major U.S. cities and led to the temporary closure of interstate highways.
D&S researcher Madeleine Clare Elish discusses the complexities of error in automated systems. Elish argues that the human role in automated systems has become ‘the weak link, rather than the point of stability’.
We need to demand designers, manufacturers, and regulators pay attention to the reality of the human in the equation. At stake is not only how responsibility may be distributed in any robotic or autonomous system, but also how the value and potential of humans may be allowed to develop in the context of human-machine teams.
D&S Researcher Robyn Caplan considers whether Facebook is saving journalism or ruining it:
The question of whether Facebook is saving or ruining journalism is not relevant here because, like it or not, Facebook is a media company. That became more apparent recently as human editors became a visible part of Facebook’s news curation process. In truth, this team is only a tiny fraction of a network of actors whose decisions affect the inner workings of Facebook’s platform and the content we see.
New York Times | 02.18.16
D&S Advisor Joel Reidenberg considers the scope of the court order compelling Apple to provide “reasonable technical assistance” to help the government hack into one of the San Bernardino attackers’ locked iPhones.
In short, for government to legitimately circumvent device encryption through a court order, legal authorization to access the contents of the device (typically through a judicial warrant) is necessary. Then, if the equipment manufacturer has control over the encryption, the decryption should be performed by the manufacturer with the results provided to the government.
If, instead, the equipment manufacturer only has control over information necessary to decrypt the device, the information should be provided to the government under strict court seal and supervision for a one-time limited use.
If neither circumstance applies, then unless Congress says otherwise, the equipment manufacturer should not be compelled to assist.
The bottom line is that the government should have an ability to compel companies to unlock encrypted devices for access to evidence of crimes, but should not be able to force companies to build electronic skeleton keys, new access tools and security vulnerabilities.
New York Times | 11.19.15
In this op-ed, Data & Society fellow Seeta Peña Gangadharan addresses the question “Can Crime Be Ethically Predicted?” and argues that bias is inherent in the technical systems used in predictive policing, leading to “fundamentally discriminatory” technologies.
Christian Science Monitor | 11.13.15
In this op-ed, D&S advisor Deirdre Mulligan argues:
Whether it’s Britain or the US insisting that tech companies develop a mechanism to give police or spy agencies access to encrypted communications, backdoors put everyone at risk.
Mulligan highlights the difficulty of controlling access to the backdoors that policy advocates are pushing for, citing instances of researchers seizing control of vehicles. She also points to the assertion by the vice chairman of the Joint Chiefs of Staff that the United States is especially vulnerable to the attacks that backdoors could facilitate, given our dependence on web-enabled devices.
blog post | 10.01.15
D&S fellow Mark Latonero and D&S researcher Monica Bulger participated in a joint statement prepared by the Centre for Justice and Crime Prevention (CJCP), Prof. Sonia Livingstone (Coordinator of EU Kids Online), the Data & Society Research Institute, the Child Rights International Network (CRIN), and the International Child Redress Project (ICRP). The statement seeks to contribute a children’s rights perspective to existing discussions and policies concerning child sexual exploitation in relation to information and communication technologies (ICTs).
While existing efforts to protect children online are well-intended, in practice, they might inadvertently infringe upon children’s other rights. Responses to date both at a policy level and within the household are largely based on perceptions of the dangers of the Internet, rather than the evidence. It is undoubtedly important to recognize the need for protection; but in devising responses to the risk of child sexual exploitation we must consider a) the range of other rights potentially undermined by exclusively protectionist policies, b) the evidence for specific risks of harm to particular groups, and then promote c) evidence-based interventions. Policies designed to address sexual exploitation in relation to ICTs should respect the full set of rights enshrined in the UN Convention on the Rights of the Child.
Particular care is needed to ensure that any one right (e.g. protection from sexual exploitation) is not seen to automatically justify restrictions on other rights (e.g. right to privacy, to access information, and to freedom of expression). In the face of rights violations as a result of inappropriate protection policies, children should have access to justice.
In this op-ed, Data & Society founder danah boyd argues that youth are addicted to interacting with their peers, not to screens. “If Americans truly want to reduce the amount young people use technology,” she writes, “we should free up more of their time.”
In this piece for Civic Hall’s Civicist, Samuel Woolley and D&S fellow Tim Hwang argue that “[t]he failure of the ‘good bot’ is a failure of design, not a failure of automation” and urge us not to dismiss the potential benefits of bots.
New York Times | 09.30.14
In this op-ed, Data & Society fellow Seeta Peña Gangadharan argues that privacy-as-default features “help restore public trust in technology as a tool to improve our lives and collectively self-govern.” “When technologies come to market with security and privacy baked in,” she writes, “they help users navigate an increasingly opaque digital landscape.”
Nude pictures of celebrities stolen from their own iCloud accounts. Facebook experimenting with the emotions in their users’ feeds. Google reading Gmail before their users do. Fitness trackers without privacy policies, vulnerable to security breaches, and bait-and-switch tactics to sell customers’ data. Almost every day there is a story about the gap between the expectations people have for their own data and what companies actually do with that data. To fix this gap, we first need to rethink the nature of data.
In this op-ed, D&S advisor Gina Neff confronts a root cause of some of the most controversial and upsetting technology-related incidents in recent memory: the way we think about data. Using research she conducted on how data might transform health care, Neff offers lessons on how companies can begin having productive conversations with users about their data.
TIME | 09.02.14
In this op-ed, Data & Society advisor Janet Vertesi highlights a systemic tension in our current understanding of data privacy: “We are all supposed to be solely responsible for our personal information, but at the same time we are all part of a social network of family, friends and services with whom we are expected to share.” She argues that personal data online is interpersonal data and that data privacy is a collective responsibility.
New York Times | 08.07.14
In this op-ed, Data & Society fellow Seeta Peña Gangadharan argues that the “rise of commercial data profiling is exacerbating existing inequities in society and could turn de facto discrimination into a high-tech enterprise.” She urges us to “respond to this digital discrimination by making civil rights a core driver of data-powered innovations and getting companies to share best practices in detecting and avoiding discriminatory outcomes.”
Los Angeles Times | 07.15.14
D&S fellow Karen Levy writes about the nation’s trucking system and the need for reform. Based on her three years of research around trucker’s compliance with federal regulations, she argues the changes must address root economic causes underlying a range of unsafe practices and that electronic monitoring is an incomplete solution to a serious public safety problem.
Truckers don’t work without sleep for dangerously long stretches (as many acknowledge having done) because it’s fun. They do it because they have to earn a living. The market demands a pace of work that many drivers say is impossible to meet if they’re “driving legal.”
If we want safer highways and fewer accidents, we must also attend to the economic realities that drive truckers to push their limits.
TIME | 05.01.14
In this op-ed, Data & Society advisor Janet Vertesi argues that opting out of personal data collection is not a simple or straightforward way to have privacy from marketers and technology companies. To demonstrate the high price of opting out, she discusses a personal experiment in which she attempted to conceal her pregnancy from the “bots, trackers, cookies and other data sniffers online that feed the databases that companies use for targeted advertising.” “When it comes to our personal data,” she writes, “we need better choices than either ‘leave if you don’t like it’ or no choice at all.”
Video of the Theorizing the Web panel in which Janet Vertesi presented her opt-out experiment: