Featured

Filtered by: op-ed


The Guardian | 06.01.18

The Case for Quarantining Extremist Ideas

danah boyd, Joan Donovan

Data & Society President and Founder danah boyd and Media Manipulation Research Lead Joan Donovan challenge newsrooms to practice “strategic silence” to avoid amplifying extremist messaging.

“Editors used to engage in strategic silence – set agendas, omit extremist ideas and manage voices – without knowing they were doing so. Yet the online context has enhanced extremists’ abilities to create controversies, prompting newsrooms to justify covering their spectacles. Because competition for audience is increasingly fierce and financially consequential, longstanding newsroom norms have come undone. We believe that journalists do not rebuild reputation through a race to the bottom. Rather, we think that it’s imperative that newsrooms actively take the high ground and re-embrace strategic silence in order to defy extremists’ platforms for spreading hate.”


For Slate, Data & Society Researcher Jacob Metcalf argues that we should be more concerned about the behavioral models developed by firms like Cambridge Analytica, which can be traded between political organizations, than about the voter data itself.

“In other words, the one thing we can be sure of about psychographic profiling is that it provided one more way to transfer knowledge and economic value between campaigns and organizations.”


In the wake of Cambridge Analytica, Data & Society Researcher Jacob Metcalf argues that the real risk is the behavioral models that have been developed from Facebook users’ data.

“But focusing solely on the purloined data is a mistake. Much more important are the behavioral models Cambridge Analytica built from the data. Even though the company claims to have deleted the data sets in 2015 in response to Facebook’s demands, those models live on, and can still be used to target highly specific groups of voters with messages designed to leverage their psychological traits. Although the stolen data sets represent a massive collection of individual privacy harms, the models are a collective harm, and far more pernicious.”


Data & Society Postdoctoral Scholar Julia Ticona and Data & Society Research Analyst Alexandra Mateescu co-authored an op-ed for Fast Company about the safety of workers who rely on digital platforms to stay employed.

“For the past year, we’ve been interviewing nannies, babysitters, elder care workers, and housecleaners across the U.S. who use platforms like Handy, TaskRabbit, and the in-home care provider platform Care.com to do care and cleaning work, in an effort to better understand how platforms are shaping domestic work. Along the way, we have found that, in many cases, the aggregation of individual data leads not to more accountability and justice, but rather forces workers to make trade-offs between visibility and vulnerability.”


Data & Society Researcher Jacob Metcalf co-authored an op-ed in Slate discussing how giving researchers more access to Facebook users’ data could prevent unethical data mining.

“This case raises numerous complicated ethical and political issues, but as data ethicists, one issue stands out to us: Both Facebook and its users are exposed to the downstream consequences of unethical research practices precisely because, like other major platforms, the social network does not proactively facilitate ethical research practices in exchange for access to data that users have consented to share.”


D&S researcher Claire Fontaine explores issues of accessibility in NYC schools.

“So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1700 or so schools…Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.

But for some families, the choice process begins and ends with the question: Is the building fully accessible?”


D&S INFRA Lead Ingrid Burrington investigates community network infrastructures in times of disaster.

“By design, resilient network infrastructure prioritizes interdependence and cooperation over self-sufficiency — without strong underlying social ties, there is no localized network infrastructure. The technical complexities of building a network are a lot easier to overcome than the political complexities of building community, political agency, and governance.”


Washington Monthly | 06.13.17

Code of Silence

Rebecca Wexler

D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.

What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.


In Slate, D&S lawyer-in-residence Rebecca Wexler analyzes the unreliability of video authentication.

When forensic scientists refuse to reveal details about how their experimental methods work, they erode trust in the ideal of scientific objectivity, and in the legitimacy of their results. There is already a dearth of trust surrounding forensic sciences. Just last fall, President Obama’s Council of Advisors on Science and Technology reported that even some long-practiced forensic disciplines, like bite-mark analysis and some methods for analyzing complex mixtures of DNA, are not foundationally valid.


D&S affiliate Keith Hiatt, Michael Kleinman, and D&S researcher Mark Latonero think critically about the use of technology as an all-encompassing solution in human rights spaces.

It’s important to acknowledge that, most of the time, the underlying problem human rights organisations are trying to solve isn’t technical. It’s often a bureaucratic, institutional, process or workflow problem, and technology won’t solve it (and might exacerbate it).


D&S affiliate Mimi Onuoha details the process of completely deleting data.

This overwriting process is a bit like painting a wall: If you start with a white wall and paint it red, there’s no way to erase the red. If you want the red gone or the wall returned to how it was, you either destroy the wall or you paint it over, several times, so that it’s white again.
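Onuoha’s wall-painting analogy maps onto what secure-deletion tools actually do: rather than merely unlinking a file, which leaves its bytes on disk, they overwrite its contents first. Below is a minimal Python sketch of that idea; the function is our illustration rather than anything from Onuoha’s piece, and on modern SSDs wear leveling means an overwrite may not reach every physical copy of the data.

```python
# Illustrative only: overwrite a file's bytes ("repaint the wall") before deleting it.
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes several times, then remove it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)                  # start each pass at the beginning
            f.write(os.urandom(size))  # a fresh coat of random "paint"
            f.flush()
            os.fsync(f.fileno())       # force the overwrite onto disk
    os.remove(path)                    # only now unlink the file
```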


Deutsche Welle | 01.25.17

Fake news is a red herring

Ethan Zuckerman

D&S advisor Ethan Zuckerman writes about fake news and the bigger problem behind fake news.

The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news makes that world slightly easier to navigate, but it doesn’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.


Harper’s Bazaar | 01.23.17

Trump: A Resister’s Guide

Kate Crawford

D&S affiliate Kate Crawford writes a letter to Silicon Valley on how to resist Trump.

You, the software engineers and leaders of technology companies, face an enormous responsibility. You know better than anyone how best to protect the millions who have entrusted you with their data, and your knowledge gives you real power as civic actors. If you want to transform the world for the better, here is your moment.



D&S founder danah boyd’s Points piece was republished by The Guardian. boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends toward content personalization on social media continue to fragment Americans along ideological lines.


D&S advisor Charlton D. McIlwain writes an op-ed about the new white political movement.

We must see the new white narrative for what it really is, an attempt to refocus public attention and political capital away from people of color. Trump and many of those he’s chosen to lead his administration are the new white’s principal ambassadors. They take stock of the last few years as blacks fought against police brutality, Muslims battled religious persecution, and Hispanics defended themselves and their families from mass deportations. These representatives of the new white respond: Your concerns don’t matter as much as working-class folk (white people) for whom America’s promise was designed but has been denied.


D&S fellow Zara Rahman donates part of her Shuttleworth Flash Grant to the Human Rights Data Analysis Group and Global Voices.

So here’s the thing: I don’t think we need only innovative ideas or world-changing projects. We also need trust, communities, and skills. We need to strengthen and support existing infrastructure and communities. I worry that we’ve become far too fixated upon quickly implemented innovation and disruption, and that we’re taking a lot of important things for granted—things we rely upon that, unlike “innovative ideas”, take a lot of time and effort to build.


D&S affiliate Mimi Onuoha profiles the Asian American Performers Action Coalition (AAPAC), a coalition of Broadway and off-Broadway performers, and its efforts “to track racial demographic data in the industry.”


D&S advisor Ethan Zuckerman wrote an op-ed detailing the issue of normalizing the abnormal.

My deep fear is that there’s no single set of Hallin’s spheres anymore. What’s consensus to a Trump supporter may be deviant to a Clinton supporter and vice versa. We now face an online media landscape so diverse and fragmented that each of us finds big enough spheres of legitimate controversy that we think we’re seeing a real debate at work.


D&S fellow Alice E. Marwick wrote this op-ed discussing how online harassment disproportionately impacts women and minorities.

In a divisive time for American society, it’s crucial that everyone is heard. Social media companies need to take a stand and ensure that destructive online behavior doesn’t turn people away from sharing their voices.


D&S researcher Robyn Caplan writes in The New York Times about fake news and Facebook.

Media organizations were especially left behind when Facebook changed its algorithm in June to privilege friends and family over major publishers, which happens to have occurred just prior to the spike in fake news before the election. Facebook must create institutionalized pathways for journalists and policymakers to help shape any further changes to the algorithm.


D&S advisor Baratunde Thurston jumps into the debate over the place of empathy in the wake of Trump’s election.


MIT Technology Review | 11.17.16

How to Hold Algorithms Accountable

Nicholas Diakopoulos, Sorelle Friedler

D&S affiliate Sorelle Friedler and Nicholas Diakopoulos discuss five principles for holding algorithmic systems accountable.

Recent investigations show that risk assessment algorithms can be racially biased, generating scores that, when wrong, more often incorrectly classify black defendants as high risk. These results have generated considerable controversy. Given the literally life-altering nature of these algorithmic decisions, they should receive careful attention and be held accountable for negative consequences.


D&S affiliate Gideon Lichfield wrote this op-ed about the recent US presidential election.


D&S founder danah boyd wrote a critical piece entitled ‘Why Social Science Risks Irrelevance’ questioning whether social science as an academic field is headed in the right direction.

When we believe that we have a monopoly on asking important questions, we do ourselves a disservice. It’s dangerous that we think that basic research starts with the questions we find important rather than trying to understand the knowledge that society is missing. Peer review suggests that we are the only ones who have purchase over whether a study is valuable enough to fund or publish. True impact will never be achieved by trying to keep within an ivory tower. Impact requires being deeply embedded within the social world that we seek to understand and recognizing that the key to success is to inform and empower through knowledge.


D&S advisor Ethan Zuckerman defends the use of video recordings of police officers.

If video doesn’t lead to the indictment of officers who shoot civilians, are we wrong to expect justice from sousveillance? The police who shot Castile and Sterling knew they were likely to be captured on camera—from their police cars, surveillance cameras, and cameras held by bystanders—but still used deadly force in situations that don’t appear to have merited it. Is Mann’s hope for sousveillance simply wrong?

Not quite. While these videos rarely lead to grand jury indictments, they have become powerful fuel for social movements demanding racial justice and fairer policing. In the wake of Sterling and Castile’s deaths, protests brought thousands into the streets in major U.S. cities and led to the temporary closure of interstate highways.


Slate | 06.16.16

Letting autopilot off the hook

Madeleine Clare Elish

D&S researcher Madeleine Clare Elish discusses the complexities of error in automated systems. Elish argues that the human role in automated systems has become ‘the weak link, rather than the point of stability’.

We need to demand designers, manufacturers, and regulators pay attention to the reality of the human in the equation. At stake is not only how responsibility may be distributed in any robotic or autonomous system, but also how the value and potential of humans may be allowed to develop in the context of human-machine teams.


D&S Researcher Robyn Caplan considers whether Facebook is saving journalism or ruining it:

The question of whether Facebook is saving or ruining journalism is not relevant here because, like it or not, Facebook is a media company. That became more apparent recently as human editors became a visible part of Facebook’s news curation process. In truth, this team is only a tiny fraction of a network of actors whose decisions affect the inner workings of Facebook’s platform and the content we see.


D&S Advisor Joel Reidenberg considers the scope of the court order compelling Apple to provide “reasonable technical assistance” to help the government hack into the locked iPhone of one of the San Bernardino attackers.

In short, for government to legitimately circumvent device encryption through a court order, legal authorization to access the contents of the device (typically through a judicial warrant) is necessary. Then, if the equipment manufacturer has control over the encryption, the decryption should be performed by the manufacturer with the results provided to the government.

If, instead, the equipment manufacturer only has control over information necessary to decrypt the device, the information should be provided to the government under strict court seal and supervision for a one-time limited use.

If neither circumstance applies, then unless Congress says otherwise, the equipment manufacturer should not be compelled to assist.

The bottom line is that the government should have an ability to compel companies to unlock encrypted devices for access to evidence of crimes, but should not be able to force companies to build electronic skeleton keys, new access tools and security vulnerabilities.


In this op-ed, Data & Society fellow Seeta Peña Gangadharan addresses the question “Can Crime Be Ethically Predicted?” and argues that bias is inherent in the technical systems used in predictive policing, leading to “fundamentally discriminatory” technologies.


In this op-ed, D&S advisor Deirdre Mulligan argues:

Whether it’s Britain or the US insisting that tech companies develop a mechanism to give police or spy agencies access to encrypted communications, backdoors put everyone at risk.

Mulligan highlights the difficulty of controlling access to the backdoors that policy advocates are pushing for, citing instances of researchers seizing control of vehicles. She also points to the assertion by the vice chairman of the Joint Chiefs of Staff that the United States, due to its dependence on web-enabled devices, is especially vulnerable to the attacks that backdoors could facilitate.


D&S fellow Mark Latonero and D&S researcher Monica Bulger participated in a joint statement prepared by the Centre for Justice and Crime Prevention (CJCP), Prof. Sonia Livingstone (Coordinator of EU Kids Online), the Data & Society Research Institute, the Child Rights International Network (CRIN), and the International Child Redress Project (ICRP). The statement seeks to contribute a children’s rights perspective to existing discussions and policies concerning child sexual exploitation in relation to information and communication technologies (ICTs).

While existing efforts to protect children online are well-intended, in practice they might inadvertently infringe upon children’s other rights. Responses to date, both at a policy level and within the household, are largely based on perceptions of the dangers of the Internet, rather than the evidence. It is undoubtedly important to recognize the need for protection; but in devising responses to the risk of child sexual exploitation we must consider a) the range of other rights potentially undermined by exclusively protectionist policies, b) the evidence for specific risks of harm to particular groups, and then promote c) evidence-based interventions. Policies designed to address sexual exploitation in relation to ICTs should respect the full set of rights enshrined in the UN Convention on the Rights of the Child.

Particular care is needed to ensure that any one right (e.g. protection from sexual exploitation) is not seen to automatically justify restrictions on other rights (e.g. right to privacy, to access information, and to freedom of expression). In the face of rights violations as a result of inappropriate protection policies, children should have access to justice.


In this op-ed, Data & Society founder danah boyd argues that youth are addicted to interacting with their peers, not to screens. “If Americans truly want to reduce the amount young people use technology,” she writes, “we should free up more of their time.”


Civicist | 05.14.15

Bring on the Bots

Samuel Woolley, Tim Hwang

In this piece for Civic Hall’s Civicist, Samuel Woolley and D&S fellow Tim Hwang argue that “[t]he failure of the ‘good bot’ is a failure of design, not a failure of automation” and urge us not to dismiss the potential benefits of bots.


In this op-ed, Data & Society fellow Seeta Peña Gangadharan argues that privacy-as-default features “help restore public trust in technology as a tool to improve our lives and collectively self-govern.” “When technologies come to market with security and privacy baked in,” she writes, “they help users navigate an increasingly opaque digital landscape.”


In this op-ed, D&S advisor Gina Neff confronts a root cause of some of the most controversial and upsetting technology-related incidents in recent memory: the way we think about data. Drawing on research she conducted on how data might transform health care, Neff offers lessons on how companies can begin having productive conversations with users about their data.

Nude pictures of celebrities stolen from their own iCloud accounts. Facebook experimenting with the emotions in their users’ feeds. Google reading Gmail before their users do. Fitness trackers without privacy policies, vulnerable to security breaches, and bait-and-switch tactics to sell customers’ data. Almost every day there is a story about the gap between the expectations people have for their own data and what companies actually do with that data. To fix this gap, we first need to rethink the nature of data.


In this op-ed, Data & Society advisor Janet Vertesi highlights a systemic tension in our current understanding of data privacy: “We are all supposed to be solely responsible for our personal information, but at the same time we are all part of a social network of family, friends and services with whom we are expected to share.” She argues that personal data online is interpersonal data and that data privacy is a collective responsibility.


In this op-ed, Data & Society fellow Seeta Peña Gangadharan argues that the “rise of commercial data profiling is exacerbating existing inequities in society and could turn de facto discrimination into a high-tech enterprise.” She urges us to “respond to this digital discrimination by making civil rights a core driver of data-powered innovations and getting companies to share best practices in detecting and avoiding discriminatory outcomes.”


In this op-ed, Data & Society fellow Karen Levy discusses mandating electronic monitoring of truck drivers as a way to address unsafe practices in trucking. She argues that “electronic monitoring is an incomplete solution to a serious public safety problem. If we want safer highways and fewer accidents, we must also attend to the economic realities that drive truckers to push their limits.”


D&S fellow Karen Levy writes about the nation’s trucking system and the need for reform. Based on her three years of research on truckers’ compliance with federal regulations, she argues that reform must address the root economic causes underlying a range of unsafe practices, and that electronic monitoring is an incomplete solution to a serious public safety problem.

Truckers don’t work without sleep for dangerously long stretches (as many acknowledge having done) because it’s fun. They do it because they have to earn a living. The market demands a pace of work that many drivers say is impossible to meet if they’re “driving legal.”

If we want safer highways and fewer accidents, we must also attend to the economic realities that drive truckers to push their limits.


In this op-ed, Data & Society advisor Janet Vertesi argues that opting out of personal data collection is not a simple or straightforward way to have privacy from marketers and technology companies. To demonstrate the high price of opting out, she discusses a personal experiment in which she attempted to conceal her pregnancy from the “bots, trackers, cookies and other data sniffers online that feed the databases that companies use for targeted advertising.” “When it comes to our personal data,” she writes, “we need better choices than either ‘leave if you don’t like it’ or no choice at all.”

[Video: the Theorizing the Web panel in which Janet Vertesi presented her opt-out experiment.]

