
Onward

Data & Society was founded on an awareness that data-intensive technologies are powerful and profoundly reconfiguring society in ways that are, ultimately, political. As a research organization, we’re committed to understanding what’s at stake and informing stakeholders, making visible the trade-offs and values that are embedded in decisions around the development and adoption of technology.

In the wake of the US presidential election, our purpose is even more pressing. We believe that research can serve as a check on power, an especially important check in this moment, as we know that data-intensive technologies are used both to seek and to obscure truth.

While Data & Society is not an advocacy organization, our values are rooted in a commitment to social justice and in a belief that, as researchers and educators, our mandate is to understand sociotechnical systems and communicate our findings so that we may collectively address bias and discrimination and develop possibilities for accountability.

The technologies we study are increasingly woven into every area of society. As a result, in our short tenure, we have worked with people representing many different, and often opposing, political and ideological backgrounds. We take pride in helping people identify the differences among themselves and commit to using grounded knowledge to seek solutions. We have found that, despite differences in approach, those we work with share a commitment to civil rights, the rule of law, and a fair and equitable society.

Last week’s election has left many in our organization and broader community uncertain and anxious, both personally and professionally. We welcome political disagreement, but we are not willing to see hate speech and cruelty validated. We fear for vulnerable individuals and communities who are now under increased threat because of the color of their skin or the place of their birth. And we decry the misogyny that has been present throughout the president-elect’s campaign.

As social scientists, we’re struggling with what we know about the costs of anxiety and inequality to the health of a society, and with gaining purchase on the part new technologies play in social dynamics. As computer scientists, we’re grappling with how much we know about how statistics and data-driven systems can be used to exclude, segment, and misinform. As historians, we can’t help but think back to periods when data was used to target particular religious and ethnic communities and to engineer systems that optimize efficiency over equity. As legal scholars, we’re troubled by how much we know about how limited the law is in protecting people’s privacy in the face of digital surveillance.

We have long argued that fear and hype are counterproductive, even though conversations around technology very often begin there. A commitment to knowledge is more important now than ever. No technology is neutral or free of its social context. No information is neutral. No reporting is neutral.

We owe it to ourselves and to the public to make certain that we inform, educate, and empower people who continue to work towards a world that is just and fair, ethical and responsible. That has always been our goal as a research institute, and, in the wake of this election, it is more critical than ever.

We invite you to join us.

In addition to conducting research, Data & Society aims to help grow a network that can anticipate issues and offer insight and direction. Let us know what you’re working on and what issues you’re digging into; we’re listening at: onward at datasociety dot net.

—Data & Society