Announcement

Data & Society Announces Eleven New Affiliates

May 24, 2023—Data & Society (D&S) is pleased to welcome eleven new affiliates, adding to an existing group of researchers, thinkers, and advocates who regularly collaborate with and lend their insight to the organization. The affiliates program is a key avenue for D&S to work in partnership with members of its extended network, all of whom share a vision of a future in which data-centric technologies are grounded in equity and human dignity. 

“The affiliates program at Data & Society is an important manifestation of our value of mutuality,” said Ania Calderon, the organization’s managing director of strategy and engagement. “We value reciprocal relationships, and look forward to nurturing deeper relationships with these affiliates and continuing to learn from each other’s expertise.” 

Affiliates may be involved in a funded research project at Data & Society or actively explore or develop future projects and engagements. They are encouraged to participate in events and programming at D&S outside their main research activities, and receive access to D&S’s Slack community, where they can share and view opportunities, participate in conversations, and contribute ideas and updates. Previous and current affiliates have produced reports and primers, and participated in events hosted by D&S.

The new affiliates are:

Robyn Caplan is currently a senior researcher at Data & Society and a founding member of the Platform Governance Research Network. In July 2023, she will be joining Duke University’s Sanford School of Public Policy and Center for Science & Society as assistant professor of tech policy. Robyn conducts research at the intersection of platform governance and media policy, examining the impact of inter- and intra-organizational behavior on platform governance and content moderation. Robyn will assume her role as an affiliate in July 2023.

Kim Fernandes is a joint doctoral candidate in anthropology and education at the University of Pennsylvania. As a researcher, writer, and educator, they are interested in how the body meets and moves through the world. Their current research, emerging from their dissertation project, lies at the intersections of disability, data, and governance. Through this project, they are particularly focused on the processes of enumerating and identifying disability in urban India.

Alyx Goodwin is deputy campaigns director at the Action Center on Race & the Economy (ACRE). She organizes with BYP100 (Black Youth Project 100) Chicago and works with a number of campaigns that seek to chip away at the local policing apparatus. Her writing and activism center on the momentum and challenges of building Black power and self-determination. Alyx’s work focuses on the relationship between the finance industry and policing, racialized capitalism, and the ways these forces exacerbate oppression.

Zehra Hashmi researches identification technologies in South Asia and their intersection with surveillance, kinship, and governance. Her current book project is a historical ethnography of Pakistan’s national identity database, following how this information system uses data as a kin-making substance to redefine who counts as kin and, ultimately, as a citizen. Zehra’s work brings an anthro-historical understanding to bear on debates concerning a central feature of life today: digital identification. She is an assistant professor in the department of history and sociology of science at the University of Pennsylvania and holds a PhD in anthropology and history from the University of Michigan.

Amanda Lenhart is head of research at Common Sense Media. Prior to joining Common Sense, she was the program director for Data & Society’s Health + Data team, leading research on health-related surveillance of essential workers, Silicon Valley’s myths about healthy tech, and how social media company workers think about and enact digital well-being for kids in the product design process. Previously, Amanda was the deputy director for the Better Life Lab at New America, and a senior research scientist at a collaboration between the Associated Press and NORC. She started her career at the Pew Research Center, where she helped to found the internet team and pioneered and led research on teens and families for 16 years. 

Elena Maris is an assistant professor in the department of communication at the University of Illinois at Chicago. Her research is concerned with media, technology, and society. She is particularly interested in the ways media/tech industries and their audiences or users try to understand and influence one another. Elena also studies how people experience and leverage their identities through popular culture and the internet. She has expertise in user/industry relations, privacy, the politics of platforms and AI, audience measurement and social media metrics, digital fandom and online communities, and marginalized users. 

Emanuel Moss is a sociotechnical systems research scientist at Intel Labs, where he is studying the social practices of integrated circuit design, responsible AI techniques, and the role of hardware in AI development. Previously, he was a researcher for Data & Society’s AI on the Ground initiative, where he conducted empirical research on the formation of AI ethics and developed a program of study focused on algorithmic accountability. He also studies the role of machine learning in knowledge production and the social dimensions of machine intelligence. 

Samir Passi is a social scientist with an interdisciplinary background in information science, AI, and STS. He works at the intersection of AI practice and research, engaging with AI practitioners to investigate the sociotechnical, organizational, and ethical aspects of AI development. He is a researcher at Kforce, through which he works on Microsoft’s AI Ethics and Effects in Engineering and Research (AETHER) team. There, he focuses on the challenges and opportunities for responsible AI design and development.

Dibyadyuti Roy is a lecturer (assistant professor) in cultural studies, media studies, and digital humanities, as well as the programme director of the BA in cultural and media studies at the University of Leeds. With experience as an educator in multidisciplinary academic environments across three continents, Dibya’s research and teaching examine cultural narratives of and about dominant technologies, ranging from nuclear weapons to artificial intelligence, with a particular focus on algorithmically driven platforms. He is a founding member of India’s first digital humanities collective, the Digital Humanities Alliance for Research and Teaching Innovations (DHARTI), and a member of the Alliance of Digital Humanities Organizations’ Intersectional Inclusion Task Force.

Murali Shanmugavelan is a researcher with the Fairwork project at the Oxford Internet Institute, University of Oxford. His academic research is concerned with the disavowal of protected categories such as caste, race, gender, and sexual orientation in media and communication studies and digital cultures. His research at Fairwork builds on this academic training and activism to scrutinize and mitigate re-manifestations of digital inequalities in platform economies.

Hannah Zeavin is a scholar, writer, and editor whose work centers on the history of human sciences (psychoanalysis, psychology, and psychiatry), the history of technology and media, feminist science and technology studies, and media theory. Starting in July 2023, Hannah will be an assistant professor of the history of science in the department of history and the Berkeley Center for New Media at UC Berkeley.

Affiliates are formally nominated by current D&S staff, and approved by senior leadership. Their status is reviewed on a yearly basis.

Read all the affiliates’ full bios here.

About Data & Society

Data & Society is an independent nonprofit research organization studying the social implications of data-centric technologies and automation. We recognize that the same innovative technologies that may benefit society can also be abused to invade privacy, provide new tools of discrimination, foreclose opportunity, and harm individuals and communities. Through original research and inclusive engagement, we work to ensure that empirical evidence and respect for human dignity guide how technology is developed and governed.