Announcement

Trust Issues

Perspectives on Community, Technology, and Trust

Call for Proposals

March 21 and 28, 2024
Hosted Online

Theme

Trust, it seems, is in decline: In the United States, trust in religious institutions, Congress, banks, and the media — to name just a few examples — has been decreasing at least since the General Social Survey began measuring Americans’ confidence in these institutions in the 1970s. Across the world, research shows that levels of trust in government processes vary widely according to context and political formation. And trust in the technology industry, oversold to us thanks to the prestige of quantification, has experienced a precipitous downturn over the last several years, even when compared to government, nonprofit, and other commercial institutions.

Perhaps because of this, there is a growing focus on building trust in media, in government, and in AI systems. When it comes to data-centric technologies, this raises important questions, including: Can trust be built into systems that users have determined to be untrustworthy? Should we think of trust as something that is declining or improving, as something to be built into AI and other data-centric systems, or as something produced through a set of relations and in particular locations? Where else, besides large institutions and their technologies, is trust located? How do other frames of trust produce community-centered politics, such as politics of refusal or data sovereignty? What can community-based expertise tell us about how trust is built, negotiated, and transformed within and to the side of large-scale systems? Is there a disconnect between proposed solutions to a broad lack of trust and how social theorists, community members, and cultural critics have thought about trust?

Trust is deeply relational (Scheman 2020; Knudsen et al. 2021; Baier 1986) and has been understood in terms of the vulnerabilities inherent in relationships (Mayer et al. 1995). Yet discussions about trust in AI systems often reveal a lack of understanding of the communities whose lives they touch — their particular vulnerabilities, and the power imbalances that further entrench them. Some populations are expected to simply put their trust in large AI systems. Yet those systems only need to prove themselves useful to the institutions deploying them, not trustworthy to the people enmeshed in their decisions (Angwin et al. 2016; O’Neill 2018; Ostherr et al. 2017). At the same time, researchers often stop at asking whether we can trust algorithms, rather than extending the question of trust to the institutions that feed data into or deploy these algorithms.

This workshop will examine alternative formulations of trust, data, and algorithmic systems by widening the frame on the contexts, approaches, and communities implicated in them.  

 

What We're Looking For

This academic workshop will bring together those who are investigating trust and digital technologies from various angles, disciplinary traditions, and global perspectives. We will learn how different disciplines and epistemologies understand trust, with the goal of developing theories that specify how trust — as well as mistrust — shapes how data-centric technologies unfold and should unfold. In our work together, we aim to move away from a concept of trust that is inherent to the object (e.g., information as trustworthy) or a concept of trust that is overly normative (prescribing trust as a goal that should be achieved), and toward a concept of trust as a relational process. We will work toward an empirical grounding of how trust is stymied, broken, established, reestablished, co-opted, and redirected among the powerful and among communities who have never been able to fully trust the institutions that shape their lives.

We hope this workshop will appeal to scholars of trust and mistrust across anthropology, sociology, STS, media and communication, and law, with a special focus on Indigenous technologies, race and medicine, and the majority world. We also encourage the participation of practitioners, creative designers, technologists, and community members who think about trust outside the confines of academic scholarship.

Focus

We invite participants to stake out a perspective on trust and its relationship to emerging technologies and to present it with the goal of bridging different understandings of trust. Choosing from a series of categories, participants will provide one canonical piece, one recent formulation, and one example or case study (published or unpublished). Categories will include experts, communities, quantification, intelligence, relationality, and embodiment. A review committee will select participants from an open call.

The workshop will begin with a day dedicated to discussing the concept of trust as explored through the submissions. Participants will then have one week to reflect on the discussion and draft a presentation of their collective formulation of trust. The workshop will resume for a second day-long session in which participants will present their formulations of trust while an artist creates a live illustration of each. The workshop will culminate in a written collection of 1,500-word essays.

How to Apply

APPLICATION PERIOD HAS CLOSED

If you are interested in attending, please submit the following information via the application form:

  • Name, email address, affiliation, title.
  • Bio or link to work.
  • Your three favorite books/papers/websites related to the topic of the workshop.
  • A 250–300 word project/contribution description. How does your work contribute to reframing trust and technology?

Please contact [email protected] with any questions.

Key Dates

Application deadline: January 19, 2024

Selection notification: February 16, 2024

RSVP deadline: February 22, 2024

Event program circulation: Week of March 4, 2024

Workshop Day 1: March 21, 2024

Workshop Day 2: March 28, 2024

Acknowledgments

“Trust Issues” is organized by the Trustworthy Infrastructures team of Sareeta Amrute, Livia Garofalo, Robyn Caplan, Joan Mukogosi, Tiara Roxanne, and Kadija Ferryman, with additional support from Tunika Onnekikami. Visit the Trustworthy Infrastructures research page to learn more.

This workshop is part of our exploration of public participation, curated by Participatory Methods Researcher Meg Young. In a piece on Points, she and D&S collaborators explain more about what our exploration looks like and why participatory methods are so important.