Imagining Alternatives

Insights from the Data for Black Lives Conference

Anuli Akanegbu reflects on the connections between data practices and systemic racism, and offers resolutions to guide work that prioritizes the humane over the artificial.

January 15, 2025

As a cultural anthropologist who leads research on labor, race, and technology for Data & Society’s Labor Futures program, my work has always been concerned with reclaiming and reframing narratives to present Black people as producers of knowledge and data, not just consumers. Near the end of 2024, I accepted an invitation from the Data for Black Lives movement to connect and collaborate with a dynamic group of activists, artists, organizers, and scientists (which included my Data & Society colleagues) at the Pérez Art Museum in Miami. It was a gathering of bodies, minds, and spirits where the goal was to “decompress.” The purpose of decompression, Data for Black Lives Founder and CEO Yeshimabeit Milner explained on the organization’s website, is “to find alternatives.” Taking a respite from our everyday routines not only allows us to breathe and relieve pressure, but to reinvent ourselves and envision new worlds.

The panels, workshops, and creative programming at this year’s Data for Black Lives conference, the organization’s third, enabled enlightening discussions about the connections between data practices and systemic racism that resonated with me deeply. Together, they amounted to a set of recommendations — or resolutions — that will help guide my work in 2025, and I hope might inspire yours.

Recognize Bias in AI and Understand How It Operates

Underscoring the historical and systemic injustices that affect data representation, criminologist and data science professor Renee Cummings uses a trauma-informed approach to center Black experiences in discussions about technology and data. Central to this approach, she explained during the opening plenary, is the question of whether we are using new technologies to modernize old racial typologies. “Data sets do not forget,” Cummings noted. 

AI and other automated technologies can exacerbate existing inequalities in employment, education, health, and housing, to name just a few areas. As Katya Abazajian, founder of the Local Data Futures Initiative, explained in a panel on housing, “All data is biased. There’s no data without bias, there’s just data with context.” Panelist Rasheedah Phillips, director of housing for PolicyLink, added, “Bias is not limited to bad actors, it’s embedded in the systems we use.” 

Reorient Approaches to Center Community

Throughout the conference, speakers underscored the importance of physical spaces for community engagement. Arts institutions and churches, for example, were hailed as hubs for organizing, education, and technology access that not only foster a sense of safety and belonging, but enable community members to meaningfully engage with technology. “What if we centered our historic, old institutions instead of tech companies and made them feel safe?” Dr. Fallon Wilson, co-founder of the #BlackTechFutures research institute, asked in her opening keynote. “We need physical brick and mortar institutions, visible spaces where we can breathe, reflect and assure that all Black communities not just have access to the internet, but have the freedom to thrive.”

Prioritizing community needs in data governance was another recurring theme. When it comes to the development and implementation of AI technologies, critical public interest technologist Dr. Jasmine McNealy recommended working with communities to craft their own data trusts, so the community can control how their data is collected, used, and shared. There was also broad agreement on the need to establish accountability mechanisms for companies that use AI, so that they are held responsible for the impacts of their technologies on marginalized communities. 

Reframe Research Questions and Experiment with Creative Methods

Speaking on a health-focused panel, “Our Bodies Keep Score,” Data & Society Research Analyst Joan Mukogosi highlighted that amid so much data about Black dying, more data is needed about Black living. In the same session, medical anthropologist Dr. Chelsey R. Carter explained that a shift in perspective can lead to more constructive and solution-oriented research. She recommended reframing the focus of research questions from deficit-based inquiries (asking, for example, why certain groups have worse outcomes) to asset-based inquiries (asking instead about what factors contribute to better outcomes for other groups) as a way to identify systemic issues rather than perceived individual shortcomings. 

Dr. AJ Christian, a communications scholar, introduced the concept of ancestral intelligence as an alternative to artificial intelligence, reflecting, “I went back to my ancestry and found my method.” Ancestral intelligence, he explained, is a means to address and heal from the impacts of capitalist systems by centering Black experiences in discussions about technology and data. Dr. Christian encouraged us to reflect on our connection to nature, explaining that feelings of anxiety arise when people believe that humans are separate from the natural world rather than dependent on it.

Multiple speakers emphasized the value of creative storytelling as a way to understand and interpret data, to humanize it, and to advocate for more equitable practices; they encouraged sharing personal narratives that illustrate the real-life impacts of data trends. Noting that “the personal is political,” AI ethicist and PhD candidate Marie-Therese Png stressed the value of coming to research from a personal perspective and encouraged researchers to reflect on their own experiences and consider how they relate to broader societal issues.

In her closing keynote, Dr. Ruha Benjamin emphasized the transformative power of abundant imagination in shaping a more equitable future. The future, she said, is a reflection of our current choices, and we will need to bridge the critical and the creative in order to build “us-topias.” She encouraged us to treat the imagination as a muscle, and to use it as a resource to seed what we want rather than just uproot what we don’t.

As we begin a new year, these insights from the Data for Black Lives conference serve as a clarion call for action and a reminder to decompress. Let us take these recommendations to heart, prioritizing the humane over the artificial as we strive to build a better world through our individual and collective data practices in 2025 and beyond.