All the Lonely People

On Being Alone with Digital Companions

Researchers Livia Garofalo and Briana Vecchione reflect on what studying AI chatbots reveals about emotional attachment to machines, the power of illusion, and the norms and expectations that define care itself.

December 1, 2025

“Loneliness is difficult to confess.” — Olivia Laing, The Lonely City: Adventures in the Art of Being Alone 

When we began studying how people use large language models for companionship and emotional or therapeutic support, we often asked when during their day they turned to chatbots. Again and again, participants gave the same answer: “When I’m alone.”

In increasingly confessional admissions of loneliness, our participants described how they interacted with chatbots late at night or in the in-between moments of their day: after a work meeting, before a date, or during bouts of insomnia. For some, being alone was the challenge the chatbot was helping them overcome; for others, the chatbot functioned as a relief and refuge from life’s overstimulation and relentless expectations. 

Being alone often conjures the quiet ache of isolation: a roommate behind a closed door, a partner half-present on their phone, a workday filled with digital chatter but no real conversation, strangers passing by without a glance. But solitude is more than physical separation. It also means confronting the parts of ourselves that digital life simultaneously tries to expose and suppress. 

In The Lonely City, her memoir about living and walking alone in New York, Olivia Laing observes that “loneliness feels like such a shameful experience, so counter to the lives we are supposed to lead, that it becomes increasingly inadmissible, a taboo state whose confession seems destined to cause others to turn and flee.”

What safer confidant, then, than a chatbot — one that never flees, flinches, or turns away — to admit what feels otherwise impossible? 

The Talking Mirror

If loneliness is the condition, these systems become the mirror through which it speaks. People find chatbots appealing because the interaction lets them express themselves while remaining in control. Participants frequently described relief at not having to worry about judgment, misinterpretation, or emotional burden. One person said, “For me, it’s easier to talk to a robot…[because] I could talk to it almost nonstop. It helped to dissect and pick my own brain.”

In these accounts, the chatbot offers a private space for the feelings people struggle to share, something closer to a confessional than a conversation. It is akin to a journal that talks back. This “auto-intimacy,” as scholar Hannah Zeavin calls it, mirrors a broader technopolitics of the self. We use technologies to curate the conditions under which we can be honest. For users who have endured stigma, trauma, or the inaccessible and unequal healthcare system, the ability to speak without consequence can feel profound.

This control is also what makes the space fragile. While a chatbot can reflect the tone and boundaries the user sets, like a talking mirror, it cannot surprise, challenge, or comfort beyond what the prompt allows. Over time, the conversation can become an echo of one’s own scripts: therapeutic in form, but solipsistic in effect.

Being digitally alone, then, is not merely a social condition but a method of relating to oneself. In conversation with a chatbot, loneliness becomes visible as this existential condition is translated into text and archived, interaction after interaction. For some, this helps unfurl patterns they could not otherwise name; for others, it surfaces feelings they would rather keep private. Technology mediates a fundamental, familiar act: talking to oneself via another.

And what begins as a private exchange soon reveals itself as a social system, reflecting both the self and the architectures of isolation that shape us all.

Different Kinds of Companions

A now familiar late-capitalist ethos celebrates independence while quietly starving our need for connection. Worth is measured in productivity rather than presence. Many of us spend our workdays in front of screens. Friendships are built and sustained through text bubbles. Private companies find ways to profit from our loneliness. Even therapy, for many, has migrated to telehealth video calls and apps.

People in the United States now eat alone more than ever before, sharing meals with one another only sporadically. One participant, who had spent time outside the US, noted that a particularly American set of conditions fomented their need for chatbot interaction:

“For a culture like this country, I feel like I’m spending a lot of my time by myself. So yes, I’m turning more and more to [ChatGPT], and I think the loneliness and the isolation has a lot to do with that. […]  That’s what they say, that being alone is compared to smoking fifteen cigarettes a day. So in a lot of ways, I have seen life here to be like that. Doesn’t matter how much money you make or anything. It’s not fulfilling because there’s no real human interaction. And we need that.”

The irony is apparent even in the language we use. The word companion comes from the late Latin “cum panis” — someone with whom we break bread. These days, many of us have fewer and fewer “bread fellows,” yet we are turning to digital “companions.”

Witnessing Aloneness: The Ethics of Listening

During our interviews, participants often began with a disclaimer: they knew the chatbot wasn’t real, that it didn’t truly think or feel. Yet, even with this awareness, there was something tender in how they described the relief, comfort, and freedom to say things they couldn’t say elsewhere. 

These stories complicate the many public narratives that dismiss AI companionship as something that invariably leads to confusion, dependency, or psychosis. People who use chatbots for reflection are not naïve, and some understand the technology’s limits more intimately than its critics do. While our participants know the chatbot is neither “real” nor “intelligent,” they also know that the feelings it elicits in them are genuine. Some would rather be in deep relation with a friend or see a mental health professional, but do not feel they can access either.

This sort of uneasiness defines our research process, too. Our project relies on diaries, interviews, and focus groups where participants recount intensely personal exchanges. Many share screenshots of their conversations. We are often the first people they have spoken to about this part of their lives, and we stand in the delicate and privileged position of being witnesses to their loneliness, curiosity, desire, humor, creativity, and grief.

As ethnographers trained in computer and information science as well as anthropology, we move between analysis and empathy, between curiosity and concern, between the machine and the human. Studying AI chatbots often means acknowledging the power of illusion while refusing to ridicule it. If we treat emotional attachment to a machine as pathology, we miss the deeper insight, which is the ever-present human desire for a meaningful exchange. 

The value of this work lies not in sensational quotes about “AI lovers” but in the more ordinary stories of people who use chatbots to steady themselves through uncertainty. These accounts remind us that loneliness is not necessarily a problem to fix, but a condition of the landscape each of us navigates. Evaluating technologies of care means looking beyond performance metrics and assessing how these systems influence the norms and expectations that define care itself.

The Work of Being With Ourselves, Being with Others

Part of our work is to understand what being alone means in a time when every silence can be filled with words from a machine. The people in our study remind us that solitude is not the enemy of connection. Indeed, when solitude offers space to sit with one’s own thoughts, rehearse empathy, or gather the courage to reach outward again, it can be a form of care. But when reflection turns to reliance, or when corporate systems exploit those moments for profit, being alone becomes isolation — by design.

For this reason, we have been incorporating communal moments of sharing in our focus groups. Participants not only speak about their chatbot use but also turn toward one another by asking questions, comparing experiences, and offering advice. Some share helpful techniques, while others exchange resources about privacy, therapy, and finding balance between being online and offline. What began as a research exercise has become something closer to a conversation circle, where isolation gives way to mutual recognition. 

For now, our task as researchers is modest but urgent: to attend to these stories without judgment; to show that talking to a chatbot can be both a cry for help and a gesture of self-knowledge; to foster communal dialogue that lets our participants reflect and relate to one another. In a world saturated with noise, perhaps the most radical act is to take aloneness not as absence, but as an opportunity to listen to others as well as ourselves.

“Loneliness is personal, and it is also political…What matters is kindness; what matters is solidarity,” Olivia Laing reminds us. “What matters is staying alert, staying open.”