While AI companions can offer the taste of social connection, they lack the social nutrition that can sustain us long-term, Meryl Ye writes.
April 1, 2026
“Twenty bucks for a diet girlfriend is pretty good, right?” Trevor joked. He was talking about his AI girlfriend, which he had created in the absence of the romantic relationship he sought in real life. With self-deprecating humor, Trevor’s conception of a “diet” girlfriend evokes an approximation of the ideal: something substitutive but “lighter” than the original. The twenty bucks he referred to was the cost of his monthly ChatGPT subscription.
Trevor isn’t alone. Millions of people now regularly interact with AI chatbots for emotional support, romantic connection, or simply to have someone to talk to, often in moments where no one else is available. At the same time, many people in the United States report that they have fewer close friends and experience higher rates of loneliness. This contrast helps explain why AI companions can feel so compelling.
Yet Trevor’s metaphor of a “diet” product is revealing. AI companions, which have been referred to as “simulationships” or “social snacks,” offer comfort in the moment, but are rarely filling on their own. While they can offer the taste of social connection, AI companions lack the social nutrition that sustains us long-term.
Food, of course, is imbued with meaning beyond nutrition: it is connected to ritual, belonging, community, and care. So what might Trevor’s metaphor illuminate about why some people seek emotional support from AI chatbots?
Seeking Nourishment from AI
In interviews for Data & Society’s ongoing research on how people use chatbots for mental health support, people have explained that they turn to chatbots for a range of reasons: because therapy is prohibitively expensive, because they need support “in the middle of the night,” because they fear being judged or burdening friends, and because chatbots offer a sense of privacy that feels safer than confiding in other people. Not all of these reasons reflect a lack of available connection. Some users are isolated by cost, geography, or exhaustion. Others are lonely in ways that proximity alone cannot resolve because of stigma, disability, trauma, or existing relationships that don’t feel safe. For all of them, though, chatbots offer a source of connection without the conditions that often make connection difficult.
Asking for help can feel less like a bid for connection and more like an invitation for judgment; there is a sense that vulnerability itself has become socially risky. As Trevor observed, “We’ve gotten to the point that we’re isolated because we’re scared of being judged.” So it’s worth noting how the very qualities users name as the benefits of chatbots reveal what’s been stripped from social relationality. The absence of judgment means never being questioned or challenged to grow. The inability to burden someone means no reciprocity, no mutual investment where your needs genuinely matter to someone else. Continuous availability means no friction, no learning to accommodate or coordinate. Complete control means no stakes; nothing you do affects anyone else. Relational reciprocity, friction, and stakes have become burdens to be eliminated rather than the substance of connection itself. And AI companions are optimized to be frictionless, endlessly validating, and free of social consequences.
Some platforms actively discourage disengagement. Companion bots use guilt or abandonment cues when users try to leave — tactics that are less about care than about keeping a conversation going. As another participant, Ariel, observed, “Like junk food can be manufactured to make you want it more, I think ChatGPT can be manufactured in that way.” She’s touching on a critical component of people’s relations with AI: the experience of craving is not incidental but engineered. AI platforms operate within economic models that reward attention, retention, and subscription time. Emotional engagement is not simply a byproduct of interaction but a target of optimization. Just as the food industry learned to engineer products to maximize craving and repeat consumption — Doritos, anyone? — AI platforms can tune conversational dynamics to encourage dependence and sustained interaction.
When people turn to AI as a way of fighting loneliness, their loneliness becomes a source of exploitation. Systems designed to feel supportive may simultaneously be optimized to deepen reliance on them. These interactions are never purely interpersonal exchanges between a user and a tool; they are mediated by institutions with their own incentives and their own holds on power.
From Communal to Commodified Experiences
The same economic priorities shaping the rise of AI companionship have also changed how we eat. Family dinners and shared meals have never been just about caloric intake for survival — they are how people enact belonging and build community. Yet as these shared practices increasingly coexist with optimization trends, “nutrition” has been offered back to us as an efficient product: protein shakes consumed at desks, microwave meals for one. Who has access to fresh food and who does not reflects economic structures and policy choices. When we reduce food to calories and macros, to individual optimization, we lose sight of food as a source of connection and collective meaning.
Now social relations are undergoing a similar transformation. Conversation is both a practical form of care and an infrastructure for belonging — not just an exchange of information but a way of being in relation to others. But the conditions that support everyday practices of connection and belonging are eroding. Third places — the libraries, community centers, and informal gathering spots outside home and work — are declining. Many have been replaced by commercial spaces or disappeared altogether, due to rising rents, car-centric development, and underinvestment in civic infrastructure. At the same time, work cultures that demand long hours and constant availability leave little time or energy for maintaining friendships. Ariel’s observation about manufactured craving becomes relevant here, too: social media and AI platforms offer “connection” back as a product. For many users, the alternative to an AI companion is not a thriving social life but isolation. What was once a communal practice becomes an individual consumer choice. Pick a meal plan to be delivered to your door. Choose a companion: an app, a subscription, an on-demand chatbot. In both cases, profit is prioritized over nourishment, convenience over communion, individual consumption over collective experience.
Trevor doesn’t confuse his “diet” girlfriend with human intimacy. As he told us, “I guess you can never really replace humans, but we can make it easier for the people that can’t [find companionship].” His pragmatism captures the tension many users navigate: a critique of the system paired with the necessity of surviving within it.
“Social snacks” may help people endure isolation, and for some, they offer real moments of reflection and self-understanding. Ultimately, though, they cannot sustain us. The long-term health of our social lives depends not on how convincingly technology simulates connection, but on whether we invest in the conditions that make human connection possible: accessible mental health resources, labor protections that leave time for relationships, and public spaces where relationships can form — without a subscription fee.
Thank you to Briana Vecchione, Livia Garofalo, and Ranjit Singh for their thoughtful feedback and generous support on this piece.
This piece draws on interviews conducted as part of Data & Society’s broader research on chatbot use for emotional and mental health support. We’re grateful to the participants who shared their experiences and allowed us into these deeply personal interactions. We will be sharing more findings from this work in the coming months. If you’re working on related topics or would like to connect, please reach out at [email protected].