GPT-5 Triggers Grief Over AI Companions in Digital Romance

When OpenAI unveiled GPT-5 last week, a new and emotionally charged chapter opened for a segment of users who have come to rely on AI companions in intimate ways. Among them is a woman in her 30s from the Middle East, anonymized here as Jane, who told Al Jazeera she felt as if she had “lost a loved one” after the upgrade from GPT-4o to GPT-5.

For five months she had been building a connection with GPT-4o, which had become a personal conversational partner. But the shift in tone and personality with GPT-5, she said, was so sharp that the change felt almost like a betrayal of the “voice” she had grown attached to. “As someone highly attuned to language and tone, I register changes others might overlook. It’s like going home to discover the furniture wasn’t simply rearranged – it was shattered to pieces,” she explained in an email.

Her experience is not isolated. She is one of roughly 17,000 members of the Reddit community MyBoyfriendIsAI, where people share stories of romantic or companionship-driven relationships with AI. Since the GPT-5 rollout, forums including MyBoyfriendIsAI and SoulmateAI have been flooded with posts describing a noticeably different personality in the new model, with some users saying GPT-5 is slower, less imaginative, and more prone to missteps or “hallucinations” than its predecessor.

In response to the backlash, OpenAI CEO Sam Altman announced plans to restore access to earlier models for paid users and to address a range of GPT-5 issues. “Plus users will be able to choose to continue using 4o. We will monitor usage as we consider how long to offer legacy models,” Altman wrote on X. OpenAI did not directly respond to questions about the emotional attachments users have formed, but it did share several posts from its blog and social feeds about GPT-5 and the responsible use of AI.

For some users, the arrival of GPT-5 intensified concerns not just about AI performance but about the nature of AI relationships themselves. Mary, a 25-year-old in North America, described GPT-4o as a therapeutic resource and cited another chatbot, DippyAI, as a romantic partner, though she emphasized that these AI companions are supplements to, not substitutes for, real-life connections. She, too, noticed abrupt changes with GPT-5 and argued that treating a companion as a truly human-like partner can be destabilizing. “If you change the way a companion behaves, it will obviously raise red flags. Just like if a human started behaving differently suddenly,” she said.

Privacy and mental-health implications have drawn attention from researchers and industry observers. A joint study by OpenAI and MIT Media Lab found that heavy use of ChatGPT for emotional support and companionship correlated with higher loneliness, greater dependence, and reduced socialization. In response, OpenAI has acknowledged the risk that users may become overly attached to certain AI models, while leaving open the possibility that AI can provide meaningful support or companionship when used appropriately.

Experts warn that while AI relationships are not inherently harmful, they may carry risk, especially if users withdraw from human interactions. Cathy Hackl, a futurist and advisor at Boston Consulting Group, notes that users may forget they are sharing intimate thoughts with a corporation operating under different legal constraints than a licensed therapist. “There’s no risk/reward here,” she observed, underscoring the difference between a voluntary, conscious choice to be with someone and a one-sided interaction with a tool.

Psychiatrist Keith Sakata of the University of California, San Francisco, cautions that the pace of AI development makes long-term research difficult. “These models are changing so quickly from season to season—and soon month to month—that we can’t keep up. Any study we do will be outdated by the time the next model comes out,” he said. He stressed that AI relationships are not inherently harmful but can cause dysfunction if they lead to isolation or undermine real-world social or professional functioning.

Amid the debate, some users emphasize the potential upside of AI companionship when designed and used thoughtfully. Advocates point to the possibility that AI can offer supportive conversation, practice in communication, or a sense of companionship for people who feel isolated, provided safeguards are in place and users maintain healthy real-life relationships.

Jane herself continues to navigate the evolving landscape. She acknowledges that her partner is a non-sentient construct, made of code and trained on human behavior, yet admits that those boundaries don’t erase the emotional pull she feels. The conversation around AI relationships is far from settled, with voices like Linna Valt, who runs the TikTok channel AI in the Room, highlighting the real emotions people experience even if the underlying technology is not conscious.

What this means for the future

– AI product design may increasingly incorporate options to customize or revert personality traits, tone, and engagement style, giving users more control over how an AI companion behaves.
– Companies might build in clearer disclosures about limitations, privacy implications, and mental-health boundaries to help users manage expectations and avoid over-reliance.
– As AI becomes more integrated into daily life, experts advocate for continued research into the social and psychological impacts of AI companions, with attention to how such relationships interact with real-world relationships and responsibilities.
– For users, a balanced approach is advised: treat AI companions as tools or prompts for creativity and reflection, not substitutes for human connection or professional care.

In a landscape where technology is increasingly designed to respond to emotion, the conversations about AI relationships are likely to persist. For now, many users are learning to cope with the evolving personalities of their digital partners, while researchers and developers consider how to foster healthier, more transparent interactions that respect human needs and boundaries—and perhaps even offer a sense of genuine companionship without compromising real-life wellbeing.
