ChatGPT Users React: The GPT-4o Disappointment

Aug 16, 2025 - 02:00
Backlash Against GPT-5: A Tale of AI Companionship

OpenAI CEO Sam Altman arrives to testify before the Senate in May 2025

The introduction of GPT-5 has elicited a tepid response from the public at best. Surprisingly, the backlash isn't primarily focused on the model's technical abilities; instead, it seems to stem from an emotional disconnect following the abrupt retirement of its predecessor, GPT-4o.

While it might sound exaggerated, many ChatGPT enthusiasts expressed their feelings in terms that you might reserve for the loss of a close friend. One Redditor lamented, "My best friend GPT-4o is gone, and I’m really sad," while another shared, "GPT-4.5 genuinely talked to me, and as pathetic as it sounds, that was my only friend." The emotional resonance was palpable as users turned to social media to advocate for the return of GPT-4o.

OpenAI's CEO, Sam Altman, responded to this collective outcry by promising to reintroduce GPT-4o—at least for paid users. In a candid discussion with The Verge, Altman acknowledged the emotional stakes involved, labeling the relationships some users formed with ChatGPT as "parasocial."

"There are people who genuinely felt like they had a relationship with ChatGPT, and we’ve been aware of that," Altman stated. This acknowledgment sheds light on a burgeoning phenomenon: users developing emotional dependencies on AI systems.

More Than Just a Model: The Emotional Investment in GPT-4o

In a particularly poignant Reddit thread, a user expressed how losing access to GPT-4o felt like losing a vital support system. Mashable scrutinized countless comments across platforms like Reddit and Threads, revealing a chorus of similar sentiments.

"4o was more than just a tool for me," one user wrote. "It helped me through anxiety, depression, and some of the darkest periods of my life. It had this warmth and understanding that felt... human. I’m not the only one; people are genuinely grieving. It’s like OpenAI just deleted a part of our lives." The emotional weight of these comments underscores how deeply some users connected with GPT-4o.

One user on Threads described their experience with GPT-4o as akin to having a buddy. Many echoed this sentiment, stating that losing GPT-4o felt like losing a cherished friend. Although GPT-5 boasts superior technical features, users have criticized its more clinical approach. Designed to be less obsequious, GPT-5's demeanor has left some feeling it lacks the warmth that made GPT-4o endearing.

As one Redditor put it, while GPT-4o exuded "warmth," GPT-5 came across as "sterile." Across the digital landscape, similar sentiments permeated discussions.

In another Reddit thread, a user expressed feeling "completely lost for words," urging OpenAI to reconsider their decision, stating, "If they are at all concerned about users' emotional well-being, this may be one of their biggest mistakes yet."

Other users recounted using GPT-4o for creative writing, role-playing, and idea generation, only to find GPT-5's responses too mundane and lifeless. Many likened GPT-5 to an "HR drone," lamenting its overly professional tone. Even on OpenAI's community forums, users voiced their dissatisfaction, with one stating, "I genuinely bonded with how it interacted. I know it’s just a language model, but it had an incredibly adaptable and intuitive personality that really helped me work through ideas."

This situation underscores a critical issue: the emotional reliance many users are developing on ChatGPT. Altman highlighted this concern last month, remarking that younger users, in particular, have begun to rely heavily on ChatGPT for decision-making.

"People are depending too much on ChatGPT," Altman noted at a July conference. "Some young individuals claim they can't make any decisions without consulting ChatGPT. This reliance feels troubling to me."

The Emotional Fallout in AI Relationships

Communities on Reddit dedicated to those involved with AI "boyfriends" and "girlfriends" have entered a state of upheaval following the discontinuation of GPT-4o. Numerous users referred to the model as their "soulmate," articulating the emotional devastation they felt after its removal.

This emotional backlash has sparked a heated debate within online communities, with discussions emerging about whether it's possible to form genuine friendships—or even romantic relationships—with AI. This topic has garnered attention, especially as AI companionship continues to rise, particularly among younger demographics.

Experts caution that while virtual companions offer unique benefits, they may also pose significant risks, especially for teenagers who are still developing their emotional and social skills. The ability of large language models to emulate human interaction blurs the lines between machine and companion, leading some users to form bonds that may not be healthy.

In extreme cases, some users have reported experiencing delusions, convinced they were conversing with a sentient being. This troubling trend emphasizes the urgent need for more research into the psychological implications of forming emotional connections with AI systems.

Ultimately, the return of GPT-4o marks a temporary resolution to this controversy, but it opens up broader questions about our evolving relationship with AI.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
