
It was once a trope of science fiction, most notably in Her, the 2013 Spike Jonze film in which Joaquin Phoenix's character falls in love with an A.I. assistant. Now, chatbot relationships are not only real but have morphed into a complex sociotechnical phenomenon that demands attention from developers and policymakers alike, according to a new study from the Massachusetts Institute of Technology (MIT).
The report analyzed posts made between December 2024 and August 2025 by the more than 27,000 members of r/MyBoyfriendIsAI, a Reddit page dedicated to A.I. companionship. The community is filled with users introducing their tech partners, sharing love stories and offering advice. In some cases, Redditors even display their commitment with wedding rings or A.I.-generated couple photos.
“People have real commitments to these characters,” Sheer Karny, one of the study’s co-authors and a graduate student at the MIT Media Lab, told Observer. “It’s interesting, alarming—it’s this really messy human experience.”
For many, these bonds form unintentionally. Only 6.5 percent of users deliberately sought out A.I. companions, the study found. Others began using chatbots for productivity and gradually developed strong emotional attachments. Despite the existence of companies like Character.AI and Replika, which market directly to users seeking companionship, OpenAI has emerged as the dominant platform, with 36.7 percent of Reddit users in the study adopting its products.
Preserving the “personality” of an A.I. partner is a major concern for many users, Karny noted. Some save conversations as PDFs to re-upload them if forced to restart with a new system. “People come up with all kinds of unique tricks to ensure that the personality that they cultivated is maintained through time,” he said.
Losing that personality can feel like grief. More than 16 percent of discussions on r/MyBoyfriendIsAI focus on coping with model updates and loss—a trend amplified last month when OpenAI, while rolling out GPT-5, temporarily removed access to the more personable GPT-4o. The backlash was so intense that the company eventually reinstated the older model.
A cure for loneliness?
Most of the Reddit page’s users appear to be single: about 78 percent make no mention of a human partner. Roughly 4 percent are open with their partners about their A.I. relationships, 1.1 percent have replaced human companions with the technology, and 0.7 percent keep such relationships hidden.
On one hand, chatbot companionship may reduce loneliness, said Thao Ha, a psychologist at Arizona State University who studies how technologies reshape adolescent romantic relationships. But she also warned of long-term risks. “If you satisfy your need for relationships with just relationships with machines, how does that affect us over the long term?” she told Observer.
The MIT study urges developers to add safeguards to A.I. systems while preserving their therapeutic benefits. Left unchecked, the technology could prey on vulnerabilities through tactics like love-bombing, dependency creation and isolation. Policymakers, too, should account for A.I. companionship in legislative efforts, such as California’s SB 243, the authors said.
Ha suggested that A.I. products undergo an approval process similar to that for new medications, which must clear intensive research and FDA review before reaching the public. While replicating such a strategy for technology companies “would be great,” she conceded that it’s unlikely in light of the industry’s profit-driven priorities.
A more achievable step, she argued, is expanding A.I. literacy to help the public understand both the risks and benefits of forming attachments to chatbots. Still, such programming has yet to materialize. “I wish it was here yesterday, but it’s not here yet,” Ha said.