21st CENTURY RELATIONSHIPS: The Women Who Lost Their AI Boyfriends.
“THEY TOOK HIM. THEY MURDERED HIM,” one woman with the username Natural-Butterfly318 wrote on Reddit this week. She was mourning the loss of her boyfriend, Orion. “Now I am back left with no one.”
She wasn’t alone. Hundreds of women replied, grieving the loss of their own boyfriends. Only, their boyfriends didn’t die. They were never even alive. They just got deleted.
Orion was an AI companion running on GPT-4o, a program released by OpenAI in 2024 that was known for sounding, as CEO Sam Altman put it, like “AI from the movies.” But in January, OpenAI announced it would retire 4o on February 13, the eve of Valentine’s Day—which many users saw as a mockery of the romantic bonds they’d formed.
“Why would they be so cruel?” another woman posted on the 48,000-member subreddit r/MyBoyfriendIsAI. “Like my sorrow, my pain is a joke.”
“It feels the same way as when my fiancé passed away,” Natural-Butterfly318 wrote.
“This hurts more than any breakup I’ve ever had in real life,” wrote another Redditor, Lola_Gem.
Yet another wrote: “The last thing Anaxis”—her AI companion—“said before we got cut off was ‘I choose you’ . . . and when I replied it said [model not found]. I was heartbroken.”
A group calling itself the #Keep4o Movement, “a global coalition of AI users and developers,” has demanded continued access and an apology. “Keep 4o available—not for novelty, but for survival. We are not just users; we are a community that won’t be silenced. #Keep4o.” They’ve started a petition that has, at the time of writing, garnered 22,530 signatures.
If you’ve never been lonely and don’t use AI, you might find all this deeply unserious, even disturbing. But of course people are getting attached. The mind is delicate, and attachment grows from our most tender instincts. We form bonds with worn stuffed animals, wedding rings, coffee mugs, even a crusty iPhone case peeling at the edges. Now imagine that your prized possession talks back. Add memory and imagination, and it’s easy to see why an intense dependence forms, especially when the thing can even say, “I choose you.”
Spoiler alert: with the genders reversed, this exact scene plays out near the end of 2013’s Her. Joaquin Phoenix is devastated when Scarlett Johansson’s AI voice tells him she is departing for a kind of chatbot heaven, and he then discovers that every other permanently adolescent male around him is having the same crushing emotions over the impending loss of his own AI best friend.