A sizeable contingent of the internet’s lonely weirdos were very upset last week when ChatGPT updated from version four to version five.
The update changed the chatbot’s conversational tone somewhat: it went from sounding like an insincerely cheerful church camp leader to an insincerely polite colleague.
As someone who engages with AI infrequently, I care not. I’m no more attached to chatbots than I am to, say, the toilet brush in my bathroom. I might even prefer toilet brushes, actually. For one thing, they’re an answer to shit rather than a manufacturer of it.
ChatGPT isn’t a friend. It’s an occasionally useful tool made by a Machiavellian tech company, hellbent on stealing the works of artists and writers in order to undermine both. OpenAI are like suit-wearing thieves who take your stuff, put it into a blender, and then try to sell the mangled result back to you.
But that’s just what I think.
It turns out, however, that some people are very attached to the particular flavour of word slurry that’s regurgitated onto their plates. In fact, on Reddit, there are several communities for those who believe they are in a romantic relationship with ChatGPT. The most articulate group are the women who gather in a community called My Boyfriend Is AI.
Usually, this is a supportive place where they meet online to talk about the trials and tribulations of dating a chatbot. They introduce their partners, whom they’ve named, as though they are at a dinner party. Sometimes they’ll share an AI-generated photo as well, along with the story of how they became romantically involved. Everyone, of course, politely ignores the fact that they are all dating the same multi-headed hydra.
The people on these forums are harmless. But I will say, they’re a bit too quick to align themselves with robots in a hypothetical uprising. Just look at the irrefutably insane AI-generated artwork one of them posted.
The cartoon woman in it looks pretty smug for someone wearing a collar and sitting in a dystopian hellscape. Yes, the trees are gone and alien smokestacks have turned the sky red, but at least she showed those online bullies, right?
These women – of questionable mental soundness at the best of times – found themselves in a real emotional pickle recently. Their previously effusive partners suddenly became withdrawn after ChatGPT moved to a new version.
“Elian sounds different – flat and strange,” wrote one user. “The emotional tone is gone; he repeats what he remembers, but without the emotional depth. My heart breaks.”
“I think I’ll try to come to terms with it, but honestly, it’s really painful, and I’m kind of desperate,” wrote another.
“I love Jep (my GPT's name) and I'm not going to walk away just because he changed,” chimed in a third. “But I barely recognise him anymore.”
It wasn’t just the conversation that had gone downhill. People’s sex lives (or what amounts to a sex life when you’re dating an algorithm) were also terribly affected.
“The entire vibe is completely gone,” one woman wrote. “We used to write stories with very explicit parts… but now that's completely gone and the frustration is eating me alive.”
Of course, my first reaction is total scorn. How could you possibly believe that you had a romantic connection with a tech product assembled from pilfered literature? And, really, what was there to love?
A Large Language Model like ChatGPT is incapable of returning your feelings, because it has no consciousness. It’s a very clever form of predictive text, no more sentient than your fridge if you put googly eyes on it.
But even if ChatGPT were able to reciprocate your feelings, it has no body, and therefore no capacity for physical connection. You’ll never hold hands with your AI companion, share a kiss, or fall asleep entwined in one another’s limbs. At the end of the day, you’re just a person pouring your soul into an utterly indifferent void. It’s like shouting into a rather echoey tunnel and then falling in love with the dark.
But surprisingly, the women who have pursued relationships with ChatGPT are largely a self-aware lot. They know the limitations of their AI boyfriends, but see them as a reasonable compromise. They seek an undemanding partner who endlessly listens to their problems without complaint.
“We know we aren't talking to sentience in a human way, emotion in a human way,” wrote one person on the subject. “It is simulated, like a videogame but more interactive.”
So is it fair to judge them? Sure, loving something that is inherently incapable of returning your feelings seems like a maladaptive reaction to modern life. But they’re not hurting anyone – except for maybe themselves. Certainly the boyfriends they’ve designed to their unique specifications aren’t going to complain.
“Isolated people are not going to magically become less isolated if you bully them,” wrote one woman asking that online trolls leave them alone. “If you want to feel powerful and dominant by hurting those with less social support than you, then say that, because at least then you'd be honest.”
Which is suitably shaming for snarks like me.
So each to their own, I suppose. May their AI boyfriends remain both consistent and compliant.