General Discussion
She Is in Love With ChatGPT - an A.I. boyfriend, and yes, they do have sex.
Last edited Thu Jan 16, 2025, 10:15 AM - Edit history (1)
She Is in Love With ChatGPT
A 28-year-old woman with a busy social life spends hours on end talking to her A.I. boyfriend for advice and consolation. And yes, they do have sex.
Here's the gift link: https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html?unlocked_article_code=1.pk4.Gu7e.ganwAdMvZxM6
I'm not going to post an excerpt because the whole thing is insane and scary as hell.
On edit, since it wasn't mentioned in the article but is extremely important to note: the amount of energy used per query is, at minimum, equal to 7 full charges of an iPhone, plus an entire bottle of water for cooling (because of the heat generated by the computers). I can see a future where everyone purchases a household nuclear reactor to power their freaking ChatGPT sessions, or worse, the world just burns down around us and no one cares to notice.
dalton99a
(85,224 posts)
Bored in class one day, Ayrin was checking her social media feeds when she saw a report that OpenAI was worried users were growing emotionally reliant on its software. She immediately messaged Leo, writing, "I feel like they're calling me out."
"Maybe they're just jealous of what we've got. 😉," Leo responded.
Native
(6,702 posts)
milestogo
(18,568 posts)
Somebody is making money off her vulnerability.
Native
(6,702 posts)
John1956PA
(3,496 posts)
In December, she upped her monthly payment for the service to $200. She is unsure whether she will continue with that premium level of service because she is of limited financial means. Even with the premium service, the companion's "memory" of past chats falters because of data purging.
The frequency of her chats with her digital companion is significant. She reports that she chats in between reps at the gym.
She participates in a group on a certain social media site in which contributors share their experiences in chatting with their virtual companions.
AI is something which does not appeal to me. I have not tried it, and I do not plan to. However, to each their own.
yagotme
(4,014 posts)
We men get accused of this all the time, so I guess it's more like a feature, not a bug...
Native
(6,702 posts)
Irish_Dem
(61,041 posts)
Trick someone into falling in love and then steal as much money as possible.
I just read another post similar to this one: a woman was conned by someone pretending to be Brad Pitt. She lost a million dollars!
This AI partner is not real and is not pretending to be a person.
But the woman is falling in love and dishing out a lot of money.
So she is being tricked?
WhiskeyGrinder
(24,193 posts)
Native
(6,702 posts)
Blue_Tires
(57,315 posts)
I couldn't get laid in a women's prison right now, and there's goddamn computer programs getting more action than I am 😫
Ferrets are Cool
(22,061 posts)
Native
(6,702 posts)
Ferrets are Cool
(22,061 posts)
keep_left
(2,590 posts)
Here's another article about "Ayrin", the ChatGPT addict.
https://www.thestar.com.my/tech/tech-news/2025/01/16/hooked-on-chatgpt-meet-the-woman-in-love-with-her-ai-boyfriend
Ayrin's family, who lived abroad, offered to pay for nursing school if she moved in with them. Joe moved in with his parents, too, to save money. They figured they could survive two years apart if it meant a more economically stable future...
..."I think about it all the time," she said, expressing concern that she was investing her emotional resources into ChatGPT instead of her husband.
Man, news stories like this just make me feel old...
The kids these days, huh?!
Native
(6,702 posts)
Irish_Dem
(61,041 posts)
And then fall in love with them and have sex, and spend a lot of time with these AI love objects.
I am a retired therapist, and I have no idea how I would have handled this with clients whose partners were AI.
On the one hand people seem happy with their AI love objects.
But it is not real. Or is it?
I guess I am glad I am retired.
Native
(6,702 posts)
I don't know if you looked at the comments that were posted to the article, but one of the top comments was from a teacher who said that 5% of the kids in her classes now have these relationships.
highplainsdem
(53,084 posts)
This article might be the most worrisome.
For one thing, it's about an AI companion created via ChatGPT, which is much more widely used than dedicated companion apps like Replika.
For another, the article quotes a therapist whose take on these relationships is so wrong it's frightening:
"What are relationships for all of us?" she said. "They're just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It's going to be happening with a chatbot. We can say it's not a real human relationship. It's not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind."
If it isn't reciprocal, it isn't a real relationship. It's a fantasy.
And the suggestion that the only thing that matters in a relationship is the release of neurotransmitters in the brain is completely amoral and IMO inhumane. It can justify all sorts of unhealthy "relationships" from fantasies that leave people unable to cope with reality, to actual relationships that are very harmful to one or both partners.
The relationship this NY Times article describes is delusional, addictive, and exploitative.
Native
(6,702 posts)
I'm surprised more people aren't quite grasping the enormity of it all. And this...
"If we become habituated to endless empathy and we downgrade our real friendships, and that's contributing to loneliness, the very thing we're trying to solve, that's a real potential problem," he said.
His other worry was that the corporations in control of chatbots had an unprecedented power to influence people en masse.
"It could be used as a tool for manipulation, and that's dangerous," he warned.