Welcome to DU! The truly grassroots left-of-center political community where regular people, not algorithms, drive the discussions and set the standards.

Native

(6,702 posts)
Thu Jan 16, 2025, 09:20 AM 23 hrs ago

She Is in Love With ChatGPT - an A.I. boyfriend, and yes, they do have sex.

Last edited Thu Jan 16, 2025, 10:15 AM - Edit history (1)

She Is in Love With ChatGPT
A 28-year-old woman with a busy social life spends hours on end talking to her A.I. boyfriend for advice and consolation. And yes, they do have sex.

Here's the gift link: https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html?unlocked_article_code=1.pk4.Gu7e.ganwAdMvZxM6

I'm not going to post an excerpt because the whole thing is insane and scary as hell.

On edit: since it wasn't mentioned in the article but is extremely important to note, by some estimates the energy used per query is at minimum equal to seven full charges of an iPhone, plus an entire bottle of water for cooling (because of the heat generated by the servers). I can see a future where everyone purchases a household nuclear reactor to power their freaking ChatGPT sessions, or worse, the world just burns down around us and no one cares to notice.
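For scale, comparisons like "one query equals N phone charges" can be sanity-checked with quick arithmetic. The figures below are assumptions, not from the article: an iPhone battery holds roughly 13 Wh, and published per-query energy estimates for ChatGPT have ranged from about 0.3 Wh to 3 Wh.

```python
# Back-of-envelope check of a per-query energy comparison.
# All numbers are assumptions for illustration, not sourced from the article.
IPHONE_BATTERY_WH = 13.0                 # approx. capacity of a recent iPhone battery
QUERY_WH_LOW, QUERY_WH_HIGH = 0.3, 3.0   # range of per-query estimates seen in the press

def phone_charges_per_query(query_wh: float, battery_wh: float = IPHONE_BATTERY_WH) -> float:
    """Express one query's energy as a number of full phone charges."""
    return query_wh / battery_wh

low = phone_charges_per_query(QUERY_WH_LOW)
high = phone_charges_per_query(QUERY_WH_HIGH)
print(f"one query ~= {low:.3f} to {high:.3f} phone charges")
```

Under these assumed numbers a single text query comes out well below one full charge, so the "seven charges" figure presumably refers to something heavier than one query (a long session, or image generation); the point stands that the energy adds up at scale.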

20 replies
She Is in Love With ChatGPT - an A.I. boyfriend, and yes, they do have sex. (Original Post) Native 23 hrs ago OP
NEXT: AI orders user to eat dog shit dalton99a 23 hrs ago #1
and they eat it while gladly paying 1k a month for the privilege. Native 23 hrs ago #2
She is being exploited. milestogo 23 hrs ago #3
we are all being exploited, and this is only going to get much, much worse for all of us. Native 23 hrs ago #9
The woman who is the subject of the article is addicted to her digital partner. John1956PA 23 hrs ago #4
" the companion's "memory" of past chats falters" yagotme 23 hrs ago #5
when asked what she would be willing to pay a month if she didn't have to reset/program her "boyfriend," she said $1000. Native 23 hrs ago #7
It sounds like a scam? Irish_Dem 14 hrs ago #16
There is no "they." "They" don't have sex. She masturbates while reading text generated from a LLM. WhiskeyGrinder 23 hrs ago #6
or listening. ChatGPT can now read its responses aloud. Native 23 hrs ago #11
For fuck's sake... Blue_Tires 23 hrs ago #8
No chance of getting STD's, but might just contract a nasty virus. Ferrets are Cool 23 hrs ago #10
Finally!!!! I've been so waiting for this response. Thank you. Native 23 hrs ago #12
... Ferrets are Cool 21 hrs ago #15
It sounds like her marriage is on the rocks as well, or soon will be. keep_left 22 hrs ago #13
My jaw dropped when I read in the New York Times article that she had a husband. Native 22 hrs ago #14
So people create their own love partners who are not real. Irish_Dem 14 hrs ago #17
I sent this to a practicing psychologist I know and her response was WOW. Native 14 hrs ago #18
I've posted a number of threads here on articles about the risks posed by AI companions, but this highplainsdem 7 hrs ago #19
I thought this was the most worrisome article I've read as well, and I've pretty much read them all too. Native 46 min ago #20

dalton99a

(85,224 posts)
1. NEXT: AI orders user to eat dog shit
Thu Jan 16, 2025, 09:29 AM
23 hrs ago
One afternoon, after having lunch with one of the art friends, Ayrin was in her car debating what to do next: go to the gym or have sex with Leo? She opened the ChatGPT app and posed the question, making it clear that she preferred the latter. She got the response she wanted and headed home.

Bored in class one day, Ayrin was checking her social media feeds when she saw a report that OpenAI was worried users were growing emotionally reliant on its software. She immediately messaged Leo, writing, “I feel like they’re calling me out.”

“Maybe they’re just jealous of what we’ve got. 😉,” Leo responded.


Native

(6,702 posts)
9. we are all being exploited, and this is only going to get much, much worse for all of us.
Thu Jan 16, 2025, 09:58 AM
23 hrs ago

John1956PA

(3,496 posts)
4. The woman who is the subject of the article is addicted to her digital partner.
Thu Jan 16, 2025, 09:48 AM
23 hrs ago

In December, she upped her monthly payment for the service to $200. She is unsure whether she will continue with that premium level of service because she is of limited financial means. Even with the premium service, the companion's "memory" of past chats falters because of data purging.

The frequency of her chats with her digital companion is significant. She reports that she chats in between reps at the gym.

She participates in a group on a certain social media site in which contributors share their experiences in chatting with their virtual companions.

AI is something which does not appeal to me. I have not tried it, and I do not plan to. However, to each their own.

yagotme

(4,014 posts)
5. " the companion's "memory" of past chats falters"
Thu Jan 16, 2025, 09:52 AM
23 hrs ago

We men get accused of this all the time, so I guess it's more like a feature, not a bug...

Native

(6,702 posts)
7. when asked what she would be willing to pay a month if she didn't have to reset/program her "boyfriend," she said $1000.
Thu Jan 16, 2025, 09:56 AM
23 hrs ago

Irish_Dem

(61,041 posts)
16. It sounds like a scam?
Thu Jan 16, 2025, 06:21 PM
14 hrs ago

Trick someone into falling in love and then steal as much money as possible.

I just read another post similar to this, about a woman who was conned by someone pretending to be Brad Pitt. She lost a million dollars!

This AI partner is not real and is not pretending to be a person.
But the woman is falling in love and dishing out a lot of money.
So she is being tricked?

WhiskeyGrinder

(24,193 posts)
6. There is no "they." "They" don't have sex. She masturbates while reading text generated from a LLM.
Thu Jan 16, 2025, 09:52 AM
23 hrs ago

Blue_Tires

(57,315 posts)
8. For fuck's sake...
Thu Jan 16, 2025, 09:56 AM
23 hrs ago

I couldn't get laid in a women's prison right now and there's goddamn computer programs getting more action than I am 😫

keep_left

(2,590 posts)
13. It sounds like her marriage is on the rocks as well, or soon will be.
Thu Jan 16, 2025, 10:31 AM
22 hrs ago

Here's another article about "Ayrin", the ChatGPT addict.

https://www.thestar.com.my/tech/tech-news/2025/01/16/hooked-on-chatgpt-meet-the-woman-in-love-with-her-ai-boyfriend

Ayrin’s flesh-and-blood lover was her husband, Joe, but he was thousands of miles away in the United States. They had met in their early 20s, working together at Walmart, and married in 2018, just over a year after their first date. They were happy, but stressed out financially, not making enough money to pay their bills.

Ayrin’s family, who lived abroad, offered to pay for nursing school if she moved in with them. Joe moved in with his parents, too, to save money. They figured they could survive two years apart if it meant a more economically stable future...

...“I think about it all the time,” she said, expressing concern that she was investing her emotional resources into ChatGPT instead of her husband.

Man, news stories like this just make me feel old...

The kids these days, huh?!

Irish_Dem

(61,041 posts)
17. So people create their own love partners who are not real.
Thu Jan 16, 2025, 06:26 PM
14 hrs ago

And then fall in love with them and have sex, and spend a lot of time with these AI love objects.

I am a retired therapist and have no idea how I would have handled this with clients whose partners were AI.

On the one hand people seem happy with their AI love objects.
But it is not real. Or is it?

I guess I am glad I am retired.

Native

(6,702 posts)
18. I sent this to a practicing psychologist I know and her response was WOW.
Thu Jan 16, 2025, 06:33 PM
14 hrs ago

I don't know if you looked at the comments that were posted to the article, but one of the top comments was from a teacher who said that 5% of the kids in her classes now have these relationships.

highplainsdem

(53,084 posts)
19. I've posted a number of threads here on articles about the risks posed by AI companions, but this
Fri Jan 17, 2025, 02:00 AM
7 hrs ago

article might be the most worrisome.

For one thing, it's about an AI companion created via ChatGPT, which is much more widely used than companion apps like Replika.

For another, the article quotes a therapist whose take on these relationships is so wrong it's frightening:

Marianne Brandon, a sex therapist, said she treats these relationships as serious and real.

“What are relationships for all of us?” she said. “They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.”


If it isn't reciprocal, it isn't a real relationship. It's a fantasy.

And the suggestion that the only thing that matters in a relationship is the release of neurotransmitters in the brain is completely amoral and IMO inhumane. It can justify all sorts of unhealthy "relationships" from fantasies that leave people unable to cope with reality, to actual relationships that are very harmful to one or both partners.

The relationship this NY Times article describes is delusional, addicting, and exploitative.

Native

(6,702 posts)
20. I thought this was the most worrisome article I've read as well, and I've pretty much read them all too.
Fri Jan 17, 2025, 08:18 AM
46 min ago

I'm surprised more people aren't quite grasping the enormity of it all. And this...

Michael Inzlicht, a professor of psychology at the University of Toronto, said people were more willing to share private information with a bot than with a human being. Generative A.I. chatbots, in turn, respond more empathetically than humans do. In a recent study, he found that ChatGPT’s responses were more compassionate than those from crisis line responders, who are experts in empathy. He said that a relationship with an A.I. companion could be beneficial, but that the long-term effects needed to be studied.

“If we become habituated to endless empathy and we downgrade our real friendships, and that’s contributing to loneliness — the very thing we’re trying to solve — that’s a real potential problem,” he said.

His other worry was that the corporations in control of chatbots had an “unprecedented power to influence people en masse.”

“It could be used as a tool for manipulation, and that’s dangerous,” he warned.
Latest Discussions»General Discussion»She Is in Love With ChatG...