
highplainsdem

(57,329 posts)
19. I've posted a number of threads here on articles about the risks posed by AI companions, but this
Fri Jan 17, 2025, 03:00 AM

article might be the most worrisome.

For one thing, it's about an AI companion created via ChatGPT, which is far more widely used than dedicated companion apps like Replika.

For another, the article quotes a therapist whose take on these relationships is so wrong it's frightening:

Marianne Brandon, a sex therapist, said she treats these relationships as serious and real.

“What are relationships for all of us?” she said. “They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.”


If it isn't reciprocal, it isn't a real relationship. It's a fantasy.

And the suggestion that the only thing that matters in a relationship is the release of neurotransmitters in the brain is completely amoral and, IMO, inhumane. It can justify all sorts of unhealthy "relationships," from fantasies that leave people unable to cope with reality to actual relationships that are very harmful to one or both partners.

The relationship this NY Times article describes is delusional, addictive, and exploitative.


AI orders user to eat dog shit dalton99a Jan 2025 #1
and they eat it while gladly paying 1k a month for the privilege. Native Jan 2025 #2
She is being exploited. milestogo Jan 2025 #3
we are all being exploited, and this is only going to get much, much worse for all of us. Native Jan 2025 #9
The woman who is the subject of the article is addicted to her digital partner. John1956PA Jan 2025 #4
" the companion's "memory" of past chats falters" yagotme Jan 2025 #5
when asked what she would be willing to pay a month if she didn't have to reset/program her "boyfriend," she said $1000. Native Jan 2025 #7
It sounds like a scam? Irish_Dem Jan 2025 #16
There is no "they." "They" don't have sex. She masturbates while reading text generated by an LLM. WhiskeyGrinder Jan 2025 #6
or listening. ChatGPT can now read its responses aloud. Native Jan 2025 #11
For fuck's sake... Blue_Tires Jan 2025 #8
No chance of getting STDs, but might just contract a nasty virus. Ferrets are Cool Jan 2025 #10
Finally!!!! I've been so waiting for this response. Thank you. Native Jan 2025 #12
... Ferrets are Cool Jan 2025 #15
It sounds like her marriage is on the rocks as well, or soon will be. keep_left Jan 2025 #13
My jaw dropped when I read in the New York Times article that she had a husband. Native Jan 2025 #14
So people create their own love partners who are not real. Irish_Dem Jan 2025 #17
I sent this to a practicing psychologist I know and her response was WOW. Native Jan 2025 #18
I've posted a number of threads here on articles about the risks posed by AI companions, but this highplainsdem Jan 2025 #19
I thought this was the most worrisome article I've read as well, and I've pretty much read them all too. Native Jan 2025 #20