
FakeNoose

(41,115 posts)
5. It can be used for evil
Tue Feb 17, 2026, 01:03 PM
Feb 17

In this example (OP link), the boyfriends and husbands are using the ChatGPT app as a replacement for human interaction. If the human wife or girlfriend were actually available, the husband/boyfriend would have preferred the human. Or so we are led to believe.

In actual fact, there's no proof that the guy was giving a fair description of the woman's behavior to the chat app. Anything left out, including any fault or guilt on the part of the guy, is going to make for an incomplete story. Naturally...

So of course the chat app replies in a way that favors the guy's point of view, just like any one-sided friendship would. The real human woman never has a chance, and that's how the whole thing is set up. How many husbands get perfect agreement from their own wives? Very few, but they do get it from the ChatGPT "girlfriend."

This shows how hopelessly one-sided ChatGPT is always going to be. It's just another feedback loop, mirroring and confirming whatever point of view the user feeds into it.

Recommendations

1 member has recommended this reply.
