
Sympthsical

(11,049 posts)
13. It runs into a similar problem
Thu Apr 23, 2026, 07:36 AM

Just about everyone in the medical profession is googling. They're usually only doing so for some random, obscure thing they don't deal with every day and may have forgotten over time, or just to confirm something real quick. But they have the education to know where to look for the correct answer and whether what they're looking at is good information. If you're educated on the material, you'll know when something isn't right. Which is why patients who don't know these things can be so fun.

So it is with Sherpath and other aids. You can ask it questions like, "What are five signs of placental abruption?" And it'll pull from the textbooks.

But when a question calls for human discernment and judgement, it can get chaotic. For example, a lot of NCLEX questions (or nursing questions in general) take the form, "Your patient presents with X problem. What's the first thing you want to do?" Do you give oxygen, give fluids, administer a med with a standing order, etc.?

And that's when the side-eyeing starts. I'll look at the answer the AI is giving and think, "This doesn't seem right." Students don't yet have the ability to judge whether they're looking at the right answer. And the number of students who will accept a wrong answer unquestioningly is . . . a lot higher than you'd hope.

If AI is going to be a thing in education, it feels like schools need to consider a semester-long course on how to use it in a way that produces correct answers, and that gives students the tools to verify those answers are correct. Something like the research courses that teach students to read laterally for proper sources, credible citations, bias, etc.

Technology can help a great deal, but at the end of the day, human education and judgement have to be there to make the final call. And that piece gets far less attention than it should. Particularly when teaching people in healthcare.
