Chatbots for medical advice: Three ways to avoid misleading information

  • 📰 medical_xpress


We expect medical professionals to give us reliable information about ourselves and potential treatments so that we can make informed decisions about which (if any) medicine or other intervention we need. If your doctor instead "bullshits" you (yes, this term has been used in academic publications to refer to persuasion without regard for truth, not as a swear word) under the guise of authoritative medical advice, the decisions you make could be based on faulty evidence and may result in harm or even death.

Previously, we looked at ethical perspectives on the use of chatbots for medical advice. Now, while ChatGPT or similar platforms might be useful and reliable for finding the best places to see in Dakar, learning about wildlife, or getting quick potted summaries of other topics of interest, putting your health in their hands may be playing Russian roulette: you might get lucky, but you might not.

This is because chatbots like ChatGPT try to persuade you without regard for truth. Their rhetoric is so persuasive that gaps in logic and facts are obscured. This, in effect, means that ChatGPT's output includes the generation of bullshit. It does not answer your question in the sense of actually recognizing what you're asking, thinking about it, checking the available evidence, and giving a justified response. Rather, it looks at the words you provide, predicts a response that will sound plausible, and provides that response.

This is somewhat similar to the predictive text function you may have used on mobile phones, but much more powerful. Indeed, it can provide very persuasive bullshit: often accurate, but sometimes not. That's fine if you get bad advice about a restaurant, but it's very bad indeed if you're assured that your odd-looking mole is not cancerous when it is.
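The predictive-text analogy above can be sketched in a few lines of Python: a toy model that simply picks the statistically most frequent next word from past text, with no mechanism for checking whether the answer is true. (The corpus and words here are invented for illustration; real chatbots use vastly larger neural models, but the underlying move — predicting what sounds likely rather than what is verified — is the same.)

```python
from collections import Counter, defaultdict

# Toy corpus: what a naive predictor has "seen" before.
corpus = "the mole is benign . the mole is benign . the mole is cancerous".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the toy corpus."""
    return following[word].most_common(1)[0][0]

# The model confidently predicts the *common* continuation,
# not the *checked* one.
print(predict_next("is"))  # prints "benign"
```

The point of the sketch: "benign" is the statistically likely answer because it appeared most often, which is exactly why a fluent, plausible response can still be wrong about your particular mole.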

Another way of looking at this is from the perspective of logic and rhetoric. We want our medical advice to be scientific and logical, proceeding from the evidence to personalized recommendations regarding our health. In contrast, ChatGPT wants to sound persuasive.

 

