Why you shouldn’t ask ChatGPT for medical advice

  • 📰 brisbanetimes


The more information you give a GP, the more likely you are to get a diagnosis. Not so for ChatGPT.

If you asked a doctor whether to use ice to treat a burn, they would quickly advise you to run it under cold water instead. Even “Dr Google” will tell you that extreme cold constricts the blood vessels and can make a burn worse.

The researchers found the software was fairly accurate when asked for a yes-or-no answer, but it became less reliable when given more information, answering some questions with just 28 per cent accuracy. “These models have come on to the scene so quickly ... but there isn’t really the understanding of how well they perform and how best to deploy them,” he said. “In the end, you want reliable medical advice … and these models are not at all appropriate for doing things like diagnosis.”

Koopman said large language models such as ChatGPT were only as good as the information they were trained on, and hoped the study would provide a stepping stone for the next generation of health-specific tools “that would be much more effective”. Large language models construct sentences by assessing a huge database of words and how often they appear next to each other. They are chatty and easy to use but “don’t know anything about medicine”, Coiera said, and therefore should be supported by another kind of AI that can better answer health-related questions.
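The word-frequency idea Coiera describes can be sketched in miniature. The toy below is not how ChatGPT actually works (real models use neural networks over vast corpora); it is only a hedged illustration of the underlying principle of predicting the next word from how often words appear next to each other. The corpus and function names are invented for the example.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction from co-occurrence counts.
# Hypothetical tiny "training corpus" invented for this example:
corpus = "run the burn under cold water . do not put ice on the burn .".split()

# For each word, count how often each following word appears after it.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def most_likely_next(word):
    """Return the most frequently observed next word, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))   # "burn" follows "the" most often here
print(most_likely_next("cold"))  # "water"
```

A model like this strings together statistically plausible words, but, as Coiera notes, it has no medical knowledge: it would happily continue a sentence about ice and burns in whichever direction its training text leaned.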

 
