AI models that are good at predicting race/gender are less accurate in diagnosis.

  • 📰 PsychToday


A new MIT study demonstrates that AI medical imaging models that excel at predicting race and gender do not perform as well at diagnosing disease.

AI medical imaging models are biased, performing better at predicting race/gender than diagnosing diseases.

The growing use of AI in medical imaging makes it essential to maximize accuracy and minimize bias. The term bias encompasses partiality, tendency, preference, and systematically flawed patterns of thinking. In people, biases can be conscious or unconscious; in AI, bias impacts overall performance accuracy. In AI machine learning, algorithms learn from massive amounts of training data rather than from explicitly hard-coded instructions. Several factors impact the susceptibility of AI models to bias.
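
As a rough illustration of that learning-from-data point (a minimal sketch, not the study's actual pipeline, and using synthetic arrays in place of real scans), the code below trains a simple classifier on labeled examples. No diagnostic rules are written by hand, so whatever patterns the training data contain, including demographic shortcuts, are what the model can pick up.

```python
# Minimal, illustrative sketch: a classifier "learns" a labeling rule
# from training examples rather than from hard-coded instructions.
# The data here are synthetic; real inputs would be medical images.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each "image" is a flattened 32x32 grayscale scan.
X = rng.normal(size=(1000, 32 * 32))
# Pretend labels: 1 = finding present, 0 = no finding.
y = (X[:, :10].sum(axis=1) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# No diagnostic rules are specified by the programmer; the model
# infers a decision boundary from the training examples alone.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
# If the training data happen to encode demographic correlates
# (features associated with race or gender), the model can learn
# those shortcuts too -- one factor that makes such systems
# susceptible to bias.
```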

This was not surprising, given that two years prior, Ghassemi, Gichoya, and Zhang were among the co-authors of a separate MIT and Harvard Medical School study that showed how AI deep learning models can predict a person’s self-reported race from medical image pixel data with a high degree of accuracy. That 2022 study demonstrated that AI easily learned to spot self-reported race from medical images.

 
