The Actual News

Just the Facts, from multiple news sources.

Should you really trust health advice from an AI chatbot?

Summary

People are using AI chatbots like ChatGPT for health advice because they are easy to access and can give quick answers. However, studies show these chatbots often give wrong or misleading medical advice when people share incomplete or unclear information.

Key Facts

  • Abi, a user from Manchester, uses ChatGPT for health advice and finds it more personalized than internet searches.
  • ChatGPT sometimes gives helpful advice, like recommending a pharmacist visit for a urinary tract infection.
  • In a more serious case, ChatGPT incorrectly advised Abi to go to A&E urgently, resulting in an unnecessary wait at the hospital.
  • England’s Chief Medical Officer, Prof Sir Chris Whitty, says chatbot answers can sound confident but are often wrong.
  • Research by Oxford’s Reasoning with Machines Laboratory found chatbots were 95% accurate when given complete, detailed information.
  • When people interact with chatbots naturally and give only partial information, accuracy drops to 35%.
  • The human conversational style, in which information is shared piecemeal or incompletely, leads chatbots to make mistakes.
  • Serious conditions like brain bleeding can be missed or misdiagnosed by chatbots depending on how symptoms are described.
This is a fact-based summary from The Actual News; the complete story is available from the original source.