The good, bad, and ugly of AI healthcare, according to a doctor who uses AI




ZDNET's key takeaways 

  • People are turning to AI for health advice. 
  • It can get a lot wrong. 
  • One doctor offers her advice on using AI. 

You can find health advice anywhere these days, regardless of credibility or medical expertise. 

This increased information availability has changed how people interact with medical professionals -- or whether they trust them in the first place. This broader access to health-related guidance also arrives amid historically low levels of trust in the healthcare system. A new poll from the Annenberg Public Policy Center finds that public trust in federal agencies like the Centers for Disease Control and Prevention, the Food and Drug Administration, and the National Institutes of Health decreased by 5-7% over the past year. 

Whether or not the tech world is capitalizing on this declining trust, it's certainly making medical alternatives more convenient. The reality is that people are turning to this often free, always available, and quick-to-use technology for answers that a doctor or medical professional would once provide. According to the same Annenberg poll, 63% of respondents find AI-generated health information reliable.

Also: Oura built a women's health AI using clinical research - how to try it

Google, OpenAI, and Anthropic, three of the major AI players, have built large language models (LLMs) tailored to healthcare professionals. Rumors are circulating that Apple could be developing its own health AI, and Oura just launched an experimental custom women's health LLM. 

For Dr. Alexa Mieses Malchuk, the technology has changed how her patients interact with her -- and how this family physician does her job.   

AI can give users thorough explanations and answers to every health query under the sun. But it can also get a lot wrong. In an interview with ZDNET, Mieses Malchuk discussed the usefulness and pitfalls of health AI, and how patients should approach the technology.

How she uses AI 

Mieses Malchuk isn't AI-intolerant. In fact, she uses it to streamline administrative work, such as triaging patient messages and creating anticipatory guidance before a visit. AI companies continue to build more software for doctors and medical professionals. Just last week, Amazon and Google announced their own healthcare software products for scheduling doctors' appointments, clinical documentation, and medical coding. Administrative burdens in medicine have historically been an issue for doctors, who report spending more time completing paperwork than serving patients face-to-face. 

Also: OpenAI, Anthropic, and Google all have new AI healthcare tools - here's how they work

"There are really neat and cool things like that happening all over healthcare that have kind of streamlined the work of a primary care physician," Mieses Malchuk explained. Still, she's aware of the technology's limitations. 

AI as a springboard

For medical nonprofessionals, she recommends using AI as a springboard, not as the be-all and end-all of medical advice. An instant answer from one of these chatbots can be satisfying, and its response can provide a sense of certainty that assuages worries. But she reminds users that these tools cannot diagnose conditions -- and that most patients sifting through these responses aren't medically trained to tell right from wrong. 

AI chatbot users may omit important information about their medical situation, leading to a fundamentally different diagnosis or treatment, Mieses Malchuk said. "Their responses are only as good as the questions we ask." 

"It's not that people without medical training shouldn't have access to AI. They should be partnering with their primary care physician to help sift through what they're finding online." 

Also: The Apple Watch missed my hypertension - but this blood pressure wearable caught it immediately

As these AI health tools have grown in popularity, she's seen patients arrive less willing to share that they've done their own research with these tools -- but more certain about what they believe their diagnosis to be.

"Even in medicine, there's not always 100% certainty about anything. On one hand, it's great that we live in this day and age where we have access to information literally at our fingertips, but there are some real downsides to that," she noted. 

Mieses Malchuk fears AI tools like ChatGPT could give people a false sense of security, telling people they don't have to go to the doctor or get a condition examined. "That could be a missed opportunity to diagnose something early," she said. 

A recent study in Nature found that ChatGPT undertriaged more than half of gold-standard emergency cases, directing patients to a 24- to 48-hour evaluation rather than the emergency department. "Our findings reveal missed high-risk emergencies and inconsistent activation of crisis safeguards, raising safety concerns that warrant prospective validation before consumer-scale deployment of artificial intelligence triage systems," the authors write. 

How AI can help patients 

Mieses Malchuk recommends using AI health tools for general wellness advice. Maybe a patient was recently diagnosed with celiac disease and wants to know which foods they should and shouldn't eat. AI can create a meal plan, generate ideas, and provide helpful recommendations. It's also useful for workout planning, since a customized regimen is easy to generate with an AI tool. 

Also: Are AI health coach subscriptions a scam? My verdict after testing Fitbit's for a month

All in all, it's a great wellness tool for those without medical training. But leave the diagnostics and treatments to the professionals.

"Mistrust in the medical system is growing, which is really a travesty. We take this oath to first do no harm, so the idea that these other resources are giving patients this false sense of confidence and making them think they can completely bypass seeing a physician -- it's an unfortunate step point," Mieses Malchuk said.
