Amazon's new Health AI Chatbot is rife with misuse potential

Amazon logo on the facade of one of its corporate office buildings in Silicon Valley, San Francisco Bay Area (Image credit: Shutterstock / Sundry Photography)

Amazon has launched its new Health AI service in the US: a chatbot that can help you understand, diagnose and treat health conditions. Available this year to all US Amazon Prime subscribers, it can discuss your health issues with you, recommend remedies and health products, and connect you with doctors.

Let's get this out of the way: I think AI has a place in healthcare. As medical staff battle creaking hospital infrastructure and overwhelming demand, patients suffer long waitlists and (in the US, at least) seemingly ever-rising costs brought on by for-profit pharmaceutical and medical industries. AI has the potential to ease all of those problems.


What exactly is Amazon Health AI?

Meet Amazon Health AI: a personalized health agent that connects you to One Medical providers (YouTube)

Amazon describes Health AI as "an agentic AI health assistant designed to make health care easier". It says Health AI is "designed to be a personalized health agent that knows you and your medical history so it can provide more helpful responses and take meaningful action, including connecting you to the professionals, treatments, and account services you need to get and stay well".

These services include recommending you Amazon Pharmacy products and connecting you to healthcare providers (specifically from Amazon's One Medical group). With permission, Amazon can also access your medical records and have the chatbot discuss them with you.

Amazon insists security is tight: it says Health AI is a "HIPAA-compliant" environment, referring to the US Health Insurance Portability and Accountability Act. This means all your protected health information (PHI) is treated as it would be at a doctor's office, subject to the same legal privacy requirements.

What are the potential risks?

Amazon Pharmacy (Image credit: Amazon Pharmacy)

The HIPAA Journal, in an article about AI published last year, said that healthcare providers and vendors using AI run risks as a result of the technology. It stated that "lurking surreptitiously behind the potential benefits of using PHI in AI technology lies a murky mix of risks that could negatively impact [healthcare providers], your vendors, and even your patients, especially when HIPAA compliance and patient PHI are involved."


Some of these risks stem from AI models requiring huge corpora of data to train on. If you're building a health chatbot, you need a lot of health data to do it, and Amazon, one of the world's biggest collectors of data, sees the value in your personal information.

Amazon insists "Protected health information from Amazon One Medical and Amazon Pharmacy is not used in the broader Amazon store to market general merchandise or by Amazon Ads, and Amazon does not sell customers' personal data" and "we only use protected health information for purposes permitted under HIPAA".

However, it's not hard to see the conflict of interest: the chatbot will tell you the problem, then sell you the solution from Amazon's own storefront, or connect you to Amazon's own medical providers. And while your health information won't be used in Amazon's general store, Amazon said nothing about using it for marketing within Amazon Pharmacy itself. Did you tell the chatbot you're having trouble sleeping? Expect deals on over-the-counter sleep aids to be surfaced.

While Amazon will record your data to train future versions of its AI, it does say it will remove names. However, name removal by itself isn't proper anonymisation. Meta was found to be able to link Facebook accounts to users of the period-tracking app Flo, even after names and other identifying account information had been removed, thanks to a unique identification number; a court found it had been spying on Flo users. I bring this up because I wouldn't trust name removal alone to anonymise accounts when chat logs are harvested to train the next generation of AI.

If there's a data breach, or Amazon uses this data in a way that isn't HIPAA compliant, I also doubt any punishment could be effectively enforced. The Facebook court case mentioned above took many years to reach a verdict, and still no punishment has been handed out. Amazon is simply too big to fine, and too big to discipline, especially as its data centers are used by everyone from governments to grocers.

Big tech's desire to help build a happy, healthy populace is second to its desire for the profits it could wring out of your personal data.

With that in mind, Amazon isn't doing a good enough job at convincing me its new Health AI service is safe, private and in my best interests. The state of New York obviously feels the same way: it's in the process of blocking AI chatbots from giving legal or medical advice.


Matt is TechRadar's expert on all things fitness, wellness and wearable tech.

A former staffer at Men's Health, he holds a Master's Degree in journalism from Cardiff and has written for brands like Runner's World, Women's Health, Men's Fitness, LiveScience and Fit&Well on everything fitness tech, exercise, nutrition and mental wellbeing.

Matt's a keen runner, ex-kickboxer, not averse to the odd yoga flow, and insists everyone should stretch every morning. When he’s not training or writing about health and fitness, he can be found reading doorstop-thick fantasy books with lots of fictional maps in them.
