AI CEO reveals why relying on AI for health advice can be dangerous


Perhaps AI isn’t worth its salt when it comes to health advice.

A surprising medical case report published last month revealed that a 60-year-old man with no history of psychiatric or medical conditions was hospitalized with paranoid psychosis and bromide poisoning after following ChatGPT’s advice.

The unidentified man wanted to cut sodium chloride (table salt) from his diet. After consulting the AI chatbot, he spent three months substituting sodium bromide, a toxic compound. Bromine can replace chlorine in cleaning and sanitation products, but it is not meant for human consumption.

Andy Kurtzig, CEO of the AI-powered search engine Pearl.com, outlines the ways AI can go wrong when offering medical advice to a consumer. Courtesy of Pearl.com

“[It was] exactly the kind of error a licensed healthcare provider’s oversight would have prevented,” Andy Kurtzig, CEO of the AI-powered search engine Pearl.com, told The Post. “[That] case shows just how dangerous AI health advice can be.”

In a recent Pearl.com survey, 37% of respondents reported that their trust in doctors has declined over the past 12 months.

Suspicion of doctors and hospitals isn’t new, but it has intensified in recent years amid conflicting pandemic guidance, concerns over financial motives, poor quality of care and discrimination.

Skeptics are turning to AI instead, with 23% saying they believe AI’s medical advice over a doctor’s.

That worries Kurtzig. The AI CEO believes AI can be useful, but that it doesn’t and can’t substitute for the judgment, ethical accountability or lived experience of medical professionals.

Mistrust of the healthcare community has increased significantly since the start of the COVID-19 pandemic. Valeria Venezia – stock.adobe.com

“Keeping humans in the loop isn’t optional — it’s the safeguard that protects lives,” he said.

Indeed, 22% of the Pearl.com survey takers admitted they had followed health guidance that later proved wrong.

There are a number of ways that AI can go awry.

A Mount Sinai study from August found that popular AI chatbots are highly vulnerable to repeating and even expanding on false medical information, a phenomenon known as “hallucination.”

“Our internal studies reveal that 70% of AI companies include a disclaimer to consult a doctor because they know how common medical hallucinations are,” Kurtzig said.

“At the same time, 29% of users rarely double-check the advice given by AI,” he continued. “That gap kills trust, and it could cost lives.”

Kurtzig noted that AI may misread symptoms or miss signs of a serious condition, leading to unnecessary alarm or a false sense of reassurance. Either way, proper care may be delayed.

In a recent Pearl.com survey, 23% of respondents reported believing AI’s medical advice over a doctor’s. Richman Photo – stock.adobe.com

“AI also carries bias,” Kurtzig said.

“Studies show it describes men’s symptoms in more severe terms while downplaying women’s, exactly the kind of disparity that has kept women waiting years for diagnoses of endometriosis or PCOS,” he added. “Instead of fixing the gap, AI risks hard-wiring it in.”

And finally, Kurtzig said AI can be “downright dangerous” when it comes to mental health.

Experts warn that using AI for mental health support poses significant risks, particularly for vulnerable people.

AI has been shown in some cases to give harmful responses and reinforce unhealthy thoughts, which is why it’s important to use it thoughtfully.

Pearl.com (shown here) has human experts verify AI-generated medical responses.

Kurtzig suggests using AI to help frame questions about symptoms, research and common wellness trends ahead of your next appointment, while leaving diagnosis and treatment decisions to the doctor.

He also highlighted his own service, Pearl.com, which has human experts verify AI-generated medical responses.

“With 30% of Americans reporting they cannot reach emergency medical services within a 15-minute drive from where they live,” Kurtzig said, “this is a great way to make professional medical expertise more accessible without the risk.”

When The Post asked Pearl.com whether sodium bromide could replace sodium chloride in someone’s diet, the response was: “I absolutely would not recommend replacing sodium chloride (table salt) with sodium bromide in your diet. This would be dangerous for several important reasons…”

