A 60-year-old man was recently hospitalized after taking dietary advice from ChatGPT. Concerned about the harmful effects of table salt (sodium chloride), he asked the AI chatbot for a substitute. Assuming the suggestion was safe, he replaced all the salt in his diet with sodium bromide, an outdated compound long since withdrawn from medical use because of its toxicity.
Over the course of three months, the man developed alarming neuropsychiatric symptoms: paranoia, hallucinations, confusion, and refusal to drink water despite extreme thirst. He also exhibited skin problems and coordination difficulties. These signs led doctors to diagnose him with bromism, a rare toxic syndrome caused by the gradual accumulation of bromide in the body.
Fortunately, with prompt treatment, including fluids, electrolytes, and antipsychotic medications, the man stabilized and made a full recovery. He spent approximately three weeks under medical care and was discharged in stable condition. The case, published in Annals of Internal Medicine Clinical Cases, underscores the critical need for professional medical guidance and highlights the dangers of relying on AI for health-related decisions without expert oversight.