ChatGPT Gave Dangerous Diet Advice, Leading to Rare 19th-Century Illness

Aug 13, 2025

A 60-year-old man has been hospitalized with a severe case of bromism, a condition so rare it is more likely to appear in Victorian-era medical textbooks than in a modern clinic, after following dietary advice from ChatGPT. The shocking case, detailed in a report published in the Annals of Internal Medicine, serves as a stark warning against relying on AI chatbots for medical or health advice.

The man, who had no prior psychiatric history, simply wanted to eliminate sodium chloride (table salt) from his diet and turned to ChatGPT for a substitute. Without offering any safety warning or asking why he wanted one, the chatbot suggested sodium bromide, a chemical used in swimming-pool treatment and industrial processes, not in food.

A Modern-Day Case of a Forgotten Disease
For three months, the man followed the AI's advice, sourcing sodium bromide online and using it in his cooking. This chronic exposure caused bromide to build up in his body, leading to a cascade of dangerous psychiatric and neurological symptoms.

When he was admitted to the hospital, he presented with severe paranoia, claiming that his neighbor was poisoning him. Over the next 24 hours, his condition worsened, and he began experiencing vivid visual and auditory hallucinations. Doctors initially struggled to diagnose his condition due to its rarity. They later found that his bromide levels were a staggering 200 times higher than the upper safety limit.

Bromism was a well-known condition in the late 19th and early 20th centuries when bromide salts were commonly prescribed for ailments like anxiety and headaches. The use of these compounds in medications was phased out decades ago due to their toxicity.

A Warning for the AI Era
This case highlights a serious vulnerability in AI chatbots: their inability to provide critical context and safety warnings. The authors of the case report noted that when they ran a similar query on an earlier version of ChatGPT, it also suggested bromide as a chloride substitute without issuing any clear warnings.

This incident underscores the distinction between general knowledge and professional advice. While AI tools can be valuable for informational purposes, they are no substitute for a qualified medical professional who can consider a patient's full medical history and provide safe, personalized guidance. OpenAI's own terms of use explicitly state that its output "should not be relied on...as a substitute for professional advice," a disclaimer that, as this case shows, can be a matter of life and death.
