60-year-old man asked ChatGPT for nutritional advice and ended up with a rare disease: bromism
A 60-year-old man ended up with bromism, a rare and ancient disease, after following ChatGPT's advice to replace salt in his diet
It is undeniable that artificial intelligence (AI) has become an everyday tool for answering quick questions, from how to plan a trip to which foods to include in a diet. However, it is not always safe to follow its recommendations to the letter.
This is demonstrated by the case of a 60-year-old British man who ended up developing bromism, a rare disease, after consulting ChatGPT on how to reduce the salt intake in his meals.
The patient, concerned about his health, asked the AI for an alternative to common salt, with the goal of reducing his sodium consumption. The response was surprising: sodium bromide, a chemical compound that stopped being used in cooking more than 35 years ago and is harmful to the body, according to the journal Annals of Internal Medicine.
Currently, bromide is used mainly in swimming pool maintenance. Although the substance is toxic, the man trusted the recommendation and consumed it consistently for three months.
Over time, he began to suffer symptoms such as fatigue, insomnia, acne, excessive thirst, and motor coordination problems. His mental state also deteriorated: he developed episodes of paranoia that required him to be admitted to a psychiatric facility.
After a series of tests, doctors concluded that he suffered from bromism, a poisoning that was considered almost eradicated in the 21st century.
Origin of Bromism
This illness was common between the 19th and 20th centuries, when bromide salts were prescribed to treat headaches, epilepsy, and anxiety. For decades, thousands of patients suffered serious side effects.
At its peak, up to 8% of psychiatric hospitalizations were linked to bromide use. Ultimately, the U.S. Food and Drug Administration (FDA) banned its use, closing that era of medicine.
In the case of this British patient, rapid medical intervention was key. Thanks to intensive intravenous fluid therapy and electrolyte correction, the man recovered in just three weeks.
Although he recovered, the episode left a valuable reminder: AI is no substitute for professional medical advice. And because he never asked about the risks of bromide, the chatbot never warned him of its dangers.