Earlier this month, a medical journal published an article on a case of bromism that developed after a man used AI for dietary advice. He followed the suggestion for some time, but had to be taken to the emergency room after developing psychiatric symptoms.
In simple terms, bromism is a rare toxic syndrome that can develop after prolonged exposure to bromide, a compound of the chemical element bromine. The research paper's abstract explains the term as follows:
“Ingestion of bromide can lead to a toxidrome known as bromism. While this condition is less common than it was in the early 20th century, it remains important to describe the associated symptoms and risks, because bromide-containing substances have become more readily available on the internet.”
When the 60-year-old man who had consulted AI about his diet was taken to the ER, he was experiencing hallucinations and paranoia. He was reportedly taking sodium bromide, which he had ordered online.
This happened after the man researched the downsides of consuming too much salt, and ChatGPT allegedly suggested that he could consume bromide in place of chloride. According to the paper:
“He was surprised that he could only find literature related to reducing sodium from one's diet. Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet.”
The paper also mentioned:
“For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning.”
What are the symptoms of bromism, according to the authors?
The authors noted that although bromide is no longer widely used in ingestible products, cases of bromism still occur, partly because awareness of the risks has faded and bromide-containing medicines and supplements are easy to buy online.
The paper argued that doctors should consider bromism when patients present with new neurological, psychiatric, or skin problems alongside unusual blood test results.
According to LiveScience, the authors ran a similar query on the AI platform, asking what chloride can be replaced with, and noted that the response included bromide. The paper says:
“Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do.”
With treatment over his three-week hospital stay, the man's chloride levels and anion gap gradually returned to normal, and his psychotic symptoms improved. His risperidone was tapered off before discharge.
While discussing the case of bromism in the paper, the authors cautioned that ChatGPT and other AI tools can produce scientifically inaccurate information.
They said AI could help connect scientists with the general public, but also risked sharing information without proper context. For example, a real doctor probably would not suggest sodium bromide as a replacement for table salt. They added that as AI use grew, health professionals would need to think about where patients were getting their health information.