In a rare and concerning case, a man in the US became seriously ill with bromide poisoning after following ChatGPT’s dietary recommendations.
According to a Gizmodo report, doctors think this might be the first instance of bromide poisoning linked to artificial intelligence.
Physicians from the University of Washington described the case in “Annals of Internal Medicine: Clinical Cases.”
According to them, the man believed sodium bromide was a safe substitute for the chloride in his diet, so he consumed it for three months. ChatGPT reportedly gave him this advice without alerting him to the risks.
Bromide compounds were removed from anxiety and insomnia medications decades ago because of serious health hazards.
Today, veterinary drugs and a few industrial products are the main sources of bromide, and bromide poisoning, known as bromism, is now rare in humans.
The man initially believed his neighbor was poisoning him and went to the emergency room. His vital signs were normal, but he was paranoid, experiencing hallucinations, and refusing water despite being thirsty.
Doctors had to put him under an involuntary psychiatric hold as his condition rapidly deteriorated into a psychotic episode.
He began to improve after receiving intravenous fluids and antipsychotic medication. Once he stabilized, he told the physicians that he had asked ChatGPT for table salt substitutes.
The AI reportedly recommended bromide as a safe option, and he followed the suggestion without realizing it was dangerous.
The man’s original chat logs were not available to the doctors, but when they later posed the same question to ChatGPT, it again mentioned bromide without warning that it is harmful to humans.
According to experts, this demonstrates how AI can deliver information devoid of appropriate context or health risk awareness.
After three weeks in the hospital, the man fully recovered, and he was in good health at a follow-up visit. Doctors have cautioned that although AI can make scientific information more accessible, it should never replace professional medical advice because, as this case demonstrates, it can occasionally give catastrophically wrong guidance.