GPT saved me yesterday. It helped me identify and verify a rare, undocumented three-way drug interaction that was causing me real distress: hypomagnesemia and serious arrhythmia caused by a combination of berberine, famotidine, and vonoprazan. This was despite magnesium supplementation.
Two months ago it helped me accurately identify a diverticulitis-type gastrointestinal issue and find the right medication for it (metronidazole), which fixed the problem for good. It also guided me in identifying the cause and in pausing and then restoring fiber intake appropriately.
Granted, it is very easy for people to make serious mistakes when using LLMs, but given how many mistakes doctors make, it is better to take some responsibility for oneself first. The road to a useful diagnosis can be winding, but with sufficient exploration, GPT will get you there.
The bad idea is to live and die in ignorance. The good idea is to use GPT to find ideas and references that one can then verify. If it were up to the medical establishment, they would block the public from accessing medical research altogether, and they already do so in part by paywalling much of it.
Generally a bad idea. If you want to be a doctor, go to medical school.