One note. Courtney says that "While Chat GPT is credited with solving my son’s ‘problem’ … I truly credit a comprehensive medical file as the reason we found his diagnosis. I hope someday to include artificial intelligence in PC Medfolio to allow others to utilize AI in a similar way that I did and I am closely watching legislation to implement it at that time."
Another note: There is no surprise in hearing a healthcare-related professional "authority" say that "It is just too early for us to give generative AI to patients and expect that the results will be good." There is nothing more dangerous to these people than autonomous patients. He confirms it with a ridiculous statement that would make you believe he has the authority to tell me what I can and cannot do with LLMs. I am sure he is making a lot of money from his wide expertise, but optimizing life expectancy or even patient self-assessment is definitely NOT part of what he does.
Great illustration of the potential power of ChatGPT with the right support: 1) careful curation of extensive data by the patient and caregiver (mother), 2) using the tool to generate hypotheses about potential causes (a differential diagnosis list), and 3) a human expert in the loop with the patient to confirm the data findings and likely diagnosis, order any further confirmatory tests, and inform treatment options in light of the patient's situation and preferences (shared decision making). The patient knows more about their symptoms and their impact over time, and has more incentive to devote whatever time is necessary to solve the puzzle. Doctors have limited time and knowledge, and often don’t pay enough attention to what the patient is saying. The patient is in charge of their healthcare; they should be prepared to manage their physicians and to insist on what is important.
Hi Charlie! Thanks for the thoughtful comment. I believe what you're describing is what we call participatory medicine - patients and clinicians both contributing to help healthcare achieve its potential.