One of my favourite movies; we need more finance movies considering the outsized direct and indirect impact the financial industry has on our lives + it hits closer to home too.
My takeaway isn't how useful ChatGPT is, but how useless the doctor is, and by extension, how poor a job the field of medicine does at training doctors to have better bedside manner.
Regardless of my state of agitation, I would most definitely not want to be read a script prepared by an AI. In fact, were I to find out after the fact, I'd be even more incensed.
I passed this article & discussion on to my physician spouse, who would specifically like me to reply to this comment: "You are being very emotional."
Digging deeper (as the comment above alone isn't worthy of HN, IMO): "People believe what they want to believe because of small sample sizes. When there isn't a black & white answer people go back to their personal experience, which is limited. They will not be right in the way you want to be right if you're practicing medicine." The family members are certainly exhibiting this. Are any of us, in this discussion?
Additional question: "Who is the patient? Is it the anxious family members who are being treated, or the person with fluid in the lungs?" The person with fluid in the lungs is the patient and the doc needs to be spending time on that person (the one who will die if not treated correctly). The family members are important but not in need of medical care. They can be handed off to an AI in the short term.
From my own vantage point, I've observed that the context switching between "caring for patient" and "dealing with family" can put a substantial drag on physicians' cognitive function. They do need to suck it up and deal with it: family members are the primary way patients receive post-release care in the US, given the non-existent safety net, and family presence can really improve outcomes. Still, I'm not surprised that ChatGPT helps in producing the sort of bland prose that is well-received by family members.
This sort of response certainly seems indicative of a doctor with horrifically bad bedside manner. Families tend to be emotional when the health of their loved ones is at stake. I'm skeptical that a chatbot would help someone who can't grasp that.
The response is pretty clear and 100% correct: family is important but not in need of medical care. Making doctors focus too much attention on the needs of third parties will worsen the outcomes for actual patients.
Doctors are, on average, quite empathetic people. But humans in general are what they are, and the politics and economics of medicine prevent the doctors from saying the right thing[0] to overbearing or emotionally distraught third parties, so I can easily see how inserting a chatbot in between would improve the outcomes of actual patients.
(Though I am surprised there isn't dedicated staff for that yet. Or does it not work when people know they're talking to a PR doctor instead of the one treating their loved one?)
There is something comforting about how ChatGPT never becomes overly emotional. I'm not surprised it outperforms humans in bedside manner except on occasions where the human is particularly adept.
The cold metal of a machine is nothing next to the coldness of a doctor's heart! ;)