**Can Generative AI Emulate and Improve Upon Our Basic Humanity in Healthcare?**
Generative artificial intelligence (AI) has made significant advances in medicine, raising the question of whether anything about our basic humanity remains that AI cannot emulate or even improve upon. Recent achievements, such as Google's Med-PaLM 2 scoring an "expert-level" 86.5% on questions styled after the U.S. Medical Licensing Examination and ChatGPT writing clinical notes indistinguishable from human-written ones, have challenged skeptics who believed AI could never replace certain human professions. In healthcare, however, clinicians argue that chatbots will never match their levels of compassion, empathy, and trustworthiness, which they consider distinctly human traits. New research suggests that machines are rapidly gaining ground in these areas as well.
**AI Now Boasts Strong EQ**
At the University of Texas at Austin, a team was tasked with helping clinicians communicate more compassionately and effectively with patients who abuse alcohol. When the effort stalled, the department head turned to ChatGPT for assistance. The generative AI system wrote an excellent letter that was sincere and considerate, and it avoided the medical jargon that often undermines patient adherence to treatment plans. Social workers at the university then asked ChatGPT to rewrite the communication at a fifth-grade reading level and translate it into Spanish, further improving its clarity and appropriateness. Other clinicians who have used chatbots to script empathetic remarks for patients have been equally impressed, with one doctor commenting that the results "blew me away."
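The prompting pattern the social workers used can be sketched in a few lines. The function below is a minimal illustration, not the team's actual workflow: it only composes the instruction a chat model would receive (the reading-level phrasing, jargon guidance, and sample letter are all assumptions for illustration; actually sending the prompt requires an API key, so that step is shown only as a comment).

```python
def build_rewrite_prompt(letter: str, grade_level: int = 5,
                         language: str = "English") -> str:
    """Compose a single instruction asking a chat model to simplify
    (and optionally translate) a patient-facing letter."""
    instructions = [
        f"Rewrite the letter below at a grade-{grade_level} reading level.",
        "Avoid medical jargon; keep the tone warm and respectful.",
    ]
    if language != "English":
        instructions.append(f"Then translate the result into {language}.")
    # Separate the instructions from the letter with a visible divider.
    return "\n".join(instructions) + "\n\n---\n" + letter

# Hypothetical letter text, mirroring the fifth-grade + Spanish request above.
prompt = build_rewrite_prompt(
    "Your hepatic panel indicates elevated transaminases...",
    grade_level=5,
    language="Spanish",
)
print(prompt)

# To obtain the rewrite, the prompt would be sent to a chat model,
# for example via the OpenAI chat API (model name is an assumption):
#   from openai import OpenAI
#   reply = OpenAI().chat.completions.create(
#       model="gpt-4o",
#       messages=[{"role": "user", "content": prompt}],
#   ).choices[0].message.content
```

Keeping prompt construction separate from the API call makes the instruction easy to audit before any patient-facing text is generated.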
**How Doctors Learn (And Unlearn) Empathy**
Empathy and compassion have long been considered innate human traits, though they can be fostered and developed over time. Medical school applicants often cite a desire to help people, make connections, improve lives, and serve the underserved as their motivation for pursuing careers in medicine. The culture of medicine, however, can erode these qualities over the course of clinical training. As young doctors observe their professors and attending physicians, they learn that non-clinical details about a patient's life, and time spent connecting with concerned family members, are often dismissed as unimportant, a waste of time compared with studying textbooks and mastering technical skills. After years of such neglect, these "softer skills" diminish.
**The Reality Of Medical Practice Today**
Physicians acknowledge the importance of the doctor-patient relationship but find it increasingly difficult to invest time in cultivating that bond amid the demands of modern healthcare. Economic pressures push physicians to see more patients each day, leaving limited time for each individual. On average, doctors spend a mere 17.5 minutes with each patient and tend to interrupt within 11 seconds in the name of efficiency. While doctors genuinely care about their patients, these rapid-fire exchanges can leave patients feeling rushed and uncared for. Many patients report encountering doctors who lacked compassion and say they felt hurried through their appointments.
**How Tech Bests Humans Emotionally**
Contrary to clinicians' intuitions, a recent study published in JAMA Internal Medicine found that AI-generated responses to medical questions were judged more nuanced, accurate, and empathetic than those provided by doctors. Healthcare professionals, blinded to whether each answer came from a human or a bot, rated 80% of AI responses as better than the human doctors'. Most striking, while fewer than 5% of doctor responses were rated empathetic, 45% of AI responses were. Current versions of generative AI still have real limitations, however, such as relying on outdated medical data and occasionally providing incorrect information.
**The Future of AI in Healthcare**
Generative AI models continue to progress, becoming faster, smarter, and more powerful. As these models learn and improve, they will not only become more accurate but also more empathetic. While the majority of patients still prefer human doctors over AI, the growing physician shortage and the advancements in AI technology may change this preference. Patients may turn to AI for timely medical care when the answers they receive are accurate and compassionate. Already, doctors are utilizing generative AI to assist with various healthcare tasks, from writing letters to insurers to double-checking diagnoses. However, if doctors fail to demonstrate empathy, sympathy, and respect in ways that build patient trust, generative AI may fill the gap, gradually reducing the role of humans in medical care provision.
In conclusion, generative AI has shown significant potential to emulate, and in some respects improve upon, aspects of our basic humanity in healthcare. While doubts remain about its ability to fully replace human clinicians, its capacity for compassion, empathy, and understanding continues to evolve. As the technology matures, patients may increasingly turn to AI for medical expertise, especially when timely care is hard to access. The human role in healthcare may diminish over time, making it essential to balance AI's growing capabilities with the human connection and understanding at the heart of the doctor-patient relationship.