As seen in Psychology Today.
ChatGPT stunned the world when it was released last November. The chatbot can generate seemingly accurate and even eloquent answers to difficult questions that typically require a thorough understanding of engineering, law, or medicine. Built on large language models and natural language processing, generative artificial intelligence (AI) systems like ChatGPT do more than respond to these questions; their answers seem to be the creations of a subject (i.e., an intelligent agent) rather than the output of a machine.
No surprise, this technology has inspired equal parts hope and consternation: it holds immense promise, but it also represents such a radical departure from the status quo that we have yet to grapple with its potential impact on employment, disinformation, the arts, and social isolation. These are not just questions for armchair philosophers; they are serious policy issues that will have irreversible effects on society if they are not addressed before implementation becomes more widespread.
This is true within the field of psychiatry as well. Generative AI can make clinical work less difficult, but it also has the potential to cause immense harm to our patients.
Where AI Can Help
The first thing to stress is that AI is a tool. Like all tools, it is designed to make work easier for humans and to allow us to accomplish more in less time. Given existing trends, such a tool could do a world of good for healthcare systems in the US.
More Face Time, Less Burnout
To better understand how clinicians spend their time, a 2016 study published in the Annals of Internal Medicine followed 57 physicians for 430 hours; 21 of the participants also completed after-hours diaries. The authors found that physicians spend 27% of their working time in direct face time with patients in a clinical setting and 49.2% of their day on desk work and electronic health records (EHR). Even while in the examination room, only 52.9% of their time was direct clinical face time. Meanwhile, the 21 participants who completed after-hours diaries reported spending an additional 1-2 hours of work, mostly on EHR tasks.
The study documents what anyone who works in healthcare has known for years: we are asked to do enormous amounts of administrative work, and it eats into the time we spend with patients as well as into our lives outside of clinical settings. Using AI tools to reduce the time spent on clerical and largely routine tasks would increase the time clinicians spend interacting directly with patients and would give them more time to socialize and rest. Given the high rate of burnout among clinicians even before the COVID-19 pandemic, this latter point could help struggling healthcare systems retain skilled medical professionals.
Patient Management
In addition to rising rates of burnout, the number of new clinicians entering the profession has stagnated. A 2022 report from Definitive Healthcare estimated that 333,942 healthcare providers (including 117,000 physicians and over 53,000 nurse practitioners) left the field in 2021, and they are not being replaced quickly enough to meet the additional needs of seniors and of individuals adversely affected by the pandemic. Even before the pandemic, the Association of American Medical Colleges estimated that by 2033 there would be a shortage of between 21,400 and 55,200 primary care physicians and a shortfall of between 33,700 and 86,700 nonprimary care specialists. Psychiatrists face additional problems: around 60% of them are 55 or older, meaning that more than half the profession will likely retire in the coming years. Consequently, the shortage of psychiatrists may exceed 30,000 by 2050, according to a 2018 paper in Psychiatric Services.
Tools that make clinicians more productive are absolutely necessary if we are to ensure that those who need care receive it. ChatGPT offers some solutions by allowing clinicians to quickly access information to improve diagnostics and to learn about new treatments. It may also help triage patients in emergency room settings.
Where AI Cannot Help
Though AI can be a very effective tool in the clinic and beyond, there are three distinct limitations to the technology's use in healthcare.
Diagnostic Hallucinations
The first is very straightforward: ChatGPT has been known to present “hallucinations” as accurate and faithful responses to questions. Writing in IEEE Spectrum, reporter Craig S. Smith describes these hallucinations as “mistakes in the generated text that are semantically or syntactically plausible but are in fact incorrect or nonsensical.”
While generative AI may be a useful tool for medical professionals capable of distinguishing fact from fiction, it cannot supplant sound medical judgment.
Tipping the Diagnostic Scale
Similarly, patients without any medical training may use ChatGPT to diagnose themselves. The danger here is twofold. First, ChatGPT may convince patients that they have a particular illness. Second, it may provide them with what is essentially a textbook list of symptoms for that illness. When patients report these symptoms, and only these symptoms, to their clinician, the clinician may misdiagnose them or follow a line of inquiry that delays a correct diagnosis.
The Empathy Gap
Finally, ChatGPT has the power to transform medicine, but it has no bedside manner. It has no capacity for empathy. Divorced from emotion and pain, from the ineffable and ontic experience of being human, it can only generate an ersatz therapeutic alliance. For someone who needs a comforting word of encouragement or a reminder of how to quickly lessen the severity of a panic attack, AI chatbots powered by ChatGPT may be useful tools, though they are still limited by their lack of empathy. However, for someone who needs assistance processing trauma or who is in crisis, such a chatbot simply is not up to the task and may do more harm than good.
This article was written and edited by a human with no assistance from Generative AI tools.