Prioritizing Physician Mental Health: A Critical Component of Effective Patient Care
When Dr. Dereck Paul was training as a medical student at the University of California San Francisco, he couldn’t help but notice how outdated and analog the hospital’s record-keeping systems were. According to him, the computer systems used in the hospital looked like they’d time-traveled from the 1990s, and to his amazement, many of the medical records were still kept on paper. This experience inspired him to found a San Francisco-based startup called Glass Health, which is now among a handful of companies hoping to use artificial intelligence chatbots to offer services to doctors.
Dr. Paul is convinced that AI chatbots can significantly reduce the paperwork burden physicians face in their daily lives, thereby improving the patient-doctor relationship. “We need doctors who are not in burnt-out states, trying to complete documentation. Patients need more than 10 minutes with their doctors,” he says. However, some independent researchers have expressed concerns that a rush to incorporate the latest AI technology into medicine could lead to errors and biased outcomes that might harm patients.
Pearse Keane, professor of artificial medical intelligence at University College London, UK, says he is both excited and cautious about the potential use of AI in medicine. He argues that any technology involved in decision-making about a patient’s care must be treated with extreme caution for the time being.
Dr. Paul co-founded Glass Health in 2021 with Graham Ramsey, an entrepreneur who had previously started several healthcare tech companies. The company began by offering an electronic system for keeping medical notes. However, when ChatGPT appeared on the scene last year, Dr. Paul was initially dismissive of it. “I looked at it and thought, ‘Man, this is going to write some bad blog posts. Who cares?’” he recalls.
Despite Dr. Paul’s initial skepticism, younger doctors and medical students were using ChatGPT and saying it was pretty good at answering clinical questions. Then users of his own software began asking about it, and he took notice. Dr. Marc Succi, a physician at Massachusetts General Hospital who has evaluated how the chatbot performs at diagnosing patients, warns that doctors should not use ChatGPT by itself to practice medicine. He says that when presented with hypothetical cases, ChatGPT could diagnose patients with accuracy approaching that of a third- or fourth-year medical student. However, he adds that the program can also hallucinate findings and fabricate sources. “I would express considerable caution using this in a clinical scenario for any reason, at the current stage,” he says.
Despite the concerns, Dr. Paul believes that the underlying technology can be turned into a powerful engine for medicine. Alongside his colleagues, he has created a program called “Glass AI,” based on ChatGPT. When a doctor tells the Glass AI chatbot about a patient, it can suggest a list of possible diagnoses and a treatment plan. Rather than working from the raw ChatGPT information base, the Glass AI system uses a virtual medical textbook written by humans as its main source of facts. Dr. Paul believes this makes the system safer and more reliable. “We’re working on doctors being able to put in a one-liner, a patient summary, and for us to be able to generate the first draft of a clinical plan for that doctor. So what tests they would order and what treatments they would order,” he says.
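Glass Health has not published how Glass AI is built, but the grounding approach Dr. Paul describes, answering from a human-written virtual textbook rather than from ChatGPT’s raw training data, resembles what is often called retrieval-augmented generation. The Python sketch below is a hypothetical illustration of that pattern, not Glass Health’s code: the textbook entries, the naive keyword retriever, the prompt wording, and the model name are all assumptions.

```python
# Hypothetical sketch of a retrieval-grounded clinical-draft pipeline.
# Everything here (textbook snippets, retriever, prompt, model) is illustrative.
from openai import OpenAI

# Stand-in for the human-written "virtual medical textbook" the article describes.
TEXTBOOK = {
    "community-acquired pneumonia": (
        "Order chest X-ray, CBC, and basic metabolic panel; assess severity "
        "with CURB-65; start empiric antibiotics per local guidelines."
    ),
    "acute asthma exacerbation": (
        "Assess peak flow and oxygen saturation; give inhaled short-acting "
        "beta-agonist and systemic corticosteroids; reassess response."
    ),
}

def retrieve(one_liner: str) -> str:
    """Toy keyword retrieval: return textbook entries whose title words
    appear in the one-line patient summary."""
    summary = one_liner.lower()
    hits = [text for title, text in TEXTBOOK.items()
            if any(word in summary for word in title.split())]
    return "\n".join(hits) or "No matching textbook entry."

def draft_plan(one_liner: str) -> str:
    """Ask the model for a draft plan, constrained to the retrieved text."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    context = retrieve(one_liner)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": "Draft a clinical plan for a physician to review. "
                        "Use ONLY the reference text provided; if it is "
                        "insufficient, say so rather than guessing."},
            {"role": "user",
             "content": f"Reference text:\n{context}\n\nPatient: {one_liner}"},
        ],
    )
    return response.choices[0].message.content

print(draft_plan("58-year-old man with fever, productive cough, and dyspnea"))
```

In a production system the keyword lookup would presumably be replaced by proper semantic search over the curated textbook, and, consistent with Dr. Paul’s framing, the output would be presented only as a first draft for the physician to review and edit.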
Dr. Paul believes Glass AI will help address the huge need for efficiency in medicine. Doctors are stretched everywhere, he says, and paperwork is slowing them down. “The physician quality of life is really, really rough. The documentation burden is massive,” he says. “Patients don’t feel like their doctors have enough time to spend with them, and physicians often don’t feel like they have enough time to properly care for their patients.” That time pressure can contribute to burnout, depression, and other mental health issues among physicians.
In addition to long hours and paperwork, physicians face the stress of making life-or-death decisions every day. They routinely confront their patients’ physical and emotional pain, and it can be incredibly difficult to separate themselves from their work.
Furthermore, the healthcare system in many countries is based on a fee-for-service model, which pays doctors for the number of procedures they perform rather than for the quality of care they provide. This can leave doctors feeling pressured to order unnecessary tests or treatments in order to increase revenue.
All of these factors can contribute to a strained and overburdened healthcare system, one in which patients do not receive the care they need and doctors feel frustrated and burned out. Addressing these issues will require a multi-faceted approach: changes to the healthcare system itself, along with support for the mental and emotional well-being of physicians.