CHICAGO — The healthcare industry isn’t known for quick adoption of new technologies. But as the sector faces mounting challenges — like stressed providers, high costs and staffing shortages — executives are hopeful artificial intelligence could offer some relief.
Still, even as leaders look to adopt AI, the implementation process isn’t easy. And health systems will need to get their workforces on board to ensure success, experts said Thursday at the HIMSS AI Leadership Strategy Summit in Chicago.
The technology tends to look like a black box to some healthcare professionals, said Dr. Everett Weiss, medical director for health informatics at Rochester Regional Health, during a panel discussion. Doctors might not understand how AI works or generates recommendations.
That could prove challenging for some providers, given concerns that AI tools could offer inaccurate, misleading or biased information, potentially putting patient care at risk.
Clinicians’ credibility is on the line — so ensuring they can trust the tools is key, he added.
“Being able to build up that trust is hugely important,” Weiss said. “It is very easy to tear down trust. It is very hard to build that back up.”
That means health systems need to ensure their AI tools are reliable by testing them thoroughly before full adoption and educating users on how the products work, experts said.
Keeping a “human in the loop” — ensuring a person is involved in an AI workflow to double check its work — can help too, said Susan Fenton, vice dean for education for the department of clinical and health informatics at the University of Texas Health Science Center at Houston.
“No AI is licensed to practice. None, zero,” she said.
Where AI can succeed
One way to identify promising use cases for AI is to find areas where the technology can complement human strengths, said Sagar Parikh, vice president of operations excellence and innovation at revenue cycle management company Ensemble Health Partners.
For example, an AI tool is probably much faster than a human worker at sifting through an entire electronic health record and pulling data points to support a response to a claim denial. But a nurse is likely better at providing the clinical context and choosing which facts are the most important, he said.
“That’s where I see language models and AI generally to be helpful,” Parikh said. “Doing the things that a human is historically not great at, and pairing it with things that humans are historically great at.”
Alleviating some of clinicians’ heavy administrative burdens is another common use case for AI, experts said. Providers have said they spend long hours on documentation and other EHR tasks, taking time away from patient care and eating into after-work hours.
AI documentation assistants are perhaps the “poster child” for improving clinician experience, Fenton said. But during a notetaking pilot in some clinics at UT Health Houston, the assistant saved only about nine minutes per provider per day — not a huge return on investment, she noted.
Still, the system decided to deploy the AI tool across the institution, because patient and provider satisfaction with the documentation assistant was high. That highlights the value of a pilot that measures metrics outside of finances, she added.
“It’s actually truly now improving that patient and provider experience,” Fenton said. “So I think that those are the things that we need to look for.”
Not all healthcare organizations need to be trailblazers when it comes to choosing an AI project either, Weiss said. If an AI project is doing well at other health systems, that could be a good place to start.
Additionally, organizations can look to areas where AI has already been in use for a while, like radiology, where the technology triages cases that need more immediate attention, he said.
Still, health systems should consider that not all AI tools work for every clinician. For example, while an AI notetaking assistant was popular with 9 out of 10 providers at his system, some clinicians didn’t see efficiency gains, Weiss said.
“We should not be forcing technology on our providers. These are tools to help them do their jobs even better,” he said. “We talk about the joy of medicine a lot in our organization, and so we want technology to be able to bring back that joy for all of our providers.”