North Country HealthCare is the “only game in town” for care delivery in some of the regions it serves, said Dr. Jennifer Cortes, quality and population health medical officer at the federally qualified health center. The provider operates 13 primary care clinics and two mobile units, caring for 55,000 people across rural Northern Arizona.
Some communities are very remote — meaning patients may be forced to travel hours to reach specialty care — which makes recruiting providers a challenge, Cortes said.
That’s one area artificial intelligence could help. Adopting an AI scribe, which typically records providers’ conversations with patients and drafts clinical documentation, could alleviate some of clinicians’ administrative work and decrease burnout, she said.
“When ChatGPT first came out, I was like, ‘Oh my God, this might make things so much better for those of us working in this field,’” Cortes said. “I just want my job to not be so challenging all the time. It would be amazing if this works.”
But taking on an AI project isn’t easy for a safety-net provider. The technology can be labor-intensive to implement, requiring technical expertise and oversight capabilities that many cannot easily access, experts say.
And if health systems with the fewest resources — that often care for the most medically complex patients — aren’t able to realize the benefits of AI, they could fall even further behind larger or more affluent providers.
“If you look at the kinds of health systems that are actively deploying AI right now, those that can afford it are the ones that are more aggressively pursuing it,” said Brian Anderson, CEO of the Coalition for Health AI, an industry group developing guidelines for responsible AI use in healthcare. “Those that are in rural communities, for example, that don’t have the IT staff to deploy and configure different kinds of AI tools aren’t able to do that. That’s an example of the digital divide already being reinforced in the AI space.”
‘A ton of human effort’
Safely implementing AI products at health systems can require specialized human labor and technology resources, creating significant barriers for cash-strapped safety-net providers, experts say.
“People tend to talk about it or conceptualize it like you’re turning a light switch on,” said Paige Nong, assistant professor at the University of Minnesota School of Public Health. “It’s actually not that simple. These tools and these systems require a ton of human effort.”
Safety-net providers tend to operate on slim margins, given their heavier reliance on Medicaid — a pressing challenge as the insurance program faces federal funding cuts — and higher uncompensated care demands.
For example, the net margin at community health centers, which provide primary care to underserved populations, was just 1.6% in 2023, according to health policy research firm KFF. That fell from 4.5% in 2022, driven by inflation and the expiration of pandemic-era funding.
Many community health centers also face workforce concerns, with more than 70% reporting a primary care physician, nurse or mental health professional shortage last year, according to the Commonwealth Fund. Meanwhile, labor costs are a significant expense for many providers.
And implementing AI will take plenty of work to manage. For example, health systems will need to set up AI governance structures that can evaluate products for safety and efficacy as well as maintain regulatory and legal compliance. Plus, providers should continuously monitor their AI tools, because the assumptions underlying a model, like patient characteristics, can change over time, potentially degrading its performance, experts say.
“It’s obvious when a scalpel gets rusty, you know you need to replace it or clean it,” CHAI’s Anderson said. “With a lot of these AI tools, we don’t necessarily know that yet. So how health systems can afford to do the kind of monitoring and management of these models over time is a real concern.”
Tech support
Additionally, providers will need IT staff with the technical expertise to handle the work needed to adopt AI tools, a particular challenge for cash-strapped and rural facilities that may struggle to attract the talent, experts say.
For example, many safety-net providers don’t have data scientists on staff, said Mark Sendak, population health and data science lead at the Duke Institute for Health Innovation. It likely doesn’t make sense for some of these care settings to employ them either, given that data scientists may command high salaries while not generating revenue from patient care, he added.
Financial challenges can also make it hard to invest in the IT infrastructure needed to adopt AI tools, said Jennifer Stoll, chief external affairs officer at health IT provider and consultancy OCHIN.
“Compounding the challenges for community health organizations is that many have no choice but to rely on outdated, inefficient technology systems,” she said via email. “Not only are these outdated systems all they have access to, but some are not even capable of integrating AI tools — only exacerbating the technology divide.”
Meanwhile, AI might be at the bottom of an IT team’s to-do list. For example, at North Country, Wi-Fi at the clinics doesn’t always work well. And its legacy electronic health record won’t be supported by its vendor in a few years, so the provider will have to transition to a new one.
“We’re missing even the basics,” Cortes said. “Even if you’re not talking about AI, we’re behind.”
Missing out on AI
These constraints are likely already impacting how low-resource providers implement AI, experts say.
For example, a study published early this year in Health Affairs found 61% of U.S. hospitals that used predictive AI models had evaluated them for accuracy using their own data, and just 44% locally evaluated their tools for bias — an important process to help health systems determine whether a tool will work well within their patient populations.
Hospitals that developed their own predictive models, reported high operating margins and were part of a health system were more likely to locally evaluate their AI products.
“You’ve got the resources, you’ve got an IT staff, you’ve got some data scientists who can design models or who can evaluate models from an EHR vendor,” said the University of Minnesota’s Nong, one of the study’s authors. “Resources were the critical necessary component for being able to really consistently conduct evaluation and also to design bespoke models.”
Without assistance, providers who lack the funds and technical capabilities to implement AI could miss out on the technology’s potential benefits or adopt AI without necessary safeguards. That could widen existing disparities between high- and low-resource providers and their patients, experts say.
For example, hiring and retaining staff is already a challenge for safety-net settings, Sendak said. When a new graduate from residency looks for a primary care job, would they rather take a position at a health system that could offer an AI documentation assistant — which could help alleviate provider burnout — or a clinic that’s unable to implement one?
Additionally, limiting AI uptake to the most well-resourced providers could reinforce biases inadvertently included in these tools, CHAI’s Anderson said. If the data used to train algorithms continues to be collected mainly in urban, highly educated communities on the coasts, AI tools will miss information gathered from other groups.
“I think our job as a society is to make sure that we are making that as easy as possible, so that we have AI that can serve the individual in rural Appalachia or in a farming community in Kansas just as well as it serves people in San Francisco or people in Boston,” he said.
Help wanted
Still, there are methods that could help small and low-resource providers adopt AI products — including mentorship and support models that have been used for other emerging technologies, experts say.
For example, the HITECH Act, enacted in 2009 to promote the use of EHRs, included funding for Regional Extension Centers, which provided on-the-ground technical assistance for small primary care practices, community health centers and critical access hospitals.
Similarly, the Health Resources and Services Administration funds Telehealth Resource Centers, a group of 12 regional and two national centers that offer education and resources to providers looking to implement virtual care.
EHR vendors have a role to play as well, Nong said. Nearly 80% of the hospitals in her Health Affairs study used predictive models obtained through their EHR developer, so that could be an impactful point for helping providers safely deploy models, she added.
Larger health systems and academic medical centers could assist their small and resource-strapped counterparts too. The Health AI Partnership, a coalition that includes health systems like Duke Health and the Mayo Clinic, runs the Practice Network, which works with safety-net settings on adopting AI best practices with one-on-one support.
North Country is one of the inaugural participants in the network. The program provides technical support to safety-net organizations, helping them work through AI procurement, evaluation and implementation, said Duke’s Sendak, who serves on HAIP’s leadership council.
Still, the Practice Network currently works with five safety-net providers, while there are hundreds of federally qualified health centers across the country that may need support, he added.
“There is a big difference between where all the conversation is about AI in healthcare and where a lot of the people who are delivering healthcare are in terms of ability to adopt and use it safely,” North Country’s Cortes said. “When they’re talking about all these huge investments going into AI, great, that’s exciting, but how are they going to bring us along too?”