EHR error rates vary widely across systems, JAMIA study finds
The amount of time it takes for providers to complete EHR tasks varies widely across health IT systems, and so does the error rate and amount of work required, according to a recent study in the Journal of the American Medical Informatics Association.
The study collected keystroke, mouse click and video data from two different EHR vendors, Epic and Cerner, across four healthcare systems. Between 12 and 15 emergency medicine physicians participated from each site, completing six EHR ordering scenarios: two diagnostic imaging, two laboratory and two medication tasks.
Error rates varied by task but reached as high as 50%. For certain tasks, there was an eightfold difference in clicks and a ninefold difference in time.
The wide variability across EHR vendors, tasks and healthcare systems highlights the need for improved standardization across systems and better implementation practices — especially since these factors are critical to the safety of the product and the patient.
Many fast decisions are made during the rollout of an EHR, but it can take several months to a year and a half to complete. That affects usability — the efficiency and effectiveness of the technology in the hands of a clinical user. Problems with usability in areas such as diagnostics or medication can contribute to physician burnout and patient dissatisfaction, along with increased risk to quality of treatment.
The study showed errors across three sectors: imaging, labs and medication. It cited usability challenges such as “screen displays that have confusing layouts and extraneous information, workflow sequences that are redundant and burdensome, and alerts that interrupt workflow with irrelevant information” as reasons for the errors.
Study author Raj Ratwani, scientific director at MedStar Health's National Center for Human Factors in Healthcare, told Healthcare Dive that, ultimately, “what’s contributing to these types of errors is the way the system is designed, developed and implemented.”
During implementation, providers make a number of decisions about how they want their health IT system set up, though they may not have the time or knowledge to make those decisions. This can result in “potentially dangerous” flaws in a system that may not meet a provider’s workflow needs, Ratwani said.
Ratwani, who is also an assistant professor of emergency medicine at Georgetown University, pointed out a second challenge: vendor organizations’ inability to clearly communicate the best implementation practices for a specific healthcare organization, as well as sacrificing usability to the demands of a (potentially uninformed) customer.
A confluence of these challenges among the hospitals’ Epic and Cerner systems likely contributed to the high error rate, the study concluded. That has far-reaching consequences, as the two companies together make up more than 50% of EHRs within U.S. hospital systems.
The Office of the National Coordinator for Health Information Technology put requirements in place in 2010 to promote EHR usability and has continued to expand the program. Vendors must attest they are putting the needs of end users at the forefront of software development, and third-party certification bodies must approve their products. Still, usability challenges persist.
In a 2015 research letter, Ratwani found a lack of adherence to ONC certification requirements and usability testing standards among several widely used — and certified — EHR products. Other external factors can shape the final manifestation of the EHR. During local site implementation, for example, configuration and customization affect layout and information accessibility and account for some of the variability across healthcare systems.
The latest study’s authors suggest stakeholders should consider basic performance standards for all implemented EHRs to ensure usable and safe systems, and Ratwani stressed the need for follow-through.
“Many vendors [have] their products certified as having met that requirement” even if they don’t, he said. “The ONC could modify that to say: Show us evidence of your user-centered design process. And if vendors are doing this, showing evidence is easy because if you’re doing it there are natural byproducts of it.”
All of the health information products examined in the study were usability tested by either Cerner or Epic and were certified by the ONC’s accrediting bodies.
Similarly, Ratwani believes that “rethinking [physician] training is going to be important.” That includes revamping the copious training physicians are bombarded with in a clinical setting, especially because clinicians sometimes get their health IT training on a product that “doesn’t even resemble the real product they’re going to be using.”
Ratwani urges that multiple stakeholders be involved, noting that solutions can come from both vendors and providers.