- An artificial intelligence-based virtual scribe showed promise as a documentation tool but still required work to verify information and make corrections for accuracy, according to a study by researchers at the Rothman Orthopaedic Institute in Philadelphia.
- The study, presented at the 2023 American Academy of Orthopaedic Surgeons Annual Meeting, compared an AI scribe with human medical scribes, a transcription service consisting of a dictaphone, and a voice recognition mobile application (VRM) that is part of electronic medical record platforms.
- Although the AI scribe understood most verbal commands and handled most medical documentation, it was at times unable to form a medical plan the way human scribes could.
Rothman’s research was aimed at addressing the clerical burden that contributes to physician burnout. Although electronic health records help physicians track patients over time, the extra documentation work also creates stress for clinicians.
That stress is compounded by burnout among healthcare workers stemming from the COVID-19 pandemic and staffing shortages. In fact, 28% of physicians said they were likely to leave their practice because EHRs or other IT tools hurt their efficiency, according to KLAS Research.
Three orthopedic hand surgeons evaluated 10 standardized patients with prewritten clinical vignettes for the study, titled “Use of Artificial Intelligence for Documentation in Orthopaedic Hand Surgery.”
The researchers carried out 118 clinical encounters: 30 documented with the AI scribe, 30 with the VRM, 28 with the transcription service and 30 with medical scribes. They first documented encounters using both the AI scribe and a medical scribe, then repeated the process with the VRM and the transcription service.
Researchers assigned each note a letter grade based on an eight-point scoring system. An attorney then reviewed the notes for medicolegal risk.
The AI was found to be inadequate at forming medical plans; it required a verbalized narrative and needed verification and correction.
“Correction time of the AI transcription was notable,” Michael Rivlin, orthopedic surgeon at the Rothman Institute and associate professor at Thomas Jefferson University in Philadelphia, told Healthcare Dive. “However, we do not have enough data to fully quantify it and will be a focus of future studies. It was usable compared to a human scribe.”
However, the auto-populated AI notes were generated faster than those from the VRM and the transcription service, which took 3.48 minutes and 3.22 minutes, respectively.
As part of the test, physicians attempted to distract the AI by interrupting with irrelevant stories about a friend’s experience, or with a parent and a minor both sharing their thoughts, according to Rivlin.
So what will it take to make AI scribes more effective at helping physicians? Rivlin said it will require training the AI as if it were a child or a pet, using “trigger words” along with other positive and negative data.
“For example, if a normal exam is mentioned and the AI documents a normal hand exam and the provider says, ‘The exam was normal, but one finger was missing,’ it should ‘learn’ to change the normal exam to 9 fingers were examined rather than 10,” Rivlin said.
The AI will continue to improve as the technology advances, and it already offers physicians an option to weigh against the other documentation methods, according to Rivlin. In the future, physicians will be looking for a tool that is even more fit for purpose and requires less manual intervention, Rivlin said.
“Next steps for AI are continued improvement to create a more ‘turnkey’ service and increase reliance on AI with less user input,” Rivlin said.