Voice-based documentation, Natural Language Processing (NLP), and Artificial Intelligence (AI) can eliminate many of the problems clinicians run into when using Electronic Health Records (EHR) software.

Machine learning and Artificial Intelligence (AI) are relatively new additions to the healthcare industry, yet many providers have been using pieces of this technology for years without realizing it.

In many diagnostic disciplines, such as pathology and radiology, voice recognition tools are already widely used to document clinical reports and notes in Electronic Medical Records (EMR) software.

Voice-to-Text/Speech-to-Text Applications

Voice-to-text applications are also gaining popularity quickly in individual practices and across primary care. These speech-to-text applications are built on Natural Language Processing (NLP), a common application of machine learning, and their purpose is to turn spoken audio into text.

The same machine learning techniques can also identify useful elements within the resulting text, such as the name of a medication and its dosage.
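As a simplified illustration of that kind of extraction (not the approach of any particular EHR or NLP vendor), a rule-based sketch in Python might look like the following; the pattern and field names are assumptions made purely for demonstration, since production clinical NLP relies on trained models and curated drug vocabularies rather than a single regular expression:

```python
import re

# Hypothetical, simplified extractor for demonstration only.
DOSE_PATTERN = re.compile(
    r"(?P<drug>[A-Za-z]+)\s+(?P<dose>\d+(?:\.\d+)?)\s*(?P<unit>mg|mcg|g|ml)",
    re.IGNORECASE,
)

def extract_medication(transcript: str) -> dict | None:
    """Pull a drug name, dose, and unit out of transcribed dictation."""
    match = DOSE_PATTERN.search(transcript)
    if not match:
        return None
    return {
        "drug": match.group("drug").lower(),
        "dose": float(match.group("dose")),
        "unit": match.group("unit").lower(),
    }

print(extract_medication("Start amoxicillin 500 mg three times daily."))
# {'drug': 'amoxicillin', 'dose': 500.0, 'unit': 'mg'}
```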

For physicians and other healthcare providers across the globe, natural language processing could revolutionize the way they interact with the EMR system.

R. Hal Baker, Senior Vice President of Clinical Information and Chief Information Officer at WellSpan, said, “Voice recognition can help move the EHR from its necessary function as a documentation tool for the business of medicine into a communication tool for the practice of medical care. The two functions are intertwined, but they are also distinctly different. Voice tools and natural language processing make sure that our providers can convey meaning and context using the full breadth of the English language without succumbing to many of the challenges we often see with EHR use.”

No matter how smooth or productive a workflow is designed to be, asking doctors, nurses, and other healthcare providers to handle several complex tasks at the same time always creates problems.

Providers should be able to give patients the best treatment without dividing their attention, while still making full use of the features and applications in the Electronic Health Records (EHR) software.

“Everyone’s capacity for attention is limited; that’s why we tell people not to text and drive,” Baker said. “You simply cannot stay focused on both tasks at once, and one is a lot more critical for safety and getting where you’re going than the other.” He added, “Texting and treating is exactly the same. You’re asking a provider to manage two discordant tasks at the same time. They compete for focus, and as a result, you’re going to miss important parts of both.”

Some might argue that providers simply need to become more competent multitaskers, but Baker disagrees.

Pointing to how executives in other industries handle meetings, he said, “I know very few board chairs or senior executives who try to type notes when they’re running a business meeting; it’s not what they’re in the room to do. It’s the same with healthcare providers – being able to sit at the keyboard and type notes is very rarely what attracted these people to medical school or nursing school.”

Across the country, burnout rates are reaching critical levels. A recent survey found that 83 percent of healthcare organizations are unable to meet patients’ expectations when it comes to documentation and administrative issues.

“Time has turned into the currency of healthcare in the modern era,” Baker said. “Right now, the amount of time spent looking at a screen and clicking a mouse is becoming unsustainable. We need to start employing new strategies to solve the problem.”

Voice recognition tools could be central to that new strategy. “The promise of voice recognition goes beyond dictating clinical notes as a replacement for typing or hand-writing them. In a perfect world, we’ll be able to have a narrative conversation without even thinking about how it’s being recorded,” he stated.

Interaction with the EMR and Practice Management (PM)

Natural language processing tools are not yet refined enough to completely replace traditional interaction with the EMR and Practice Management (PM) software, but Baker strongly believes they are well on their way.

He further asserted that “Products like Alexa, Siri, and Google Home have shown us that voice recognition with AI behind it can do a pretty good job of following verbal instructions. As the industry refines those capabilities, it’s becoming much less of a leap to think that I’ll be able to say, ‘We are going to put Mrs. Smith on 500mg of amoxicillin, four times a day for seven days.  Send that prescription to the Walgreens on Queens Street.’”

Those two sentences are relatively simple, and current virtual assistants may be able to carry out tasks like these if they achieve HIPAA compliance in the coming years.
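To make that idea concrete, the sketch below shows one way a transcribed order like the example above could be parsed into a structured prescription before being routed to a pharmacy. The PrescriptionOrder class, field names, and regular expressions are hypothetical, invented for illustration only; real e-prescribing systems rely on trained language models and standards such as NCPDP SCRIPT rather than hand-written patterns.

```python
import re
from dataclasses import dataclass

@dataclass
class PrescriptionOrder:
    # Hypothetical structure for illustration; not a vendor API.
    patient: str
    drug: str
    dose_mg: int
    frequency_per_day: int
    duration_days: int
    pharmacy: str

WORD_NUMBERS = {"one": 1, "two": 2, "three": 3, "four": 4,
                "five": 5, "six": 6, "seven": 7, "eight": 8}

def parse_order(transcript: str) -> PrescriptionOrder:
    """Very naive parser; assumes the order follows the expected phrasing."""
    patient = re.search(r"put ((?:Mrs?|Ms)\.?\s+\w+)", transcript, re.I).group(1)
    drug_dose = re.search(r"(\d+)\s*mg of (\w+)", transcript, re.I)
    freq = re.search(r"(\w+) times a day", transcript, re.I).group(1).lower()
    duration = re.search(r"for (\w+) days", transcript, re.I).group(1).lower()
    pharmacy = re.search(r"to the (.+?)[.\s]*$", transcript, re.I).group(1)
    return PrescriptionOrder(
        patient=patient,
        drug=drug_dose.group(2),
        dose_mg=int(drug_dose.group(1)),
        frequency_per_day=WORD_NUMBERS[freq],
        duration_days=WORD_NUMBERS[duration],
        pharmacy=pharmacy,
    )

order = parse_order(
    "We are going to put Mrs. Smith on 500mg of amoxicillin, "
    "four times a day for seven days. Send that prescription to "
    "the Walgreens on Queens Street."
)
print(order)
```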

“But if I were to enter that order into the EHR myself, it would take me somewhere between 8 and 15 clicks and several keystrokes. It would be a major benefit to my relationship with my patient if I could simply say that sentence out loud, confirm it with the patient, and continue our conversation without losing my focus on the person in front of me,” Baker said.

Natural Language Processing and Voice Tools

The healthcare industry is still in the early stages of realizing that vision, but natural language processing and voice tools are already making interactions between patients and providers smoother and reducing the burden of working with electronic health record software.

At WellSpan, Nuance voice recognition technology has been paired with the Epic EHR software to support more natural and efficient patient-provider conversations.

“The goal is to do things with the patient, not to them,” Baker said. “I’m a primary care provider by background, and when I dictate my notes in front of the patient, he or she gets to hear what I’m saying and make sure that it’s correct. If I’m wrong, I can just go back and fix the error right there with their confirmation. It’s a much more cooperative approach – not to mention a more efficient one. I can talk to both the record and the patient at the same time, so I don’t have to walk out of the room and recount the entire visit again at some later time. That lets me spend a greater percentage of my time in the patient’s presence.”

WellSpan has extended that cooperative approach by participating in OpenNotes, which lets patients access their entire health record through a patient portal, significantly reducing their frustration.

“We find that being transparent with the patient from the beginning of the documentation process is a significant benefit,” said Baker.  “People feel more invested in their care, and even more confident in their provider and their data because they participated in the process of creating their own record and they have experienced their provider listening to them. Patients have a baseline expectation that they’re being listened to, but there are a lot of situations where that isn’t completely evident.  It’s very clear that they are the provider’s priority when they’re hearing their story repeated back to them.  It’s a much different experience than asking the patient to wait quietly while the provider puts his head down and types for five minutes.”

Patients and providers are not the only ones who benefit from dictation tools. The resulting documentation is reliable, efficient, and of high quality, and it can be put to use more readily in downstream analytics.

According to Baker, when NLP tools are allowed to flag important elements within the text and distill them into structured data formats, providers can interact naturally with the health record while the system still captures information in a usable form.
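For instance, once a drug name, dose, and frequency have been pulled out of the narrative, they can be stored in a standards-based structure for downstream analytics. The sketch below builds a minimal FHIR-style MedicationRequest dictionary; the fields shown are a simplified, illustrative subset rather than a complete, validated resource, and the helper function name is invented for this example.

```python
import json

def to_medication_request(extracted: dict) -> dict:
    """Map extracted narrative elements into a simplified FHIR-style structure."""
    return {
        "resourceType": "MedicationRequest",
        "status": "active",
        "intent": "order",
        "medicationCodeableConcept": {"text": extracted["drug"]},
        "dosageInstruction": [{
            "text": f'{extracted["dose"]} {extracted["unit"]}, '
                    f'{extracted["frequency_per_day"]} times daily',
            "timing": {"repeat": {"frequency": extracted["frequency_per_day"],
                                  "period": 1, "periodUnit": "d"}},
        }],
    }

elements = {"drug": "amoxicillin", "dose": 500, "unit": "mg",
            "frequency_per_day": 4}
print(json.dumps(to_medication_request(elements), indent=2))
```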

Human Medical Scribes

Human medical scribes have been gaining popularity among healthcare providers who want to reduce multitasking without relying on the advanced abilities of virtual assistants.

According to the American College of Medical Scribe Specialists, the profession is going to see immense growth in the near future.

In 2015, about 15,000 scribes were active in hospitals and ambulatory settings. By 2020, the organization expects that number to reach 100,000 as clinics seek more help with documentation.

“Scribes can be a very viable option, especially because humans still have a better ability to interpret subtleties of language than a virtual assistant,” Baker acknowledged. “Scribes have been used effectively in several settings to improve the efficiency of providers, and they can play a valuable role.”

“But I believe the patient-provider dynamic subtly changes when there’s a third person in the room,” he added.  “The sense of confidentiality changes, through no fault of the scribe themselves.  You could think about utilizing a human scribe remotely, through video or audio, but then you are running into new questions of data privacy and security, not to mention infrastructure investment.”

He also noted that scribes are only human, and run the same risk of losing focus as the provider does.

“In contrast to people, computers are eternally vigilant. They don’t accidentally tune out; they don’t think about what’s for lunch.  Computers might make mistakes, but we can go back into the records and look at exactly what the mistake was and why it was made – and we can improve their programming so that they won’t make that mistake again,” said Baker.

“It would be a very different world if we could do that with humans, but we can’t.  So there’s an advantage there to using virtual assistants or ambient computing devices that can take some of the variability out of the equation.”

While there is still work to be done before voice recognition virtual assistants are ready for routine use, NLP tools are already live and are improving providers’ interaction with their electronic health record software.

“Voice has untapped potential to keep improving the provider experience, as well as the patient experience,” he said.

“I believe this is a very good place to be putting the creative energy of healthcare, because provider exhaustion and burnout are affecting nurses, physicians, and just about everyone else involved in care right now.  We need creative solutions, and I firmly believe voice-based tools are going to be a major part of that process.”