by adgrooms on June 21, 2019

Providers need the ability to use their hands. Unfortunately, the current state of affairs requires more clicking and documenting than assessing and healing with touch. New technology on the horizon may free providers' hands by offering new ways to navigate clinical systems.

Eye tracking uses light and a sensor to track where a user is looking and for how long. Ideally (see what I did there?), eye tracking could serve as the control mechanism for the system. Glance to the left to go back. A prolonged look at a graphic enlarges it. Looking away dismisses it. A prolonged look at a control selects it. Meanwhile, EHRs could use eye tracking to improve efficiency and interface design by observing what physicians actually use and what distracts them within specific work contexts. Learning from eye tracking could help display only what is needed, cutting out the clutter.
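To make the "prolonged look" idea concrete, here is a minimal sketch of dwell-based gaze selection. Everything in it, the region names, the dwell threshold, and the gaze-sample format, is an illustrative assumption rather than the API of any particular eye-tracking SDK.

```python
# Sketch: fire a "select" action when the gaze rests on a screen region
# longer than a dwell threshold. Regions and samples are hypothetical.
from dataclasses import dataclass


@dataclass
class Region:
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, x: float, y: float) -> bool:
        return (self.x <= x <= self.x + self.width
                and self.y <= y <= self.y + self.height)


class DwellSelector:
    def __init__(self, regions, dwell_seconds=0.8):
        self.regions = regions
        self.dwell_seconds = dwell_seconds
        self.current = None        # region the gaze is currently resting on
        self.dwell_start = None    # timestamp when that dwell began

    def on_gaze_sample(self, x, y, timestamp):
        """Feed one gaze sample; return a region name when a dwell completes."""
        region = next((r for r in self.regions if r.contains(x, y)), None)
        if region is not self.current:
            self.current = region
            self.dwell_start = timestamp
            return None
        if region and timestamp - self.dwell_start >= self.dwell_seconds:
            self.dwell_start = timestamp  # reset so we don't re-fire every sample
            return region.name
        return None


# Example: a prolonged look at the (hypothetical) vitals chart selects it.
regions = [Region("back_button", 0, 0, 100, 50),
           Region("vitals_chart", 200, 100, 400, 300)]
selector = DwellSelector(regions)
for t in (0.0, 0.3, 0.6, 0.9):
    action = selector.on_gaze_sample(320, 220, t)
    if action:
        print(f"dwell select: {action}")  # -> dwell select: vitals_chart
```

The same pattern could map a glance toward a "back" region to navigation, or log which regions hold attention for interface-design research.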

Voice control is widely available for home use and is starting to make a clinical appearance in EHRs for documentation. Speech recognition can allow a physician to speak notes instead of typing them. This may speed up the initial dictation, but errors have been found to be more prevalent in speech-recognized notes, which requires time for review. To lessen the administrative burden on physicians, medical transcriptionists could review and finalize the notes.
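Here is a minimal sketch of that review workflow: the raw speech-recognition draft sits in a queue until a transcriptionist corrects and finalizes it. The note fields and statuses are assumptions for illustration, not any real EHR's schema.

```python
# Sketch: dictated drafts are held for transcriptionist review before they
# become part of the record. Field names and statuses are hypothetical.
from dataclasses import dataclass
from typing import List


@dataclass
class DictatedNote:
    patient_id: str
    draft_text: str               # raw speech-recognition output, may contain errors
    status: str = "pending_review"
    final_text: str = ""


class TranscriptionQueue:
    def __init__(self):
        self.notes: List[DictatedNote] = []

    def submit_draft(self, patient_id: str, draft_text: str) -> DictatedNote:
        """Physician dictates; the raw transcript enters the review queue."""
        note = DictatedNote(patient_id, draft_text)
        self.notes.append(note)
        return note

    def finalize(self, note: DictatedNote, corrected_text: str) -> None:
        """Transcriptionist reviews, corrects recognition errors, and signs off."""
        note.final_text = corrected_text
        note.status = "finalized"


queue = TranscriptionQueue()
note = queue.submit_draft("pt-001", "Patient reports two weaks of cough.")
queue.finalize(note, "Patient reports two weeks of cough.")
print(note.status, "-", note.final_text)
```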

Motion control has been used in the operating room for several years now. Using it in EHRs could be beneficial because it removes the need to touch a keyboard or mouse. Reducing or removing contact with computer controls can reduce the risk of spreading infections. Implemented well, with consideration given to simplifying the interface, it could also speed up navigation and save time for physicians.
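One simple way to picture touch-free navigation is a mapping from recognized gestures to EHR commands. The gesture names and commands below are purely illustrative assumptions; how the motion sensor recognizes gestures is out of scope here.

```python
# Sketch: dispatch recognized gestures to navigation commands.
# Gesture names and commands are hypothetical examples.
def go_back():
    print("navigating back")


def next_image():
    print("showing next image in the study")


def zoom_in():
    print("zooming into the current image")


# Keeping the mapping in one place keeps the interface small and reviewable.
GESTURE_COMMANDS = {
    "swipe_left": go_back,
    "swipe_right": next_image,
    "push_forward": zoom_in,
}


def handle_gesture(gesture: str) -> None:
    """Run the navigation command for a recognized gesture, if any."""
    command = GESTURE_COMMANDS.get(gesture)
    if command:
        command()
    else:
        print(f"ignoring unrecognized gesture: {gesture}")


handle_gesture("swipe_right")   # -> showing next image in the study
handle_gesture("wave")          # -> ignoring unrecognized gesture: wave
```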

These hands-free technologies could help take EHRs to a more usable level while reducing the risk of cross-contamination. There is great potential for expanded use and adoption as we look for innovative ways to improve EHRs.