Physicians Need to Get Involved in Usability Testing to Help Vendors Improve EHRs
Technology Feature — September 2015
Tex Med. 2015;111(9):39-43.
By Joey Berlin
If electronic health records (EHRs) are going to be more usable — and less frustrating — than physicians find them today, EHR vendors will need to shore up their development processes, a multiyear research project at The University of Texas found.
But the future of improving EHR usability doesn't lie solely at the feet of the vendors, researchers found; they'll need help from other stakeholders. That includes physicians, who need to be involved in EHR development before they become the perplexed end users who gnash their teeth and smack their monitors every time an EHR doesn't cooperate.
The University of Texas Health Science Center at Houston (UTHealth) led a four-plus-year effort to study EHRs' usability and workflow and propose better ways of designing EHR systems. An Office of the National Coordinator for Health Information Technology (ONC) grant funded the project, a product of the UT School of Biomedical Informatics' National Center for Cognitive Informatics and Decision Making in Healthcare (NCCD).
The project wrapped up last November, and researchers chronicled the results of their work in a book titled Better EHR: Usability, Workflow and Cognitive Support in Electronic Health Records and online.
Some of the research centered on testing EHR systems for usability and interviewing EHR vendors about their capabilities to build systems tailored to the needs of the user. The usability testing of commercial EHRs largely confirmed what many physicians report from their own experience: EHR systems are often difficult to use and hinder patient care in a clinical setting.
"What we learned was that it's not only the vendors' fault in terms of why these EHRs are not usable," said Muhammad Walji, PhD, associate dean for technology services and informatics at the UTHealth School of Dentistry and a primary researcher on the project. "It's kind of a shared responsibility that we have — the vendors, the researchers like us, as well as the people who are using it. And if we can all work together, we may be able to come up with better products."
In 2010, ONC provided $15 million over four years to fund a group of four Strategic Health Information Technology Advanced Research Projects, including a study of patient-centered cognitive support, known as SHARPC. The SHARPC project's primary goal was to address usability, workflow, and cognitive support problems with EHRs.
The EHR usability and workflow component was one of several subprojects.
Jiajie Zhang, PhD, SHARPC principal investigator and dean of the UT School of Biomedical Informatics, says that at one point during its lifespan, SHARPC involved approximately 100 people working at 12 different institutions. The researchers made sure physicians were an integral part of the process.
"They were the main investigators or supporting investigators or consultants in each of the projects that we had," Dr. Walji said. "So they were fully part of the team. Many of them participated in giving feedback as well as [being] participants in the research itself."
Kevin Hwang, MD, an associate professor of medicine at UT Medical School at Houston, says he took part in some of the project testing involving entry of prescription information into an EHR, such as medication reconciliation or change of dose. Researchers asked him to talk and think out loud as he reacted to performing different EHR processes.
"Then they tracked my movements, the mouse movements, and my eye movements with a camera," he said. "One of the things they were trying to look at was if there were wasted movements or if I was having to go back and forth between different screens or things like that."
Through SHARPC, researchers discovered where potential problems lie for physicians struggling with commercial EHRs.
For example, the researchers used an analytical process known as rapid usability assessment (RUA) to inspect and evaluate EHR systems and to identify challenges to each EHR's usability. The Better EHR book details the results from evaluating five commercial EHR systems. (See "Measuring Usability.")
RUA involved a three-step process. First, researchers selected 12 EHR clinical tasks to evaluate, such as clinical summary, demographics, vital signs, and e-prescribing. The researchers then used a task analysis method known as the keystroke level model (KLM) to predict each EHR system's performance, in the form of a completion time for each of the tasks, also known as "use cases." As Better EHR explains, "KLM predicts the time it takes for an expert … to execute keyboard and mouse inputs along with the associated cognitive overheads (e.g., thinking time or time taken to visually acquire objects on the screen)."
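A KLM prediction is essentially the sum of standard per-operator times over the sequence of actions a task requires. The sketch below illustrates the idea in Python; the operator durations are the commonly cited KLM values from the human-computer interaction literature, and the e-prescribing step breakdown is a hypothetical example, not taken from the SHARPC study.

```python
# Minimal sketch of a keystroke level model (KLM) time prediction.
# Operator durations are commonly cited KLM values; the task breakdown
# below is a hypothetical example, not drawn from the SHARPC data.

# Standard KLM operator durations, in seconds
OPERATOR_TIMES = {
    "K": 0.20,  # keystroke or button press (average-skilled typist)
    "P": 1.10,  # point the mouse at a target on screen
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation (thinking, visual search)
}

def predict_time(operators):
    """Sum operator durations to predict an expert's task completion time."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical e-prescribing step: point to the drug field (P), mentally
# choose the drug (M), type a 10-character drug name (10 K), then point
# to and click the dose dropdown (P, K).
task = ["P", "M"] + ["K"] * 10 + ["P", "K"]
predicted = predict_time(task)
print(f"Predicted expert time: {predicted:.2f} s")

# Comparing a user's observed time against the KLM prediction is what
# flags usability problems, as Dr. Zhang describes below.
observed = 300.0  # e.g., five minutes measured during testing
print(f"Observed / predicted ratio: {observed / predicted:.1f}x")
```

A large gap between observed and predicted times is the signal researchers used: the model gives the theoretical optimum, and testing reveals how far real use falls from it.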
"If everything can be tracked, now we can evaluate the actual performance of a user [versus] the theoretical, optimal performance," Dr. Zhang said. "Now we can say, 'Wow, you're actually far away from what it could potentially do. So this task can be done in 30 seconds, and you're spending five minutes — something's not right.'"
The last step in RUA was an expert review process, which involved evaluating each system using seven design principles, known as heuristics. Examples of those principles include:
- Consistency, which measures whether the EHR product consistently presents information the same way and requires consistent navigation methods;
- Feedback and error, which measures how well the product provides the user with feedback about the actions the user performs and how well the system prevents errors from occurring; and
- Undo, which evaluates whether the system allows users to reverse their actions and correct errors.
Researchers ranked each problem they encountered using a four-point scale. They assigned a ranking of one for a cosmetic issue, two for a minor usability violation, three for a major violation, and four for a catastrophic violation. (See "Usability Severity Rankings.")
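A tally like the one sketched below shows how expert-review findings could be aggregated per task into problem counts and mean severities, the form the results take later in the article. The individual findings listed here are invented examples; only the four-point scale (1 cosmetic, 2 minor, 3 major, 4 catastrophic) comes from the study.

```python
# Hedged sketch of aggregating heuristic-review findings by clinical task.
# The findings are hypothetical; the four-point severity scale follows
# the article (1 cosmetic, 2 minor, 3 major, 4 catastrophic).

from collections import defaultdict
from statistics import mean

# Hypothetical findings: (clinical task, heuristic violated, severity 1-4)
findings = [
    ("CPOE", "consistency", 3),
    ("CPOE", "feedback and error", 4),
    ("medication list", "undo", 2),
    ("BMI", "consistency", 1),
]

# Group severities under each task
by_task = defaultdict(list)
for task, heuristic, severity in findings:
    by_task[task].append(severity)

# Report problem count and mean severity per task
for task, severities in by_task.items():
    print(f"{task}: {len(severities)} problems, "
          f"mean severity {mean(severities):.1f}")
```

Summaries in this shape, a problem count and a mean severity per task, are what the study reports for the five commercial systems below.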
The Better EHR book notes that researchers could perform just six of the clinical tasks in all five EHR systems, and that only two of the EHR systems could perform all 12. Vendors received anonymity in exchange for their participation in the project.
The completion-time analysis showed that among the use cases, clinical summary took the longest time on average to complete at 338 seconds, or about 5.6 minutes. The computerized physician order entry (CPOE) task took the next-longest time, with a mean of 326 seconds. The body mass index (BMI) task took the least time on average, with a mean of 16 seconds.
Across the five EHRs, the expert review portion identified 1,135 usability problems. Out of the 12 clinical tasks, the CPOE task produced the most usability problems, with a mean of 58 per system. Clinical summary was second with a mean of 53. The BMI task had the fewest, with a mean of nine usability problems per system.
Problem list and medication list tied for the highest average usability problem severity rating at 2.6. The growth chart task and BMI tasks had the lowest mean severity ratings at 2.3.
Ultimately, the project's evaluation of the EHR products' collective usability was consistent with a previous Institute of Medicine (IOM) report on patient safety and health information technology, researchers note. The IOM report cited "poor interface design, poor workflow and complex data interfaces" as serious threats to patient safety in a clinical environment.
"Time(s) for experts to perform meaningful, use-related tasks in existing EHRs were high," researchers wrote in Better EHR. "These times are predictive of errors in routine performance and would likely be higher in actual clinical practice. Users face numerous usability problems as they use systems in real-world clinical practice. Poor usability is a critical challenge limiting the adoption and safe use of EHRs."
Dr. Walji notes that by the end of the RUA process, researchers gave participating vendors a list of usability problems and benchmark times by which they could measure and improve their own systems.
"All the systems had some violations of good design," said Amy Franklin, PhD, an assistant professor at UTHealth and a project researcher. "So making sure that usability is a priority, usability is part of the workflow in the development, the creation, and maintenance of the system along the design of new components, is [something vendors] could all do."
What Vendors Say
The project also included interviews with 11 EHR vendors to better understand to what degree — if any — the vendors practiced user-centered design (UCD). As the name implies, in a UCD approach to designing a product or service, the end user is at the center of the design process, which results in a product that accommodates users, rather than forcing them to adapt to the system.
The vendors varied widely in size: the smallest had about 10 employees, while three employed more than 6,000 people and generated revenues of more than $1 billion each. The research team visited each vendor and conducted interviews about its UCD processes.
Interviewers found the vendors fell into one of three categories of UCD refinement:
- Well-developed UCD: Vendors have a refined UCD process and an extensive staff devoted to usability;
- Basic UCD: Vendors understand the importance of UCD and are working toward UCD processes but face resource constraints and employ few usability experts;
- Misconceptions of UCD: Vendors have no UCD process in place, generally misunderstand the concept of UCD, and generally have no usability experts on staff.
"What we were finding … is that many of these vendors didn't really know about how to design usable software," Dr. Walji said. "Oftentimes, they were designed by computer people. Sometimes they would get clinician feedback on it, but … that's not really enough. You really need to have usability designers in the mix, as well. That's why we said it's really a shared responsibility. We need to get all these stakeholders in place."
Midway through the project, new ONC requirements for vendor certification gave researchers hope for better development and testing on the vendor end. Federal rule revisions the Department of Health and Human Services passed in 2012 now require EHR vendors to perform a process known as safety-enhanced design to obtain certification.
Under the current requirements, vendors must use a formal UCD process during development of their EHR systems and perform usability testing with "real" users, or physicians who would use the system in a real-world setting. On top of that, the vendors have to make a report of their testing publicly available to earn certification.
Dr. Walji says the requirement for vendors to publicize their results was a positive sign for the SHARPC researchers. But once vendors started posting their reports, he said, it turned out many of the reports "weren't particularly useful, in the sense that they didn't really disclose the information they were supposed to. And the certification bodies seemed to let them get away with that."
"Some of the vendors did [the reporting] very well, and some of them had a very cursory statement that they had actually conducted this testing, and that was it," Dr. Walji said. "So I'd suggest that they probably didn't want to disclose that information and tried to do the absolute minimum without being forced to."
Houston internist Hardeep Singh, MD, briefly participated in one of the three other SHARPC subprojects outside of the EHR usability and workflow research, but became familiar with the work researchers were doing on the EHR effort. He says the ONC requirements for safety-enhanced design may need to become more stringent and hold everyone to a better standard for testing.
Dr. Singh notes that vendors mentioned they had a hard time recruiting physicians to help them perform the required testing. He says he can vouch for that problem and wonders how vendors could make the testing worth doctors' time.
"Physicians are stretched from every direction," Dr. Singh said. "Now, if you want physicians to contribute to usability testing or user-centered design principles, we would need to somehow incentivize or pay them."
The Physician's Role
During the course of the project, SHARPC researchers developed a number of products and tools they believe will help willing vendors build more usable EHRs. The SHARPC website and the Better EHR book detail the fruits of the group's work.
"In the future, I think there really are no excuses anymore to suggest that EHRs should not be usable," Dr. Walji said. "I think we know how to make EHRs usable. It's certainly not a trivial task. But there are processes in place, and there are ways to do this, techniques to do these things. And hopefully for the vendors, they will … take this up and move this whole thing forward."
But Dr. Singh cautions SHARPC was merely a starting point.
"SHARPC may have uncovered a few things that are important, but this is a long-term sort of endeavor, and we're going to keep finding more stuff to fix. EHR use is a complex sociotechnical process, and usability extends beyond just interacting with a computer screen. I think the more we look, the more issues we're going to uncover," Dr. Singh said.
Dr. Singh agrees with Dr. Walji's assessment that physicians are part of the "shared responsibility" in building better EHR systems. Dr. Singh, a patient safety researcher at the Michael E. DeBakey Veterans Affairs Medical Center and Baylor College of Medicine, helped develop the Safety Assurance Factors for EHR Resilience (SAFER) guides, which ONC released in early 2014 to help health care organizations identify and mitigate EHR-related patient safety risks. (See "Playing It SAFER," June 2014 Texas Medicine, pages 35-38.)
"We need to create better bridges between the physician community, the vendor community, and the IT community in general so we can actually have better participation in this process," Dr. Singh said. "Physicians should adopt leadership roles related to the EHR, rather than just being the end users and get more empowered to say, 'OK, this is what we need to do, and this is how we will do it.'"
Joey Berlin can be reached by phone at (800) 880-1300, ext. 1393, or (512) 370-1393; by fax at (512) 370-1629; or by email.
Usability Severity Rankings
SHARPC researchers who performed the rapid usability assessment ranked the usability problems they found in each electronic health record system on a severity scale from one to four. Here are examples of each type of violation:
- One (cosmetic issue): minor typographical or spelling error.
- Two (minor violation): cluttered user interface screen that presents too much information at once.
- Three (major violation): inappropriate use of autofill functionality or defaults, such as when the system automatically enters corresponding parameters once the user enters drug information.
- Four (catastrophic violation): a display box that isn't wide enough to show the full name of a prescription medication, causing potential confusion over drugs with similar names.