Develop an EMR/EHR: Simple Interfaces for Complex Functionalities

Posted December 4, 2012 by admin
EMR/EHR Software Development

Complexities of EMR/EHR

EMRs are complex systems.  They are used by time-pressured, frequently interrupted users, many of whose workflows are (appropriately) too distinct for effective automation.

Their goal is to convert medical records from paper charts to digital electronic charts and to enhance the flow of information about patients to everyone involved in their care.

The physician’s workflow in an office practice is very different from the workflow in a hospital, so no single EMR fits every setting in our present environment.  The issue of trust in handling records between primary stakeholders and facilitator stakeholders also represents a barrier to adoption.

Most modern-day EMR solutions are capable of performing complex functionalities such as One-Click Patient Search, Data Concurrency, Electronic Prescription, Labs Integration using the HL7 interface, Claims Processing using EDI transactions, Appointment Scheduling, Medical Billing, etc.
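To make the HL7 labs-integration point concrete, here is a minimal sketch of pulling lab results out of an HL7 v2 ORU^R01 message. HL7 v2 messages are carriage-return-separated segments with pipe-delimited fields; the sample message below is invented for illustration, and a production interface would use a full HL7 library and handle escaping, field repetitions, and acknowledgements.

```python
# Minimal sketch: extract lab results from OBX segments of an HL7 v2 ORU message.
# The sample message is illustrative, not from any real system.

SAMPLE_ORU = "\r".join([
    "MSH|^~\\&|LAB|HOSP|EMR|CLINIC|202301011200||ORU^R01|MSG0001|P|2.3",
    "PID|1||12345^^^HOSP||DOE^JANE",
    "OBX|1|NM|GLU^Glucose||98|mg/dL|70-110|N|||F",
    "OBX|2|NM|K^Potassium||4.1|mmol/L|3.5-5.1|N|||F",
])

def parse_lab_results(message: str):
    """Return (test name, value, units) tuples from OBX segments."""
    results = []
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":
            # OBX-3 is the observation identifier (code^name),
            # OBX-5 the observed value, OBX-6 the units.
            name = fields[3].split("^")[-1]
            results.append((name, fields[5], fields[6]))
    return results
```

Parsing the sample message yields the glucose and potassium results with their units, ready to be attached to the patient's chart.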

Simple Interfaces for Complex Functionalities

Physicians want their EMRs to be fast, accurate, and easily customizable. They want them designed to be straightforward to implement, quick to learn, and simple to use, with intuitive, preference-based workflows.

Our discussion here is not focused on typical issues such as text size, color choice, or menu location; guidelines for such characteristics are readily available.  Rather, it centers on the importance of interacting with EMR systems and entering accurate information in an easy and usable way. The system must make accurate data entry as simple as possible at the point of patient service; this is critical to realizing the benefits of EMR systems. Therefore, any discussion about EMR user interface (UI) considerations must include a determination of the target device (desktop, laptop, touch screen, tablet, etc.) based on the user role, the preferences of the users, the tasks they perform, and their work environment.

One can consider an interaction with an EMR system in the following dimensions:

1. User-Role – The application should identify the type of user after login (or some other authentication, such as a thumbprint).  The system should present the user with a screen or dashboard that is specific to the user role (sometimes called a user perspective). For example, if a nurse logs in to the application, the user interface should display functionality that is specific to that role. This creates a simpler, more focused UI. Presenting different UIs based on user role is an advantage.  For example, a physician may perform a variety of sub-tasks with the patient (record family history, write a prescription, etc.), making a more fully featured user interface more applicable.

For nurses taking vital signs, a simpler UI might make more sense.  Users may have multiple roles; they should be able to set their default screen to the role they prefer and also switch to other user perspectives (if they have permission to do so).  If a device is located in an area where per-user identification would be impractical, the screen should be easily toggled between role types.  Login credentials may still be necessary if the user navigates to areas of the application that require authorization for advanced functionality.
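The role-to-dashboard logic described above can be sketched as a simple lookup performed after authentication. The role and widget names here are hypothetical placeholders; a real EMR would load them from its permission system.

```python
# Sketch of role-based dashboard selection after login.
# Role names and widget lists are illustrative assumptions.

ROLE_DASHBOARDS = {
    "physician": ["patient_history", "prescriptions", "orders", "notes"],
    "nurse": ["vital_signs", "patient_search", "medication_admin"],
    "front_office": ["appointment_scheduling", "patient_search", "billing"],
}

def dashboard_for(user_roles, preferred_role=None):
    """Pick the widgets to show: the user's preferred role if they hold it,
    otherwise the first role they are authorized for."""
    role = preferred_role if preferred_role in user_roles else user_roles[0]
    return ROLE_DASHBOARDS[role]
```

A user holding both nurse and physician roles who prefers the physician perspective would land on the physician dashboard, while a nurse-only user gets the simpler vitals-focused screen.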

2. User Preferences – You will likely be in an environment where different physicians prefer different input modes, such as dictation, typing, or handwriting.  The device and user interface must fully support each mode of data input.  For example, a doctor who can type and feels it is important to sit down with a patient during an exam might feel most comfortable using a laptop or a tablet PC in the exam room.  If the physician prefers taking handwritten notes, the device and interface should support handwriting recognition; in this case, a tablet PC might be used and a significant part of the screen should be devoted to the handwriting-recognition functionality.  For a physician who prefers dictation, the user interface and device should fully support capturing an audio file that becomes associated with the patient visit.  In addition, some individuals are becoming more comfortable navigating touch screens because of the proliferation of tablet devices on the market today. In this type of “mixed preference” environment, the user interface must facilitate switching between these various input modalities and present a UI that focuses on the particular input mode in use.
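Switching between input modalities amounts to dispatching the visit note to the handler for the physician's preferred mode. The handler names below are illustrative, not from any specific EMR product; real handlers would invoke speech capture, ink recognition, or a text editor.

```python
# Sketch of dispatching a visit note to the handler for the preferred
# input mode. Handler names and payload shapes are assumptions.

def handle_typed(note):
    return ("text", note)

def handle_dictation(audio_ref):
    return ("audio", audio_ref)

def handle_handwriting(ink_ref):
    return ("ink", ink_ref)

INPUT_HANDLERS = {
    "typing": handle_typed,
    "dictation": handle_dictation,
    "handwriting": handle_handwriting,
}

def capture_note(mode, payload):
    # Fall back to typing for unknown or unset preferences.
    handler = INPUT_HANDLERS.get(mode, handle_typed)
    return handler(payload)
```

The dictionary dispatch keeps the UI layer free of mode-specific branching: adding a new modality means registering one more handler, not editing every screen.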

3. Tasks – There may be different activities involved in the patient care environment, and different devices may be better suited for different tasks.  The associated UIs should facilitate the completion of these tasks.  For example, let’s explore the scenario where a nurse records patient vitals in an exam room.  The nurse is typically standing while taking a pulse, obtaining a blood pressure reading, or recording a temperature.  A device such as a tablet PC or a touch screen would be best suited to record vitals in an EMR system, and the interface on these devices should support a simpler layout where patient data can be quickly and easily entered.  A laptop in this scenario would be significantly more cumbersome, and a PDA might have a UI that is too cramped for good usability. The best bet would probably be a touch screen that swings off the wall and allows the nurse to interact with the system using only one hand.  The touch screen UI must facilitate the most common tasks, such as searching for a patient record and then entering the vitals. A touch screen tablet PC would be a second choice, but the nurse would have to hold the tablet with at least one hand.  The UI should use interface controls that are specific to touch screens (not mouse and keyboard!), such as the finger swipe to scroll up and down and a light tap to select a menu item.  In another scenario with a doctor in the exam room, the device of choice would correspond to the doctor’s preference.  If they prefer to type, a laptop or dedicated exam room computer would work fine.  The user interface should facilitate common tasks such as searching for a patient, looking at an overview of the patient’s history, and entering new information via mouse and keyboard.  The UI in this scenario would obviously be more feature-rich than in the scenario involving the nurse and vital signs.

4. Work Environment – Different environments can include the exam room, doctor’s office, front office, phlebotomy room, etc.  Again, start with the best device for that situation and design the UI to be specific to the role and task. Laptops may be the best fit where end users typically sit down with the patient across from them.  Tablet PCs may be good for users who frequently travel between several different environments but want to use the same device in each.  Touch screen monitors are good for exam rooms where many different types of users need the system but no one wants to carry around a laptop or tablet PC.  The mouse and keyboard are difficult for people to operate in highly mobile situations (i.e., a doctor’s office), so the devices and UIs should provide freedom from mice and keyboards when appropriate.
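The device pairings in points 3 and 4 can be captured as a lookup from role and environment to a recommended device. The entries below simply encode the pairings discussed above and are assumptions for demonstration, not product guidance.

```python
# Illustrative (role, environment) -> device mapping, following the
# pairings described in the text. All entries are assumptions.

DEVICE_BY_CONTEXT = {
    ("nurse", "exam_room"): "wall-mounted touch screen",
    ("physician", "exam_room"): "laptop or tablet PC",
    ("front_office", "front_desk"): "desktop",
    ("phlebotomist", "phlebotomy_room"): "touch screen monitor",
}

def recommend_device(role, environment, default="laptop"):
    """Look up the recommended device, falling back to a sensible default."""
    return DEVICE_BY_CONTEXT.get((role, environment), default)
```

Keeping this mapping in data rather than scattered conditionals makes it easy for each practice to tailor device recommendations to its own rooms and roles.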


Speaking more specifically about the user interface, simple is best.  A plethora of functionality crammed onto each screen will confuse all users, partly because of the 80-20 rule: users may end up using only 20% of the functionality and spending 80% of their time trying to figure it out.  However, the interface should be designed so that advanced users can easily access extended functionality.  It is also important that UI patterns be developed for each device (allowing for some overlap) and that all UI patterns follow the basic principles of usability.

In summary, the system must pick the best device for the specific task and environment, and then display a user interface that is both role- and task-specific.  If an EMR system has a “one size fits all,” inflexible user interface based solely on one criterion (role, task, or device), that is a strong indication that usability was not considered in its design.  If the system has been designed for a generic user and for one type of device (e.g., a laptop), then it’s time to move on to the next vendor quickly!


Chetu, Inc. does not affect the opinion of this article. Any mention of specific names for software, companies or individuals does not constitute an endorsement from either party unless otherwise specified. All case studies were written with the full cooperation, knowledge and participation of the individuals mentioned.

Chetu’s Healthcare team implements solutions for the Healthcare industry. Chetu differentiates itself in providing industry specific expertise combined with its low cost, high productivity model. You can find more information about the portfolio of our Healthcare experience at:

Chetu was incorporated in 2000 and is headquartered in Florida. We deliver World-Class Software Development Solutions serving entrepreneurs to Fortune 500 clients. Our services include process and systems design, package implementation, custom development, business intelligence and reporting, systems integration, as well as testing, maintenance and support. Chetu’s expertise spans across the entire IT spectrum.
