Practice guide to evaluating medical app usability

While formal, professionally conducted usability evaluations can provide in-depth comparisons of mobile apps, they may not be practical for small- to medium-sized practices. This article, adapted from a free mHIMSS white paper, outlines some basic steps to help practices evaluate mobile app usability, based on current usability recommendations and best practices.

Whether you have downloaded a free medical app, are comparing similar apps or are thinking about purchasing an upgraded version of an app, the following evaluation process steps will assist you in selecting and adopting the most “usable” app.

1. Consider practice goals and measurable objectives
Consider what goals are most important to your practice when adopting an app, even if just for your own use, and how these goals relate to usability. For example, are you especially concerned about:

  • Improving the efficiency of prescription refill requests?
  • Effectively using patient encounter data for patient education?
  • User satisfaction with a clinical decision support tool?
  • How much training will be required for clinicians to become adept with the app? (Ease of learning.)

Think about simple baseline measures related to your goals. For example, efficiency is typically measured by how long it takes to perform sample tasks during a usability test. Satisfaction and concerns can be assessed with questions included in a post-usability test questionnaire. Ease of learning can be assessed by having users attempt the same scenario or task several times and objectively measuring learning by the number of attempts required, along with measures such as time to completion and success rate. Use your practice goals as a baseline for reviewing and/or comparing the apps of interest.

Depending on the specific nature of effectiveness concerns, targeted questions may be appended to the usability test scenarios (answered after the scenario has been completed) or included in the overall questionnaire to be filled out after the full test has been completed.

2. Check other resources to supplement app market reviews
Keep in mind that many of the publicly available medical app reviews are gathered through informal, non-scientific means. Most reviews and ratings are based on user satisfaction data. Organizations are only just beginning to provide information regarding perceived efficiency and effectiveness of medical apps. Use these reviews to help formulate your questions, but do not allow the reviews to override your own assessment based on your practice goals. Hopefully in the future, a framework for assessing the usability of medical apps will be available, similar to that developed by the National Institute of Standards and Technology (NIST) for assessing the usability of EHRs.

A few resources for medical app reviews:

  • – Physician-curated and -generated reviews with indexed search functions by app type, physician specialty and device platform.
  • iPhone Medical Apps – Reviews focused on iPhone medical applications
  • Happtique – A mobile medical app store developed by healthcare professionals. Currently developing a medical mobile app certification program to verify clinical appropriateness and technical functionality based on quality and performance standards.
  • KLAS – Reviews on "mobile data systems," which KLAS describes as "products that focus on providing physicians with access to census, results and other patient information on a handheld device."
  • Mobihealthnews – News and reviews on the latest medical apps.

Another potential resource is your peers. Check with professional organizations for your specialty to see what they may have to offer. Observe one or two colleagues working with the app you are considering, and bring a checklist of questions or issues that have arisen so far in your usability evaluation that relate to their own work.

Find out what challenges these users have had learning and customizing the app with the following questions:

  • How long did it take them to be fully “up to speed”?
  • What was easy or difficult to learn?
  • What workarounds have they had to develop and why?
  • What tasks do they find frustrating due to inefficiencies in the app or complicated design?
  • What kinds of errors do they find are too easy to make?
  • How does the app increase efficiencies?

3. Assess the app’s usability with typical clinical scenarios
Perform a usability test of the app(s) under consideration. This can be done with fairly simple, non-scientific methods, yet it still provides a structured means of collecting valuable usability feedback.

Create a representative set of clinical scenarios that include the essential and frequent tasks that the app will support. Time permitting, include complex tasks. A complex example: Three different apps are used during a patient assessment -- a decision support app to identify differential diagnoses, an app to review the patient’s recent ECG waveform captures and a medication dosing app to help guide the care plan.

If you are testing the app with multiple providers in your practice, we suggest providing task-specific questions for each participant to answer after completing each test scenario. The questions should focus on efficiency, effectiveness and satisfaction, as well as usability concerns related to each task. The full white paper offers some sample primary care practice scenarios.

A more in-depth review of usability testing is provided in the following NIST publications: Customized Common Industry Format Template for Electronic Health Record Usability Testing (NISTIR 7742) and Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records (NISTIR 7804).

If you are testing the app for your individual practice needs rather than gathering feedback from a group of providers, we suggest completing the Usability Principle Attribute Checklist outlined in the full white paper.

Prepare a post-usability test questionnaire that captures the overall experience with the app. Include a simple rating scale for each question. If gathering feedback from a representative set of clinicians in your practice, schedule each of them to participate in a one-on-one hands-on test of the app after a brief demo.
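One simple way to make the post-test questionnaire comparable across apps is a numeric rating scale with per-question means. The sketch below assumes a 1-to-5 agreement scale; the question wording and ratings are hypothetical:

```python
from statistics import mean

# Hypothetical post-test questionnaire: each question is rated 1 (strongly
# disagree) to 5 (strongly agree) by each participating clinician.
responses = {
    "The app was easy to learn":           [4, 5, 3],
    "I could complete tasks efficiently":  [3, 4, 4],
    "I am satisfied with the app overall": [4, 4, 5],
}

for question, ratings in responses.items():
    # Guard against out-of-range entries before averaging.
    assert all(1 <= r <= 5 for r in ratings), "ratings must be on the 1-5 scale"
    print(f"{question}: mean {mean(ratings):.1f} (n={len(ratings)})")
```

Keeping the same questions and scale for every app you test makes the resulting means directly comparable.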

Conduct the usability test. The basic elements of a simple usability test are as follows:

  • Load the app on one or more mobile devices that can be used for testing. Ideally, the participants should already be familiar with the mobile device type(s) (e.g., iPhone or Android phone) and its basic features.

  • Instruct the participants to attempt to perform the scenarios without assistance, “thinking out loud” as they go.

  • Each participant should answer scenario-specific questions as each scenario is completed, and fill out the general questionnaire at the very end.

Try to provide a test environment that closely simulates (or is) the actual usage environment and typical usage conditions.

  • Consider screen orientation. Does a certain usage scenario presume use in portrait or landscape mode? Test tasks in both modes if possible.
  • Consider how the user will hold the device. Is it likely to be used with one hand or two?
  • For any text entry, consider whether users will type with one finger, two fingers or both thumbs. Will this affect performance?
  • Are there any other environmental requirements? For example, if the device is to be used in surgery, test the use with a case/cover that meets surgical standards and use with surgical gloves.

As an “observer” or usability test facilitator:

  • Record the time it takes for the participant to complete each scenario.
  • Record key comments made during the test (without interrupting the user).
  • Record whether the participant successfully completed each scenario.
  • Track common errors made by participants.
  • Provide each participant with the post-test questionnaire at the end of the test.
  • Aggregate and synthesize the observations, comments and responses.
  • If working alone, consider using low-cost video recording so you can go back and review the sessions after the live study. There are many free and low-cost software tools that let you use a webcam and a laptop to capture the sessions.
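The facilitator's raw notes can be aggregated with a few lines of code once each session is logged as a record. A minimal sketch, in which the scenario names, timings and error labels are all hypothetical:

```python
from collections import Counter
from statistics import mean

# Hypothetical session log: one record per (participant, scenario) pair,
# capturing the measures listed above.
sessions = [
    {"participant": "P1", "scenario": "refill request", "time_s": 62,
     "completed": True,  "errors": ["wrong dose field"]},
    {"participant": "P2", "scenario": "refill request", "time_s": 88,
     "completed": False, "errors": ["wrong dose field", "lost navigation"]},
    {"participant": "P1", "scenario": "review ECG", "time_s": 45,
     "completed": True,  "errors": []},
    {"participant": "P2", "scenario": "review ECG", "time_s": 51,
     "completed": True,  "errors": ["lost navigation"]},
]

def summarize(sessions):
    """Per-scenario mean time and completion rate, plus the most common errors."""
    by_scenario = {}
    for rec in sessions:
        by_scenario.setdefault(rec["scenario"], []).append(rec)
    summary = {
        name: {
            "mean_time_s": mean(r["time_s"] for r in recs),
            "completion_rate": sum(r["completed"] for r in recs) / len(recs),
        }
        for name, recs in by_scenario.items()
    }
    # Tally every logged error across all sessions to surface recurring problems.
    common_errors = Counter(e for r in sessions for e in r["errors"])
    return summary, common_errors.most_common(3)

summary, top_errors = summarize(sessions)
print(summary)
print(top_errors)
```

A per-scenario summary like this makes it easy to compare two candidate apps on the same scenarios, and the error tally points you to the issues worth raising with the vendor.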

Incorporate the usability findings into your overall assessment of the app. Whatever app you choose, provide feedback to the vendor to help them improve their product. If possible, also participate in future vendor usability studies and user-centered design activities. It will give you direct input to product improvements and improve the likelihood of their success as well as your own.