MediLinker: Healthcare 2.0

Switching to a new doctor can be a bit of a headache (no pun intended) - so much change, not to mention the thick stack of new-patient paperwork. MediLinker is a healthcare application that aims to solve the problem of fragmented medical information while also providing patients with identity management and automated health records. At its heart, MediLinker grants patients autonomy over their own data: they can create digital versions of their identity credentials and medical information and then authorize selected clinics to use that data (say goodbye to carrying around, and potentially losing, sensitive documentation!). Additionally, MediLinker runs on blockchain technology to better secure patient information.

Impact: Our research and recommendations yielded a 30% improvement in user task completion rates.

The Basics

  • The Client

    Dell Medical School at The University of Texas at Austin

  • Role

    UX Researcher

  • The Team

    Ashrai Jha

    David Kerr

    Dishanth Shankar Reddy

    Patrick Sui

    Tara Rastogi

  • Skills

    Heuristic Evaluation

    Affinity Mapping

    Competitive Analysis

    Qualitative Research

    Test Documentation (Screeners, Moderator Scripts, & Participant Packets)

    User Interviews / Usability Testing

    Interview Moderating & Notetaking

    Data Analysis

    Presentations

Project Overview

Client Requirements:

When our team connected with the client, Dell Medical School (Dell Med), their application - MediLinker - lacked any UX research or design foundation. Their development team turned to us for help with the app’s UX design, to be informed by extensive user research and usability testing. The Dell Med team also wanted our recommendations in the following areas:

  • The digital wallet angle of MediLinker

  • How blockchain terminology might impact user trust

  • How the app can address current and future needs

  • How to make design elements more intuitive for both patients and clinics

Methodology:

  • We started with the client kickoff to get a general sense of MediLinker, as well as the key areas of focus.

  • We sectioned the low fidelity (lo fi) version of the application into user-focused scenarios and conducted a heuristic evaluation on each.

  • We performed a competitive analysis on other applications in the market to learn more about the gap(s) MediLinker might fill.

  • The heuristic evaluation and competitive analysis shed light on several key points, which we shared with the development team - who used our recommendations to create a high fidelity (hi fi) version of the app.

  • With these new screens in hand, we created a prototype for our impending usability testing.

  • In order to recruit participants, we wrote up a screener based on MediLinker’s key audience criteria, as well as a moderator script and participant packets for the actual test sessions.

  • Following testing, we analyzed the data we had collected and then used these insights to present a finalized set of recommendations to the Dell Med team.

Client Kickoff

The project began with the client kickoff meeting, in which the dev team walked us through the MediLinker app. They pointed out various functionalities and highlighted specific concerns they wanted us to prioritize, in addition to mentioning their vision for MediLinker in the future.

Following the meeting, the dev team sent over the lo fi version of the MediLinker app, which we organized into user-oriented scenarios, some of which are pictured below.

Lo fi screens for scenario (1) - the onboarding process

Lo fi screens for scenario (2) - first enrollment at a clinic

Lo fi screens for scenario (3) - sharing data between clinics

Heuristic Evaluation

After organizing the screens into task-based scenarios, we conducted a heuristic evaluation, based on Jakob Nielsen’s 10 Usability Heuristics for User Interface Design.

We combed through each scenario, noting heuristic violations along with our recommendations, and assigned each violation a severity rating.

We then sorted the violations for each scenario according to Nielsen’s 10 heuristics:

Competitive Analysis

We wanted to further understand the market space that MediLinker was looking to fill, in addition to its competition. We researched and analyzed several companies before dividing them into two groups: direct and indirect competitors.

You can view the behind-the-scenes of this research and analysis HERE.

Portfolios can be beautiful and informative, but behind all the color schemes and clean lines are hundreds of screenshots, scattered post-its, and a lot of mess - the real work, one might say :)

We compared and contrasted feature functionalities and heuristics between each competitor and MediLinker and put together a matrix for each. The feature matrix focused on general information about each app, in addition to its features, market placement, pros/cons, and similarities to MediLinker. The heuristic matrix, meanwhile, focused on the heuristics of each competitor, which we mined for ideas to address MediLinker’s most common heuristic violations.

Feature matrix:

Heuristic matrix:

Midpoint Check-In

We presented our preliminary findings and recommendations to the dev team, based on the three most frequently occurring heuristic violations:

Per our report, the Dell Medical team revised the initial app mock-up and created the next iteration of screens, some of which are pictured here:

The complete midpoint report, containing our heuristic evaluation and competitive analysis, is available HERE.

Usability Testing

Participant Recruitment:

With the iterated screens in hand, we were just about ready to conduct usability tests. The purpose of usability testing was to assess the quality of the task flows, how intuitive the app and its tasks were, where difficulties lay, and general user sentiment toward MediLinker.

To get this process started, we recruited participants using the screener we developed. This screener enabled us to find participants who fit the demographics we were looking for, such as age, tech savviness, number of yearly clinical visits, and blockchain knowledge.

View participant screener HERE.

Our final participant demographic breakdown is pictured below:

Test Design:

With our participants recruited and the screens assembled into a testable prototype, we created both a moderator script and a participant packet for the usability tests. The moderator script provided an overview of MediLinker, a pre-test questionnaire, the participant tasks, and a post-test questionnaire. The participant packet, meanwhile, was developed for participants to refer to throughout the test sessions; it consisted of the tasks they would complete with the prototype, a Likert scale to score each task and the app overall, and space for other feedback.

See here for moderator script and participant packet.

Findings / Analysis / Recommendations

Overall Findings:

System Usability Scale (SUS) scores spanned the entire range, indicating that users had extremely varied feelings about MediLinker.

This spread largely tracked participant age. We found that scores for older users (60+) tended to be lower than those of younger users (under 40), which suggested a generational divide in how participants understood the app’s UI.

Our third task had a 33% failure rate that seemed to impact the subsequent fourth and fifth tasks, indicating critical issues with MediLinker’s UI.
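For readers unfamiliar with SUS scoring: each participant answers ten 5-point Likert items, and the responses are converted to a 0-100 score. As a minimal sketch (assuming the standard Brooke scoring formula; this is illustrative, not the exact spreadsheet our team used):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    Likert responses (each 1-5), using standard SUS scoring:
    odd-numbered items contribute (response - 1), even-numbered
    items contribute (5 - response); the sum is scaled by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i=0 is item 1 (odd-numbered)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical participant: agrees with positive items, disagrees with negative ones
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

Scores cluster around 68 as the commonly cited average, which is why a full spread from low to high scores signaled such divided sentiment.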

Analysis:

Along with analyzing trends pertaining to the app as a whole, we also looked into each task. To focus more granularly, we created an affinity diagram by task to identify key issues.

The affinity diagram allowed us to identify issues that came up repeatedly:

Recommendations:

Based on our analysis, we recommended the following to the dev team:

Key Takeaways

Live user testing:

Each of our participants reacted differently to the MediLinker app. Being present for these interviews allowed us to observe their behavior, ask direct questions, and understand their thoughts - and, ultimately, to find meaningful patterns across sessions.

Communication is key:

Our five-person team worked together for several weeks, often dividing up tasks and later combining information. Due to the nature of this project, with many moving pieces, it was critical that we communicated well (and even labeled things clearly in Figma) to keep things running smoothly.

Moderating is an art:

It became apparent to me right away that moderating a user test is much like juggling - one must keep the user on track without providing too much information, efficiently manage time without rushing the participant, and be able to handle unexpected situations while taking thorough-enough notes to later analyze. This project provided great experience, and I am relieved to know that moderating simply takes practice, which I look forward to diving into!

Final Report