
UX Design, UI Design, UX Research

Student loan match

Role

UX Design Lead

Start

January 2021

Launch

September 2021

Product

Student loan employer benefits platform

00.-

Introduction

By the end of this case study, you’ll see how we implemented an interface that supported 6,234 new program participants and disbursed $9.8 million in employer matching contributions toward employees’ student loans in 2021. Projections for 2022 rise to $16.7 million in employer matching contributions.

10,000

new eligible users

6,234

program participants in 2021

$9.8 mil

employer matching contributions in 2021 (USD)

$16.7 mil

projected employer matching contributions in 2022 (USD)

01.-

Challenge

Project background & challenge

This project was born out of a request from our student loan benefit product’s largest, most influential client to date, who promised us a pool of about 10,000 eligible users.

The financial benefits technology that I was working on at the time provided a platform for organizations to make direct contributions to their employees’ student loans. This kind of employee benefit, although currently less common than benefits such as tuition reimbursement, has been growing in popularity and stands to give employers a competitive edge, all while serving an unmet need within the American workforce. In the United States, the average amount of student loan debt per borrower was $28,950 in 2022 (Forbes, 2022).

Our platform had both participant-facing and administrator-facing interfaces, which allowed both user types to customize and track details that were important to them. We had already built out a number of program customization options such as custom amounts, custom payout schedules, and the ability to include factors like employee tenure in the benefit payout structure. However, this client was requesting a program type that allowed them to match their employees' contributions toward their student loans, up to a customizable cap. I was placed in charge of the UX design of the participant-side interface.
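To make the matching mechanic concrete, here is a minimal sketch of the core rule; the function name and amounts are hypothetical illustrations rather than the platform’s actual payout logic, which varied by program configuration:

```r
# Hypothetical sketch of the core "match" rule: the employer matches the
# employee's verified student loan payments, up to a customizable cap.
# Function name and amounts are illustrative, not the platform's real logic.
match_amount <- function(verified_payments, cap) {
  min(sum(verified_payments), cap)
}

# Three verified monthly payments against a $600 program cap
match_amount(c(250, 250, 300), cap = 600)  # returns 600: capped
match_amount(c(100, 150), cap = 600)       # returns 250: fully matched
```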

The primary challenge would be to design, develop, and release an entirely new program type for our benefits platform, one involving multiple complex processes and communications we had never handled before, such as the submission and verification of student loan payment documentation required for amounts to be matched. Beyond this primary challenge, the client also wanted us to implement a variety of customizations to fit their needs for the program, including two different payout structures, custom messaging and announcements, and more. The solutions for all of these requests needed to launch by September 2021, which left the team and me about eight months to plan, design, user test, iterate, receive compliance approval, implement, QA test, and release the features. As always, my main focus as a UX professional was ensuring the best possible experience for our users, but we also had to balance that against client desires and shifting customization requests.

Read on to find out how we were able to implement a top-notch solution on time, and why, in the client's own words, it was "flawlessly executed"!

02.-

Stakeholder kickoffs

Project kickoff & dev collaboration

We first held a kickoff meeting with product, development, and operations stakeholders where we discussed requirements for MVP, feasibility, resourcing, and project timeline at a high level. At this stage, the developers and I were most concerned about the overall backend infrastructure of this new program, because we knew it was our most disruptive and complex product configuration to date. Most notably, we knew the success of this feature was going to rely heavily on our home-grown proof-of-payment submission and validation system; until we were all on the same page about how that was going to work, we wouldn’t be getting very far.

Based on stakeholder discussions, I created a flowchart to visualize the movement and validation of information and to get us thinking about the many states and user paths we would need to address. I held collaborative flowcharting and brainstorming sessions to finalize this chart, map out a final information architecture, and verify with the developers that this architecture was both feasible and desirable for MVP. We referenced this shared resource throughout the project, which kept everyone aligned.

03.-

Design

Design iterations, round 1

I created a few different design iterations based on this information architecture, combined with competitive research, my UX expertise, and requirements from Product. In my competitive research, I could not find another platform offering a matching program structured like this for student loan contributions; the most comparable experiences were employer donation matching, tuition reimbursement, and employee expense platforms. Despite the lack of fully comparable competitor interfaces, I uncovered aspects of other financial platforms that could serve as inspiration, such as components for tracking financial balances up to a cap (credit card limits, health insurance deductible limits, 401(k) tracking, etc.). Although this research inspired some of the initial design iterations, the lack of direct competitors meant I had a largely blank canvas to work from, which both challenged and excited me.

I held a collaborative review session of my first round of designs with the developers to address feasibility and to gather any considerations I may have missed. My usual technique at this stage is to present 2 to 3 UX approaches to the developers, ranging from lowest to highest development effort. This lets me get a pulse-check on which solutions are realistic for us to pursue, and it also gives the developers a voice in the process.

You can see some aspects of these initial iterations below. (Normally, I prefer to go lower-fidelity for first-round design iterations, but due to the relatively tight timeline and the ease of use of the design system I had built for the product, I chose to go higher-fidelity with the screens from the start.)

04.-

Research

User research, round 1

The next step was to conduct user research to determine which iterations were most usable and to identify any pain points and areas for improvement. Through discussions with developers and peer reviews with product and design stakeholders, I had narrowed the designs down to two main approaches; the proof-of-payment correction processes differed in look and structure between approach A and approach B.

Since we had two slightly different UX approaches on our hands, I created two separate prototypes and used a between-subjects A/B study design. I organized, planned, and moderated 11 think-aloud usability sessions with users of our existing student loan contribution product; 5 of these users were assigned to prototype A, and 6 to prototype B. I defined two main goals for the study: [1] determine how usable each of the two UX approaches was, and [2] generate data to help the team decide what needed to be included in MVP vs. what could wait (and for how long).

Participants across both groups were given the same overall scenario and tasks: [1] upload your first payment document for employer matching; [2] correct an invalid payment within a past submission; [3] correct an invalid document within a past submission. These tasks were modeled on the main actions participants could be expected to complete throughout a student loan contribution matching program, covering both the happy path and the unhappy paths. Using these tasks also allowed us to home in on the UX of a capability we felt would set us apart from existing platforms with similar expense-matching functionality: the ability to correct submissions at either the individual payment or the overall document level. Additionally, participants were given a real student loan proof-of-payment document that I populated with mock payment data during the session, helping them complete the tasks as they would in real life.

Due to the think-aloud nature of the usability sessions, I was able to collect rich qualitative data in the form of participant quotes and themes. I also collected three main quantitative metrics for each task:

  1. Task completion time, in seconds
  2. Number of errors (mis-clicks)
  3. Number of times external help was needed

Additionally, I collected after-scenario Likert scale responses to question 4 of the System Usability Scale, which I reworded slightly for our study: “On a scale of 1 to 5, where 1 is not at all confident and 5 is very confident, how confident are you that you could do this process again on your own?” I chose to include this specific SUS question because it addressed stakeholders’ largest concern: that “Match” programs would prompt an unmanageable influx of new support calls and help messages.

I performed both qualitative thematic analysis and quantitative statistical analysis (summary statistics in Excel and nonparametric inferential statistics using R in RStudio). I also put together a presentation for dev and product stakeholders where I shared my findings, insights, and suggestions for changes.
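For readers curious about the mechanics, here is a minimal sketch of the kind of nonparametric comparison this analysis involved; the vectors below are illustrative placeholders rather than the study’s real data. With small independent samples like ours (5 vs. 6), a rank-based test such as the Wilcoxon rank-sum (Mann-Whitney U) test is a common choice:

```r
# Illustrative sketch only: placeholder task-completion times (in seconds)
# for the two between-subjects groups, not the study's real measurements.
times_a <- c(142, 185, 160, 210, 171)       # prototype A, n = 5
times_b <- c(118, 133, 150, 126, 141, 129)  # prototype B, n = 6

# Summary statistics for each group
summary(times_a)
summary(times_b)

# Wilcoxon rank-sum (Mann-Whitney U) test: compares the two independent
# groups without assuming normally distributed completion times
wilcox.test(times_a, times_b, alternative = "two.sided")
```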

05.-

Design

Design iterations, round 2

Based on the user research, I highlighted several main findings and insights:


  1. Requiring manual entry of payment information made user error inevitable. We didn’t have the time or resources to automate the entry of payment information for MVP, but we also didn’t have the time or resources for our operations team to manually handle a flood of user-submitted errors. Solution: Build foolproof error handling and extremely clear feedback messaging for every user error imaginable.

  2. Users skimmed copy even more than I had expected. This led to participants missing important details about the benefits program that would have prevented them from maximizing or managing it effectively in a real-life scenario. Solution: Make greater use of text styling and visual decoration to highlight the most important program details and the most helpful interface capabilities.

  3. Relevant contextual information that persists with the user makes the submission-correction process much more efficient and enjoyable. The lack of these contextual details, on the other hand, increases cognitive load and leads to more frustrating, anger-inducing interactions. Solution: Make sure important details, such as the original reason for invalidation and reference points like payment dates, persist throughout the user’s error-correction process so they don’t have to shoulder the entire cognitive load themselves.

  4. Although payment-level and document-level resubmissions were treated differently on our backend, the correction experience was smoother for study participants when it was as consistent as possible on the front end. Solution: While users needed to be able to tell the two levels of resubmission apart when starting the correction process, the actual meat of the process should keep things as familiar as possible.

I also abstracted the insights learned from this study into 3 main lessons so we might use them for UX design decision-making in the future:


  1. Users should not have to “memorize” information that was originally correct and re-enter it.
  2. Users should not have to exit a flow and lose their progress to check on information necessary for completing their task.
  3. We should make every effort possible to limit the chance of invalid or incorrect data ever making it into a validation queue; instead, we should focus on helping users correct errors before they submit.

You can find some examples of improvements made to the designs based on these insights below.

06.-

Handoff & launch

Dev handoff and project launch

I participated in some final rounds of UX designer peer review and finalized the desktop, tablet, and mobile designs. At this point, my main focus was prepping the final designs for developer handoff. I broke my designs down into Jira-ticket-sized chunks and labeled them by ticket number, adding contextual information, explanations, and specs wherever needed. I had been designing for tablet and mobile breakpoints alongside the desktop designs, but I made sure to fill in any gaps and to clearly call out any differences between desktop and smaller screen sizes, whether larger differences in component structure or “micro” differences in animation style. File paths to the CSS were already attached to pre-existing components via the design system; any net-new components were marked so the devs knew to add them to the codebase.

We were able to launch the MVP on schedule, and the client was extremely pleased with the launch. Even with a tight timeframe, we were still able to include features that differentiated us from existing platforms, such as the ability for participants to edit and resubmit information at both the payment and document level; clear feedback and next steps for every user state the participant may encounter; auto-populated "recommended documentation" messaging based on the loan servicer selected; optimized animations with screen size in mind; the ability to undo when a user makes a mistake; and much more!

07.-

Research

Usability testing, round 2

After the successful launch of MVP, I ran an additional round of usability testing that was more evaluative than the first and focused on the higher-complexity quarterly payout structure the client had requested for January 2022 (vs. the end-of-year payout structure of the 2021 program). For this round of research, I used UserTesting.com to run unmoderated usability sessions with 50 full-time working Americans who currently had student loans. Because of the relatively large sample size, I mainly used quantitative analysis of task completion times, error rates, and Likert scale responses to evaluate how usable the quarterly program’s interface was. Fun fact: A 99% confidence interval was used to determine the significance of results for the inferential stats tests, which is the highest interval I’ve ever used in both business and academic contexts!
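To illustrate what testing at that threshold can look like, here is a sketch using the same rank-based approach as before; the simulated samples and variant labels are hypothetical, not the study’s actual data:

```r
set.seed(42)  # reproducible simulated data, for illustration only

# Hypothetical task-completion times (seconds) for two interface variants
variant_a <- rnorm(25, mean = 95, sd = 20)
variant_b <- rnorm(25, mean = 130, sd = 25)

# Rank-sum test with a 99% confidence interval on the location shift
result <- wilcox.test(variant_a, variant_b,
                      conf.int = TRUE, conf.level = 0.99)

# At the 99% confidence level, a result is significant only if p < 0.01
result$p.value < 0.01
```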

This round of testing allowed us to really home in on where the areas of confusion would be for the quarterly student loan contribution matching program and to identify specific solutions, all the way down to wording updates. The main outputs of this study were: [1] the finding that adding a notification center SIGNIFICANTLY improved user understanding and successful participation in the program, and [2] the identification of a few key places where added copy and visuals could really clarify the participant experience.

08.-

Impact

Impact & conclusion

The launch of this new program type and its user-friendly experience has already proven to be a great boon for both the business and the program participants. Managing stakeholders on our team even described the project internally as the “largest, most well-tested release we’ve ever done”! We had a total of 6,234 program participants in 2021 and disbursed a total of $9.8 million in employer matching contributions to employees’ student loans. Projections for 2022 rise to $16.7 million in matching contributions. As I mentioned before, the client described the launch as “flawlessly executed”, and it has led to a happy and healthy continuing partnership. Additionally, the notification center feature launched this year and has given us an extremely effective way to send proof-of-payment submission reminders, as well as general announcements and updates, to users across all of our program configurations!