UX Research

May I have your SSN?

Role: UX Research Lead

Start: Mar 2022

Readout: Apr 2022

Product: Mobile investment benefit app

00. Introduction

By the end of this case study, you'll see how I:

[1] Used mixed-methods competitive research to investigate strengths and weaknesses of mobile investment app signup processes; and

[2] Used my visual design and presentation skills to effectively communicate my research insights back to the business.

2 study phases

3 weeks from start to finish

5 competitor apps tested

56 study participants

01. Challenge

Project background & challenge

I led this study while specializing as a UX Researcher at a large financial services company. The UX design and product stakeholders who came to me were in the process of redesigning our employee equity benefit product’s mobile application and were hoping for answers on how to optimize the signup process in terms of launch, layout, breadth, and detail. They were particularly concerned about getting the design of this process right because signing up for an investment product usually requires the user to enter far more information than signup for products in other domains — and much of this information, such as someone’s Social Security Number (SSN), could be considered extremely private and sensitive.

The organization I worked at had a number of employee financial benefits platforms under its umbrella, and one of these products was an equity management platform. The platform was already popular and successful among many notable private and public companies across the globe; however, as with many established financial organizations, its interface had become antiquated and cluttered, with the look typical of decade-old financial webpages, and was in need of a revamp.

The mobile app for the platform was a lot more modern and minimalist, as it had been developed much more recently than the web version. However, the signup and onboarding process for mobile users was still lagging behind, especially because users were forced to exit the app and complete signup in a mobile browser. As part of the revamp of the entire platform, the PM and UX designers in charge of the mobile app redesign came to the UX Research Team with a variety of questions about the best way to launch and lay out the signup process for an investment benefit app. Their primary question was how much friction a web-based launch added to mobile signup compared with keeping the process completely native; additional questions concerned the optimal number of steps for investment app signup, the best mobile layout for those steps, and which general pain points to avoid.



02. Stakeholder kickoffs

Project kickoff

I first organized a kickoff with the UX Design and Product stakeholders in order to iron out specific project details. Prior to officially starting each research project, I like to solidify:

[1] Research goal — Once we iron out the study’s research goal, it serves as an overarching compass for the study design, the questions asked, the data collected, the analyses run, and the final readout presentation. I usually try to stick to just one high-level research goal per study, to limit the number of variables being considered at once; however, you can see in the image below that in the case of this study, we ended up with two research goals due to the number of research questions stakeholders wanted insights on.

[2] Administrative considerations — I have found that balancing administrative considerations for a study is almost always the determining factor in how happy and satisfied stakeholders are with the final readout. These considerations include details like the timeline (especially if there is an upcoming deadline for the larger business or a feature release date to shoot for); key points of contact and subject matter experts for specific areas of the product; any other teams/projects that depend on the outcomes of the study; affected clients; and so on.

Through kickoff discussions, I was able to finalize the two research goals below with the study’s stakeholders. I also determined that, although there was no official deadline attached to the design work yet, this study was situated prior to any UX Design ideation or iteration in the project timeline. This meant that it would be most advantageous to have insights ready as soon as possible, so they could be used to inform beginning UX Design iterations. As a result, we settled on a timeline of about three weeks from the start of the study design to the final research readout.

03. Study design

Since we ended up with two research goals, I advised the stakeholders that it would be in our best interest to separate the study into two “phases” — this would ensure that we could cover all of the variables we were interested in (launch style, number of signup requirements, step layout, etc.), but still be able to attribute study results back to specific variables with a reasonable degree of confidence. I therefore designed the study to include the following phases:

[1] Focused on Research Goal 1: Launch of signup — I chose to separate the analysis of signup launch from the rest of the analysis mainly because I felt we would need to test signup launch with live competitor mobile apps. This would ensure that we were giving study participants a completely realistic feel of “launching” a signup process, including associated interface/app changes and load times.

  1. Methodology: Unmoderated, think-aloud, between-subjects A/B usability tests of the beginning of three different live competitor mobile apps’ signup processes. I chose to include three different platforms based on findings from a brief first-person research phase, where I uncovered that there were three main signup launch styles: completely native, embedded webpage (I referred to this as “hybrid” launch style), or external webpage (I referred to this as “web” launch style).
  2. Sample: N = 36 United States residents who had used an investment app within the past year. Participants were split into three groups of 12, and diversity in ethnicity, gender, and age (comparable to the diversity of our product’s users) was ensured for each group.
  3. Study scenario: All participants were asked to interact with the first few steps of the signup process as they would in real-life (there was no “mock” scenario given).
  4. Qualitative data & analysis: Raw transcripts were captured based on the think-aloud protocol used during the test. Transcripts were then thematically coded (inductively) after data collection.
  5. Quantitative data & analysis: Task completion times and signup launch times were both recorded. Each participant was also given the full System Usability Scale (SUS) questionnaire at the end of their session (scoring is sketched after the phase descriptions below).

[2] Focused on Research Goal 2: Process of signup — It was important that we separate the data collection and analysis for the actual process of signup, because the full signup process for investment apps requires the user to enter quite a bit of Personally Identifiable Information (PII), and we did not want study participants to enter their own information during the recordings. This ruled out having participants test with live competitor apps, since they could not easily substitute dummy data in that case. However, I still wanted to preserve as much realism as possible, so I created prototype versions of the competitor apps we wanted to test in Figma and used the Anima plugin to make the text fields accept real keyboard entry, as they would in real life.

  1. Methodology: Unmoderated, think-aloud, between-subjects A/B usability tests of two different competitor mobile apps’ full signup processes.
  2. Sample: N = 20 residents of the United States or Canada who were employed full- or part-time, or who were unemployed and seeking employment. Participants were split into two groups of 10, and diversity in ethnicity, gender, and age (comparable to the diversity of our product’s users) was ensured for each group.
  3. Study scenario: All participants were provided with a story-like scenario, where the order and exact details of the signup steps could not be guessed. Signup was framed in the context of signing up for a company benefit, to most closely mirror the context of use of our own product.
  4. Qualitative data & analysis: Raw transcripts were captured based on the think-aloud protocol used during the test. Transcripts were then thematically coded (inductively) after data collection.
  5. Quantitative data & analysis: Task and scenario completion times were recorded. Each participant was also given the After-Scenario Questionnaire (ASQ) twice during their test: once at a “checkpoint” about midway through signup, and once at the end of signup. (Scoring for both the SUS and the ASQ is sketched below.)
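Both questionnaires are scored with simple arithmetic, and since the study’s quantitative analysis was done in R, a small R sketch is the most natural illustration. The function names and example responses below are purely illustrative and are not taken from the study materials.

```r
# Illustrative scoring helpers for the two questionnaires used across the phases.

# SUS: 10 items on a 1-5 scale. Odd-numbered (positively worded) items contribute
# (response - 1); even-numbered (negatively worded) items contribute (5 - response).
# The summed contributions are multiplied by 2.5 to give a 0-100 score.
score_sus <- function(responses) {
  odd  <- responses[c(1, 3, 5, 7, 9)] - 1
  even <- 5 - responses[c(2, 4, 6, 8, 10)]
  sum(odd, even) * 2.5
}

# ASQ: 3 items on a 7-point scale; the overall score is simply the mean of the items.
score_asq <- function(responses) {
  mean(responses)
}

score_sus(c(4, 2, 5, 1, 4, 2, 5, 1, 4, 2))  # 85
score_asq(c(2, 3, 2))                       # ~2.33
```

Per-participant scores like these are what get compared between groups in the Analysis section below.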

04. Prep & launch

Prototyping, prep, & launch

Based on the study design set forth in my research plan, I then prepared and launched each study phase. The two phases were similar in that both were set up on UserTesting.com as between-subjects, unmoderated, mobile usability sessions with think-aloud protocols; however, there were a few distinctions between them:

[Phase 1] Focused on Research Goal 1: Launch of signup — For this phase, I needed to use three live versions of competitor apps, which warranted a slightly different setup flow on UserTesting.com. To help narrow down which apps we would use, I initially did some first-person research by downloading ~60 investment-focused mobile apps from the iOS and Android app stores and testing out their signup processes. I selected three apps that were alike in that they asked for minimal PII during the first few steps of signup; however, each app had a different launch style, based on the three styles discussed in the previous “Study Design” section (native, hybrid, and web). When setting up the tests for this phase, I used UserTesting.com’s capabilities to blur the recording during the tasks where PII was being collected (limited to email address and password). I also advised users within the test instructions not to speak their PII aloud while typing it, and to use a different password than they had used for other financial apps in the past.

[Phase 2] Focused on Research Goal 2: Process of signup — For this phase, I knew I needed to make prototype versions of the competitor apps being tested, so we could accept dummy data in place of the sensitive PII requested during investment app signup, like someone’s Social Security Number (SSN). However, I also knew that we needed to keep the look and feel of the apps as realistic as possible, so we could get more accurate and confident results along dimensions like task completion times. As a result, I created high-fidelity prototypes of the signup processes for two different competitor apps using Figma. I then utilized the Anima plugin to create “type-able” text fields that worked even when the prototype was opened on a mobile device. The two apps I modeled my prototypes after were selected based on the first-person research mentioned previously; this time around, however, I was looking for two apps that had the same number and nature of signup requirements, but that used two completely different layout approaches and UI styles (I referred to these as “minimalist” vs. “traditional”). You can see a couple of animated examples of how the two final prototypes looked to users completing the test below.

For both study phases, I ran a pilot test for each respective competitor app being tested, just to make sure the study design itself was clear, balanced, concise, and user-friendly, and that we were not asking too much or too little of participants.

"Minimalist" signup process — Prototype excerpt
"Traditional" signup process — Prototype excerpt

05. Analysis

Qualitative & quantitative analysis

Prior to the readout, I performed qualitative and quantitative analysis on the data collected for each phase of the study. Qualitative analysis involved inductive thematic coding of study participants’ question answers and think-aloud comments. Quantitative analysis involved running nonparametric inferential statistics (mainly Kruskal-Wallis and Wilcoxon rank-sum tests, along with effect size estimates) in RStudio using the R programming language. These nonparametric tests were used to compare task completion times, launch times, scenario completion times, SUS scores, and ASQ scores between groups in order to find significant differences between competitor apps.
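To make this concrete, below is a minimal sketch of what these between-group comparisons could look like in base R. The data frame and column names (phase1, sus, launch_style, and so on) are hypothetical placeholders, not the study’s actual variable names.

```r
# Minimal sketch of the nonparametric comparisons, using base R only.
# Assumes a data frame `phase1` with one row per participant:
#   sus          - SUS score (0-100)
#   launch_style - factor with levels "native", "hybrid", "web"

# Omnibus comparison of SUS scores across the three launch styles
kw <- kruskal.test(sus ~ launch_style, data = phase1)

# Epsilon-squared effect size for the Kruskal-Wallis test: H / (n - 1)
eps_sq <- unname(kw$statistic) / (nrow(phase1) - 1)

# Pairwise follow-up comparisons with Holm-adjusted p-values
pairwise.wilcox.test(phase1$sus, phase1$launch_style, p.adjust.method = "holm")

# Phase 2 compared only two groups, so a single rank-sum test per measure suffices,
# e.g. for scenario completion times:
# wilcox.test(scenario_time ~ layout, data = phase2)
```

The same pattern would apply to the other measures (task completion times, launch times, and ASQ scores); only the outcome column changes.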

06. Insights

Communicating insights back to the business

I distilled my study design, process, analysis, and main insights into a slide deck and brought all relevant Product, UX, and interested dev stakeholders together for a readout. I additionally created a “Cheat Sheet” for the main insights and study results, which can be found below. No one on my UX Research Team had created this type of document before, but I felt it was warranted given the relatively complex setup and nature of the study. Additionally, I created it so that it could be easily read and referenced by stakeholders while working on future designs (a much more practical alternative to flipping through a 30-slide research readout PowerPoint).

I presented four main insights to key stakeholders based on the study:

[1] When in a mobile app, forcing signup to launch in a separate browser app is an unideal UX; however, the difference in perceived experience between a hybrid (in-app browser) launch and a fully native signup launch is comparatively small.

[2] When it comes to financial app + company benefit signup, participants expect a significant amount of information to be requested and seem open to a slightly longer or more complicated process – as long as help is available as needed and they are assured their information is safe.

[3] Fewer steps per page is the generally preferred layout, but this should be balanced with accessible help information for those who are less experienced.

[4] An approach like Tornado’s (“minimalist”) appeared to be the most equitable option in terms of ease of use across all age groups.

You can read about some of the specifics and main findings behind these insights and more within the Cheat Sheet below!

07. Impact

Impact & conclusion

It is hard to report measured impact for this study because much of the impact of UX research that occurs this early in the design process manifests as risk avoided or user friction that never makes it into a live product. However, the execution of this research project undoubtedly increased stakeholder confidence in the direction they wanted to take the mobile signup designs. Key takeaways on this front included the discovery that native and hybrid (embedded web) signup launch styles were seen as almost equally usable (while an external web launch was to be avoided). Additionally, the finding that fewer signup requirements per mobile page (the minimalist approach) were generally better received, and made a significant, positive difference in usability for our 44+ year-old users in particular, really set the tone for the UX design approach. Findings such as users’ openness and understanding when providing sensitive personal and financial information while signing up for an investment benefit eased the entire team’s anxieties about the relatively high amount of information that we request from our users. Finally, the new strategy of creating a Cheat Sheet for more complex studies made our UX research insights more accessible, engaging, and widespread across the organization.