
Remote usability study

Onboarding flow

Client: The Wall Street Journal
Year: 2020
Project brief: Identify friction points in the current design. Analyse affordance, motivation and ease of use
Scope: Usability study script, data analysis, basic heuristic evaluation
Deliverables: Usability study data analysis with actionable insights that can be used to inform an experiment (A/B test) roadmap.
My role: Researcher & data analyst
Tools: G-suite
Effort: 55h

The client

The Wall Street Journal (WSJ) is one of the largest newspapers in the United States by circulation. Its online arm operates primarily on subscriptions, with additional revenue generated from advertising.

In recent years they have invested in optimising all parts of their customer journey - from acquisition to onboarding, to retention, as well as the cancellation journey. This particular project focused on the onboarding flow.


The project

WSJ had recently launched an onboarding flow consisting of 15 steps. Previous analyses had shown that customers spent as little as 3 seconds on a screen and that engagement with on-page elements was low. WSJ were looking to understand why that might be and how it might be improved.

Why is engagement on pages so low and what could increase engagement?

The challenge

As well as understanding engagement, the WSJ account manager had informed me that the separate WSJ teams were interested in directing readers to their respective parts of the product, to engage them with content that directly affects their KPIs. So this project was not only about customer engagement; it was also about educating the client on customer centricity and shifting focus from output-focused KPIs (x% landed on my product page) to outcome-focused KPIs (y% of onboarded customers engage with content).

The team

For this project I worked on my own. However, my work was reviewed by an outsourced Senior Researcher who had previously worked with House of Kaizen on WSJ projects.


My process

1. Analyse existing data, develop hypotheses.

As I was new to the project and unfamiliar with what may have been done in the past, I set out to understand the existing research.

I learnt that, curiously, customers were spending more time at the beginning and the end of the onboarding flow and skipping the steps in between. Upon closer inspection of the steps in the flow, I noticed that the steps at the beginning and end were where the customer inputs information about themselves. In contrast, the steps in between were invitations to download the app, read some content, listen to a podcast, or similar.

With this information, I developed a hypothesis: subscribers in the onboarding flow engage with content where they feel the information they provide will personalise their experience.

I also noticed that the steps subscribers spent the least time on seemed to navigate them away from the onboarding flow to other parts of the website. From this I developed my second hypothesis: subscribers in the onboarding flow do not want to be interrupted and want to complete onboarding before browsing the website. This hypothesis ran directly against the stakeholder motivations outlined earlier.

Subscribers engage with content where they feel information provided will personalise their experience.

Subscribers do not want to be interrupted and want to complete the onboarding before browsing the website.

2. Recruit participants

For this step I utilised the testing platform's panel of testers. I knew WSJ had developed personas for their acquisition channel, so I used this information to match the recruitment criteria to the audience panel.


The main criteria were that participants must be US based, employed, and in a higher salary bracket, with a strong interest in money & investing and/or business & finance news. Testers also could not be existing WSJ subscribers. Due to time constraints, it had been decided to limit the scope of this usability study to desktop only.

Participants were given a scenario and a set of tasks to carry out on each of the 15 steps within the onboarding flow.

Participants were asked questions and encouraged to share their thoughts and feelings throughout the journey.

3. Write the script, set up the test

  • I wanted to assess affordance - do subscribers understand what they can do on each of the screens? What do they think will happen when they interact with the elements on the page?
  • I also wanted to understand motivation - is the content on the screen relevant to the mindset of onboarding? Does it appear engaging?

As this was a remote, unmoderated usability study, I used the platform to set up the test, recruit participants, and export data and video highlights.

4. Analyse data, share findings

First, I coded the videos as I watched them within the platform. I then extracted the data and began to analyse it, narrowing in on possible answers to my initial questions.
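The coding-and-tallying step can be sketched in a few lines. This is a minimal illustration only: the observation codes, participant IDs, and data below are hypothetical, not the actual study export.

```python
from collections import Counter

# Hypothetical coded observations exported from the testing platform.
# Each tuple: (participant_id, step_number, code). Codes and data are
# illustrative, not real WSJ study data.
observations = [
    ("p1", 2, "expects_personalisation"),
    ("p1", 6, "would_exit_flow"),
    ("p2", 2, "expects_personalisation"),
    ("p2", 8, "misread_cta"),
    ("p3", 6, "would_exit_flow"),
]

def tally_codes(obs):
    """Count how many distinct participants exhibited each code."""
    # Deduplicate so a participant repeating a behaviour counts once.
    seen = {(participant, code) for participant, _, code in obs}
    return Counter(code for _, code in seen)

totals = tally_codes(observations)
participants = len({p for p, _, _ in observations})
for code, n in totals.most_common():
    print(f"{code}: {n}/{participants} participants")
```

Counting distinct participants per code (rather than raw mentions) is what turns coded video notes into statements like "nearly 20% of participants commented that they would exit the flow".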


[Stat callouts: share of testers who expect tailored content · who try to navigate to content while onboarding · who are unlikely to complete onboarding]

The results

Amongst many findings, here are some key takeaways. An interesting and somewhat surprising one was that all test participants expected that the information they provided at the beginning of the onboarding flow would not only personalise the content they would see later as they browse, but also that the onboarding flow itself would be personalised to their interests. This was not the case.

Participants also found the onboarding flow unnecessarily long and repetitive. By step 6, nearly 20% of participants commented that they would get distracted or bored and would exit the flow. This figure increased to 50% by step 10, and by the end, as many as 75% said they would be unlikely to complete the flow.

As expected, the steps in the middle of the flow, where subscribers are enticed to "Read article now", "Watch videos", or similar, were perceived as distracting by those who understood that clicking the CTA would exit the onboarding flow and navigate them to content pages. In addition, more than half of the participants misunderstood the CTAs: they expected clicking would subscribe them to that type of content for later consumption, rather than navigate them away from onboarding. This shows that WSJ's goals were misaligned with those of their subscribers.

I also observed navigation issues, confusion over page layout, style inconsistencies, and multiple primary CTAs competing for participants' attention. In addition, participants found some of the copy confusing.

Next steps

These and other findings were presented to the stakeholders, and a series of A/B and MVT tests was planned into the roadmap.

Key areas that were addressed:

  • Length of the onboarding flow
  • A redesign of key steps of the flow
  • Delaying other steps to later in the customer journey, surfacing them when they are more relevant
  • Onboarding flow personalisation

Kristine is one of the most thorough people I've worked with in my career. She is an excellent user researcher, with a keen eye for detail and an inquisitive nature. Kristine's ability to draw out actionable insights from varied sets of quantitative and qualitative data never fails to impress me. Her work has been pivotal in constructing impactful testing programmes across a range of clients I have worked on.


Tom Gale

Senior Optimisation Manager // House of Kaizen

Kristine and I worked together on projects for The Wall Street Journal. With Kristine's recommendations of how we might further optimize our client's products, we were able to gather key insight to inform our testing roadmaps and drive significant uplifts in our tests. Many of the recommendations Kristine gathered are the winning experience onsite. I'm confident Kristine will be a major asset to any team she joins.


Trevor Cookler

Senior Product Manager // House of Kaizen