
Design evaluation + AB test result analysis

Donation flow

Client: Shelter charity
Year: 2019
Project brief: Analyse results of an A/B test and help inform the further A/B testing roadmap
Scope: Heatmap analysis, video recording analysis for desktop
Deliverables: Presentation summarising findings and highlighting areas of focus for next iterations of A/B tests
My role: Researcher & data analyst
Tools: Hotjar, Sessioncam, G-suite
Effort: 24h

The client

Shelter is a registered charity that campaigns to end homelessness and gives advice, information and advocacy to people in need in England and Scotland. It raises funds through donations and a large proportion of these comes in via their website. Shelter has been working with the agency I worked for to improve their website to optimise conversion rates.


The project

Shelter had recently redesigned their donation page based on data collected in past research projects. This redesign was then tested against the control and whilst all the previous research suggested it should perform well, results were underwhelming and the optimisation managers and stakeholders wanted to understand why this may have happened.

The challenge

Multiple elements had been changed and tested all at once. Test results were therefore hard to interpret as there were many variables to consider. I was tasked to analyse the data and help the team understand which of the redesign aspects worked and which might need further iteration.

The team

For this project I worked on my own. My work was reviewed by an outsourced Senior Researcher who had previously worked with House of Kaizen and was familiar with past Shelter projects.


My process

1. Extract and analyse heat maps.

I extracted scroll maps to understand how far visitors scrolled down the page and compared this data with the control. This allowed me to identify any content or elements that had become harder to discover.

I then extracted and analysed mouse-move maps to understand how the content of the page was being perceived. The main questions at this stage concerned what visitors might be doing while on the page. Were they reading the copy? I wanted to understand whether the copy was being consumed and how that compared to the control. Were visitors hovering over certain elements? Was something unclear? Where might their attention be directed?

Having analysed scroll depth and mouse movements, I proceeded to the click maps. I was looking to see whether mouse movements translated into clicks, whether visitors showed any anxiety around the suggested donation amounts, and whether more information might be desired. I then compared these findings against the same data for the control.

2. Watch session recordings.

I watched recordings of sessions on the donation page, looking for friction, usability issues or bugs that may have contaminated the test results or unfairly disadvantaged the variation.

I made sure to include an equal number of recordings across all dominant browsers. I recalled that the test had performed particularly poorly on IE, so my hypothesis was that there might be browser compatibility issues. Given Shelter's demographic, such issues would affect a larger than usual proportion of visitors, so I wanted to investigate this.

I was particularly looking for lost clicks, rage clicks and broken CTAs. I was also on the lookout for page refreshes, rendering issues and any problems with scrolling.

I compiled the data from this step by recording the number of occurrences of each issue type in each recording I observed.
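As a minimal sketch of how these tallies could be compiled, the snippet below counts issue occurrences per issue type and per browser. The observation data, issue labels and browser names are all hypothetical, purely for illustration; the real data lived in a spreadsheet.

```python
from collections import Counter

# Hypothetical log of observations made while watching recordings:
# one entry per issue spotted, tagged with issue type and browser.
observations = [
    {"issue": "rage_click", "browser": "IE11"},
    {"issue": "lost_click", "browser": "Chrome"},
    {"issue": "rendering", "browser": "IE11"},
    {"issue": "rage_click", "browser": "IE11"},
    {"issue": "page_refresh", "browser": "Firefox"},
]

# Tally occurrences by issue type and by browser to surface patterns,
# e.g. issues clustering on a single browser.
by_issue = Counter(o["issue"] for o in observations)
by_browser = Counter(o["browser"] for o in observations)

print(by_issue.most_common())
print(by_browser.most_common())
```

Grouping by browser as well as by issue type is what makes a browser-specific failure pattern, like the IE one below, stand out.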

The results

Amongst many findings, here are some key takeaways:

From my analysis I was able to conclude that the redesign had addressed some issues highlighted in earlier research. For example, visitors to the variation scrolled deeper than those on the control. Equally, I observed that the redesigned tabs in the donation module attracted fewer clicks on the preselected tab, which had been an issue for the control. The variation had resolved this issue.

Lost clicks on the section describing how donations were used indicated that visitors might want more information there before committing to donate.

I did, however, notice that changing the position of the image had resulted in decreased engagement with the copy. As a charity, Shelter uses emotional copy to compel visitors to donate, and reduced engagement with that copy would lead to fewer donations, as well as a smaller average donation, if visitors do not connect with the case studies.

discovering issues

The variation improved CTA discoverability, which led to an increased number of donations from the bottom of the page

emotional connection

Visitors to the variation are less emotionally connected, which may lead to fewer and smaller donations


Internet Explorer experienced a significant number of issues

Next steps

As was later discovered, Internet Explorer had not been included in the pre-launch tests. As a direct result of this investigation, that was changed for this client's accounts, since a very large proportion of their users are on IE or older versions of other browsers. This led to a re-evaluation of which tests should be included in future projects. Internet Explorer results were excluded when evaluating the performance of this A/B test, as they were contaminated.
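To illustrate why excluding a contaminated segment matters, here is a sketch of a standard two-proportion z-test run on the full traffic versus traffic with IE sessions removed. All conversion counts and sample sizes are invented for illustration; they are not the actual Shelter test figures.

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is the two-tailed probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical totals: control vs variation, all browsers included...
z_all, p_all = two_proportion_z(480, 10000, 500, 10000)
# ...and the same arms with the broken-IE sessions removed.
z_clean, p_clean = two_proportion_z(430, 8500, 490, 8500)

print(f"all browsers: z={z_all:.2f}, p={p_all:.3f}")
print(f"IE excluded:  z={z_clean:.2f}, p={p_clean:.3f}")
```

With these made-up numbers, the variation's uplift only reaches significance once the contaminated IE sessions are stripped out of both arms, which is the pattern that justified excluding IE from the evaluation.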

Further iterations on the image and copy were recommended. In addition, more information on how donations are spent, and how they contribute to the well-being of those in need, was added in a later test iteration.

Kristine and I worked together on projects for The Wall Street Journal. With Kristine's recommendations of how we might further optimize our client's products, we were able to gather key insight to inform our testing roadmaps and drive significant uplifts in our tests. Many of the recommendations Kristine gathered are the winning experience onsite. I'm confident Kristine will be a major asset to any team she joins.


Trevor Cookler

Senior Product Manager // House of Kaizen

Kristine is one of the most thorough people I've worked with in my career. She is an excellent user researcher, with a keen eye for detail and an inquisitive nature. Kristine's ability to draw out actionable insights from varied sets of quantitative and qualitative data never fails to impress me. Her work has been pivotal in constructing impactful testing programmes across a range of clients I have worked on.


Tom Gale

Senior Optimisation Manager // House of Kaizen