Zilch Acquisition Journey Optimisation

Zilch is a BNPL company that offers customers subsidised credit with no interest or late fees. Think of Klarna, but usable wherever you want, in-store or online, with up to 5% back in rewards.






Zilch has over 5 million customers on its native mobile app and became profitable after just 4 years. When I joined, Zilch wanted to continue this rapid growth and acquire more quality leads while educating new users on how to get the most out of the app.

Contribution

UI/UX Design

A/B testing

User Research

Design System

Timeframe

1 yr, 9mo

The problem

Zilch's acquisition funnel was haemorrhaging potential customers and revenue. 30% of users failed at account creation, only 65% of verified users made a first purchase, and inconsistent UI design undermined trust in the platform. For a credit lending and cashback platform serving 5M+ customers, this represented massive revenue loss and a fundamentally broken first impression.

The constraints

  • Focusing initially on the top of the funnel meant pushing more users into a broken journey while the deeper funnel issues went unaddressed

  • Gaps in event tagging in our onboarding flow meant inconsistent and sometimes unreliable data sets

  • Multiple shifting business priorities impacted our development cadence and some of my team's initiatives

My impact and key outcomes

I led design initiatives for customer onboarding, working closely with my Product Manager and the data, customer service and engineering teams. My focus was to improve the sign-up flow and onboarding education while ensuring our solutions aligned with broader business objectives and company goals.

Key metrics I helped improve

  • +24% relative improvement in overall funnel conversion (13.2% -> 16.4%)

  • -11.4% TTC (time to completion) due to increased clarity and decreased friction

  • -72.5% login friction - reduced from an average of 4 attempts to 1 per 30-day period

Business impact

  • +£3.6M lifetime value increase from core conversion improvements

  • +£1.056M in net profit from a single A/B test (limit display optimization)

  • Increased the lead to first-spend rate from 65% to 79% within 10 days of ID verification

Design Leadership Impact

  • Led the development of the new UI24 Design System and the reskinned onboarding

  • Established a data-driven design process with a custom Mixpanel dashboard for ongoing optimization

  • Led comprehensive UX research across multiple data and user touch points

From chaos to clarity: Prioritizing what mattered most

Early on, we hit a crossroads. While the team focused on driving more leads into the funnel, I noticed we were pushing users into a broken experience, essentially making the problem worse, faster. I knew we needed to change course.


Using data analysis and user feedback, I mapped out the core issues and their cascading impact on the business, then presented this framework to stakeholders to realign our approach.


The prioritization became clear: first, refresh the neglected onboarding flow with updated visuals and copy, then systematically audit the entire experience while running A/B tests to validate improvements. When fraud attempts and bot attacks emerged, this framework helped us confidently prioritize account creation and login issues as our starting point.

  1. Quick wins through strategic A/B testing

  2. Fix account creation and login issues

  3. Improve task completion

  4. Improve education and first purchase

My approach to improving the onboarding experience

Being part of the growth and acquisition team meant that fixing and improving the onboarding flow was an iterative process. To get stakeholder alignment and buy-in, I presented the key concepts in regular feedback sessions.

Good onboarding is like a good waiter - it's there when you need it, invisible when you don't, and leaves you feeling taken care of the entire time. 🤲

1. I started with quick A/B testing initiatives while doing discovery on the main flow

While larger infrastructure changes were being scoped, I identified opportunities for immediate impact through a few key A/B tests. These would test some of our assumptions and deliver business value while I did the discovery on the rest of the funnel.

A/B tests I carried out before we started digging deeper into our funnel issues

  1. Improved credit limit display

  2. New user onboarding walkthrough

  3. Visual consistency and copy changes

Biggest win: Testing a new credit limit screen layout to improve limit understanding

Hypothesis: If we show both initial and maximum potential credit amounts, users will better understand the growth opportunity and be more motivated to complete signup.

Solution: Redesigned the limit display to a "Start with £X, grow to £Y" format. This is a prime example of a small UI change with outsized impact.

Results: +3.8% conversion, £1.054M revenue

We ran the test on a small percentage of users and, once we saw positive results, ramped it up to all users.
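How the ramp-up decision was validated depends on Zilch's experimentation tooling, which isn't covered here. As a generic illustration only, a check along these lines (standard-library Python, entirely hypothetical sample counts) is the kind of two-proportion test usually run before rolling a winning variant out to everyone:

```python
# A minimal sketch of validating a conversion uplift before ramping an A/B test
# to all users. The per-arm sample sizes and conversion counts are hypothetical
# placeholders, not Zilch's real experiment data.
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing conversion rates of control (A) vs. variant (B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b - p_a, z, p_value

# Hypothetical numbers: 10,000 users per arm, old vs. new credit limit screen.
uplift, z, p = two_proportion_z_test(conv_a=1320, n_a=10_000, conv_b=1700, n_b=10_000)
print(f"absolute uplift: {uplift:.2%}, z = {z:.2f}, p = {p:.4f}")  # ramp only if significant
```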

An interesting find: Educating our users revealed how different customer types behaved

Hypothesis: If we communicate Zilch's key differentiators and how the app should be used, new users will have a better understanding of the product and be more inclined to make their first purchase.

Solution: Since we wanted to improve the first-spend metric, I created a welcome walkthrough that showed customers how to use the app and explained the main features.


I prototyped and tested the copy, and interviewed users to understand which features were unclear to them and how I could better structure the walkthrough.

Results: +2% uplift in Spend Ready -> Spend conversion, especially for higher credit limit tiers. Lower tiers dismissed the walkthrough, as they wanted fast access to their credit.

From this I quickly realised that we needed to further personalise the app depending on user type and credit band.

2. Solving the authentication and account creation issues

To understand the problem, I synthesised around 1,000 Trustpilot reviews using AI, read CS tickets, watched user recordings and collaborated closely with my PM and especially the data team to uncover the main pain points:

  1. One-Third of Users Hit an Invisible Wall

35% of applications fail at phone verification because phone numbers become permanently linked to unverified emails with no recovery path.

  2. Login Loops Frustrate Committed Users

Users endure an average of 4 logins before becoming M1 customers, with 76% of these authentication loops occurring during onboarding.

  3. Abandoned Applications Block Future Attempts

Previous incomplete applications prevent returning users from starting fresh, with no mechanism to "reset" their journey after a long time away.

The solution: create a magic-link auth system with improved handling of existing accounts

I carried out extensive competitor analysis, tested different apps and created detailed user flows, collaborating closely with my PM to implement a magic-link authentication system that eliminated the dual-verification cascade. It also allowed us to verify accounts earlier and match them with phone numbers without letting customers go too far into the flow.
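The build itself sat with engineering, so the sketch below is not Zilch's implementation; it's a minimal illustration of the magic-link pattern described above, assuming hypothetical `users`, `tokens` and `email_service` backends and a placeholder URL, showing how a single email-only entry point can cover both login and account creation.

```python
# Minimal sketch of a magic-link entry point. The `users`, `tokens` and
# `email_service` objects and the URL are hypothetical, not Zilch's systems.
import secrets
import time

TOKEN_TTL_SECONDS = 15 * 60  # assumed policy: links expire after 15 minutes

def request_magic_link(email: str, users, tokens, email_service) -> None:
    """Single entry point: the user only supplies an email address."""
    account = users.find_by_email(email)
    mode = "login" if account else "signup"    # decide login vs. account creation server-side
    token = secrets.token_urlsafe(32)          # unguessable, single-use token
    tokens.save(token, email=email, mode=mode,
                expires_at=time.time() + TOKEN_TTL_SECONDS)
    email_service.send(email, link=f"https://app.example.com/auth?token={token}")

def redeem_magic_link(token: str, users, tokens):
    """Verify the token, then either log the user in or start a fresh application."""
    record = tokens.pop(token)   # single use: removed on redemption, None if unknown
    if record is None or record["expires_at"] < time.time():
        raise PermissionError("Link is invalid or has expired")
    if record["mode"] == "login":
        return users.start_session(record["email"])
    return users.create_account(record["email"])   # email verified by clicking the link
```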

The output

  1. Reduced authentication friction using magic link

Users provide just an email address and we intelligently determine whether they're logging in or creating an account, using trusted-device recognition.

  2. Seamless recovery and reapplication

If a phone number is linked to another account, we notify the user securely and if the account is older than a set period of time, we expire abandoned applications.

  3. Enhanced security framework

Secure handling of returning user identification to prevent fraudulent access and better distinction between user states without compromising privacy.

Results:

  1. ~1.5k extra M1s a month, with a 72.5% drop in login attempts from an average of 4 to 1

  2. Dramatically reduced our fraud attempts

  3. Reduced our OTP error rate from 30% to 8%

3. Improving task completion and fixing IDV drop-off

Going through user session recordings, research and data, I discovered that the drop-off at this stage was because we were letting users into the app before they had completed all the necessary tasks (IDV, adding a debit card). This led to confusion and a lot of rage clicks, as you can see in the video below. Here is the breakdown:

  1. Verification Drop-Off Creates Abandoned Journeys

40% of users never attempt IDV within 30 days, with only 65% of verified users making their first purchase within a month.

  2. Premature Storefront Access Causes Confusion

Users are exposed to spending features before verification is complete, creating misleading expectations and frustration.

  3. Credit Bureau Reporting Before Full Onboarding

Credit files are sent when IDV checks occur rather than when users have actually been issued a card and can spend.

Recording of users rage clicking before being fully verified

The solution: keep users in the onboarding flow and improve pending and returning IDV flows

I reached this solution by again working closely with my PM and dev team to improve the flows within our product and tech constraints. The key to solving the drop-off was to better explain why we need IDV, improve the workflow around edge cases by working with the KYC team and, most importantly, keep users inside the sign-up flow until all checks have been completed.
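Zilch's real task model isn't shown in this case study; the sketch below is only an illustration of that gating logic, with an assumed Onboarding state object and task names, showing how storefront access stays locked until IDV, the debit card and the Regulated Credit Agreement are all done.

```python
# Illustrative sketch of the onboarding gate described above. The Onboarding
# dataclass and task names are assumptions, not Zilch's actual data model.
from dataclasses import dataclass

@dataclass
class Onboarding:
    idv_passed: bool = False        # identity verification complete
    debit_card_added: bool = False  # payment method on file
    rca_signed: bool = False        # Regulated Credit Agreement, now signed after IDV

def next_screen(state: Onboarding) -> str:
    """Keep users inside the sign-up flow until every mandatory task is finished."""
    if not state.idv_passed:
        return "idv_pending"                    # improved pending / returning IDV screens
    if not state.debit_card_added:
        return "add_debit_card"
    if not state.rca_signed:
        return "regulated_credit_agreement"     # moved to the end, after IDV
    return "storefront"                         # only now is the user marked as registered
```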

The output

  1. Prevent premature access to the storefront

Keep users within the sign up flow until they resume or finish their tasks. I created improved pending state screens and an easier way for users to get back onto the happy path if something went wrong. This allowed a cleaner welcome and onboarding experience.

  2. Bring notifications forward in the flow

Moving the notifications forward improved our retargeting efforts, which in turn improved the chances of users finalising their tasks.

  3. Move the Regulated Credit Agreement to the end, after IDV

Moving the RCA after IDV allowed us to mark a user as registered only once they had completed all of the mandatory tasks.

Results:

  1. We increased one of our core metrics, lead to Spend Ready, from 65% to 79%

  2. Decreased the number of IDV-related customer service manual reviews by around 5%

  3. Contributed to the overall M1 metric (number of spending customers in a month)

What I learned from this project

This project transformed my relationship with data - I grew comfortable diving into analytics and learned from my team how to leverage insights for smarter design choices. My stakeholder management skills also sharpened significantly through frequent presentations, including several to the CEO.

Some other key achievements:

  • Built a Mixpanel dashboard to track UX-specific metrics such as error rates and TTC (a rough instrumentation sketch follows this list)

  • Helped introduce Smartlook so we could see what customers were actually doing in the app

  • Spearheaded the design of key UI components and the design system, as well as helping launch dark mode

  • Introduced interaction design in a few key areas and implemented Zilch's first motion principles
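For context on the dashboard mentioned above: the actual tracking plan stays internal, so the snippet below is only a rough sketch, using the public Mixpanel Python SDK with made-up event and property names, of how onboarding error rates and completion times can be instrumented.

```python
# Rough sketch of onboarding instrumentation feeding a Mixpanel dashboard
# (pip install mixpanel). Event and property names are illustrative only,
# not Zilch's real tracking plan.
import time
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")  # placeholder project token

def track_step(user_id: str, step: str, started_at: float, error: str | None = None) -> None:
    """Emit one event per onboarding step so the dashboard can chart error rates and TTC."""
    mp.track(user_id, "onboarding_step_completed", {
        "step": step,                                  # e.g. "phone_verification", "idv"
        "duration_seconds": round(time.time() - started_at, 1),
        "error": error,                                # None on the happy path
    })

# Example: record that IDV took ~42 seconds and failed with a document error.
started = time.time() - 42
track_step("user-123", "idv", started, error="document_unreadable")
```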