Client
Zen Educate
Timeframe
Sep 2022 - Jun 2023
Skills
Native App Design, Service Design, Visual Design, User Research, B2C
How might we guide new users through the onboarding process?
I've learned that the success of hosting a great dinner party relies on three crucial factors: clear instructions leading up to the event, a comfortable and engaging atmosphere, and a warm welcome to make your guests feel at home.
The onboarding process of an app is like greeting your guests at the door and guiding them through your home. Done poorly, it can leave guests feeling disheartened and unsatisfied.
New users at Zen Educate were not sticking around for long, so we launched a 6-month project to revamp the onboarding process. Our goal: increase the percentage of newly registered teachers completing the full vetting process within 30 days of registration from 15% to 20%.
Early Exploration
To kick off the project, I conducted a workshop with our product manager, tech lead, and operations lead. Together, we defined the scope, identified risky assumptions, and aligned our expectations for the discovery phase of the project.
We identified three key assumptions that we needed to validate by the end of the discovery phase:
1. Users lose motivation going through onboarding:
Zen Educate marketing materials promise teachers higher rates and jobs tailored to their preferences. However, teachers weren't able to see jobs until they'd been fully vetted. We expected this to be a key factor in why user retention during onboarding was so low.
2. The onboarding process is too long:
Only 8% of newly signed-up teachers uploaded their necessary documents within the first month of joining Zen Educate. We suspected this low conversion rate was because users perceived the onboarding process as far too long and dropped off as a result.
3. The onboarding process is too manual:
At Zen Educate, the operations team is the backbone of the company. They provide one-on-one support to teachers and schools across 600 locations in the UK and the US. However, we estimated that 80% of the operations team's time was spent managing onboarding. To make this process more efficient and cost-effective, we assumed we would need to move toward a more automated onboarding process.
User Research
To verify these assumptions, I conducted two rounds of user interviews: The first round involved participants currently going through the onboarding process. The second round focused on individuals who had completed onboarding and had worked their first day with Zen Educate in the past 2 weeks.
These interviews aimed to understand users' expectations, needs, and challenges during onboarding. I also wanted to gain insight into how Zen Educate's onboarding process compared to its competitors'.
By the end of these first two rounds, key patterns started emerging:
All participants expressed feeling overwhelmed by the amount of manual effort required to complete the vetting process.
Participants mentioned feeling unsupported and confused when using the digital platform.
All users stated their biggest pain point was being unable to see available open roles upon sign-up.
All users stated their biggest motivation for completing vetting was discussing available opportunities with a member of the operations team.
Our Hypotheses Moving Forward
After the initial research, I held a session with the broader product team to present interview findings, identify patterns, and generate design recommendations based on the challenges. We then organised these ideas into relevant categories and formulated three main hypotheses to guide our design efforts.
1. Enhanced Job Postings: Displaying relevant job opportunities for unvetted users will help them understand what makes Zen Educate amazing. This would increase their motivation to get fully vetted and start working!
2. Work-Driven Onboarding: When a user finds a job that they love on the platform, reminding them to complete their vetting process will help them see the value of finishing it.
3. Alignment with User Expectations: Making sure that the onboarding process aligns with user expectations reduces confusion and keeps users happy.
Time to Design
Service designs
Although my role focused on the digital user experience, it was clear that creating the best onboarding experience was going to mean revisiting the entire end-to-end user journey. So I began mapping out a few different models, clearly defining errors, risks and unknowns associated with each one to help stakeholders make a more informed decision. Following multiple rounds of user research and workshops with operations, we opted for a digital-driven onboarding model with operational support. This hybrid approach mitigated the risks associated with a fully digital onboarding model while lightening some of the load on the operations team.

Enhanced job postings
During this project, our team adopted an iterative design process, testing prototypes of our latest thinking with users at every stage to develop the product in collaboration with them and ensure it was tailored to their needs and preferences.
First, I tackled the redesign of the job cards. Initial feedback from user interviews was that participants found the original job cards uninviting and felt the design did not effectively highlight the key motivators they considered when applying for a role. Across multiple design iterations, we refined the job cards to enhance their appeal and functionality, incorporating user feedback at each stage. The final design featured improved visual hierarchy and strategic use of spacing and layout to draw attention to critical elements such as salary, location, and working hours, the factors most significant to our users.

Next, we focused on enhancing the job cards by expanding them into individual pages that offered more detailed information about each role. During interviews, participants highlighted that they'd be more likely to apply for a job if they could read through feedback from other teachers and learn more about the school's ethos directly from the app.
Additionally, by isolating each job to its own page, the product team would be able to more effectively track user navigation paths and gather precise data on which keywords and information drew the most attention. This would be incredibly valuable for ensuring future iterations were driven by user insights and clear data.

Work-Driven Vetting
Off the back of foundational research, we hypothesised that if candidates saw a clear link between the tasks they were asked to complete and the jobs they desired, they would be more likely to engage deeply and quickly with the process. We coined this hypothesis with the snappy name "work-driven vetting".
To explore this idea, we set up a structured experiment using a prototype on the Maze platform. Our main objectives were straightforward: evaluate how easily candidates could navigate to the work tab, monitor the time it took them to complete vetting tasks, and measure completion rates when these tasks were explicitly linked to job postings. The prototype we developed was designed to mimic the real job application experience, integrating vetting steps into the workflow. This setup was vital for isolating specific variables and focusing on the interactions critical to our hypothesis.

Three rounds of testing and 600 participants later, we saw some fantastic results. We saw a 43% increase in user interactions with job postings, a clear sign of improved engagement. Even more encouraging was the reduction in time-on-task for vetting steps, which was cut in half, significantly smoothing the process for candidates. Overall, 61% of participants reported they would complete all vetting in one go, reflecting the efficiency and clarity of the new design.
However, our journey wasn't without its hurdles. The main challenge lay in the limitations of prototype testing, which, while offering valuable insights, doesn't fully capture the nuances of real-world behaviour. We accepted this limitation, understanding that the insights gained still provided valuable directional data for further iterations.
In Conclusion
After rolling out the MVP of our designs, we monitored performance for several weeks. We saw a 38% uplift in candidates interacting with our more detailed job descriptions. Additionally, within the first few weeks after launch, the share of candidates self-uploading their DBS rose from 8% to 13%, smashing our KPI.
Just like at a dinner party, guests staying longer is not in itself a sign that they enjoyed themselves. While the onboarding process has been restructured and is showing great improvement in user retention, the next step should be to ensure that users are enjoying the experience, finding the process delightful, and feeling confident completing it themselves. Only then can we fully transition to a digital-led model.