Week Ten

I’m building a product called Client Tree. It’s an app that helps freelancers find clients by word of mouth.

I’ve been tracking user retention to guide product development. Retention measures how many people come back and use an app after they first sign up.
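To make that concrete, here is a minimal sketch of how week-one retention could be computed. The data shapes (`signups`, `activity`) and the 7–13 day window are my assumptions, not details from the post:

```python
from datetime import date, timedelta

def week_one_retention(signups, activity):
    """Fraction of users active 7-13 days after signing up.

    signups:  {user_id: signup_date}
    activity: {user_id: set of dates the user opened the app}
    """
    retained = 0
    for user, signed_up in signups.items():
        # A user counts as retained if they were active at any point
        # during the second week after signup.
        window = {signed_up + timedelta(days=d) for d in range(7, 14)}
        if window & activity.get(user, set()):
            retained += 1
    return retained / len(signups) if signups else 0.0

# Made-up example data: five signups, one comes back in week one.
signups = {
    "ana": date(2019, 12, 1),
    "ben": date(2019, 12, 1),
    "caz": date(2019, 12, 2),
    "dee": date(2019, 12, 2),
    "eli": date(2019, 12, 3),
}
activity = {
    "ana": {date(2019, 12, 9)},   # active in week one -> retained
    "ben": {date(2019, 12, 2)},   # only came back the next day
}
print(week_one_retention(signups, activity))  # 0.2, i.e. 20%
```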

[Screenshot: weekly retention curve]

At the moment, 20% of the people who sign up are still using the app a week after they create an account. Those who do come back use the product for about six weeks and then stop using it altogether.

To try to fix this, about three weeks ago I implemented a fancy email onboarding feature and something called a hustle meter. Neither feature seems to have had any impact on retention.

I am not happy with how I am measuring the impact of the features I build. I can’t say exactly what effect the email onboarding feature or the hustle meter had.

What I would like is to compare retention across groups of users with different features enabled. Something a bit more like…

[Screenshot: retention curves split by feature group]

Where each colour represents a group of users with a different set of features turned on. That way I can see which features actually move retention without having to piss everyone off with every feature.

To do this I’m going to need to implement some kind of feature manager. There will no longer be a single version of the app: everyone who signs up will see a slightly different version based on which features they have access to. I want some kind of dashboard to manage all of this, because I can see it getting complicated fast. I looked at the different options and decided to go with split.io, mainly because it integrates with segment.io and has a free tier. I’m fairly sure I could roll my own feature manager, but that’s unnecessary work right now.
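For a sense of what the flag check looks like in app code, here is a hand-rolled sketch of the core idea; this is not split.io’s actual API, and the flag names and percentages are invented for illustration:

```python
import hashlib

# Hypothetical flag config: feature name -> rollout percentage (0-100).
FLAGS = {"new_dashboard": 20, "hustle_meter": 0}

def is_enabled(feature, user_id, flags=FLAGS):
    """Deterministically bucket a user into a feature's rollout.

    Hashing feature + user_id gives the same answer every time for a
    given user, so their version of the app is stable across sessions.
    """
    pct = flags.get(feature, 0)
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # 0-99, uniform-ish over users
    return bucket < pct
```

A hosted service like split.io layers a dashboard, targeting rules, and analytics integration on top of essentially this kind of deterministic bucketing, which is why rolling your own quickly stops being worth it.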

The next step is to come up with a few hypotheses for why people stop using the product. I got in touch with users who had stopped and asked them why. My takeaway is that the getting-started steps don’t conceptually connect to the dashboard you see when you sign up.

To fix this I am going to change the default dashboard so that it connects more obviously to the getting-started steps. Crucially, I am not going to roll this change out to everyone. I’ll put it behind a feature flag and release it only to the next 20 people who sign up, to see what impact it has on the onboarding process.
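The rollout-then-compare plan could be sketched like this. The cohort names, the 20-user cutoff, and all the retention numbers below are illustrative, not real data from the app:

```python
def assign_cohort(signup_order, cohort_size=20):
    """Put the next `cohort_size` signups into the test group.

    signup_order: 0-based position of this user in the signup
    sequence, counted from the moment the experiment starts.
    """
    return "new_dashboard" if signup_order < cohort_size else "control"

def retention_by_cohort(users):
    """users: list of (cohort, retained_week_one: bool) pairs."""
    totals, retained = {}, {}
    for cohort, kept in users:
        totals[cohort] = totals.get(cohort, 0) + 1
        retained[cohort] = retained.get(cohort, 0) + kept
    return {c: retained[c] / totals[c] for c in totals}

# Made-up outcome: 7/20 test users retained vs 4/20 controls.
users = (
    [("new_dashboard", True)] * 7 + [("new_dashboard", False)] * 13
    + [("control", True)] * 4 + [("control", False)] * 16
)
print(retention_by_cohort(users))
# {'new_dashboard': 0.35, 'control': 0.2}
```

With cohorts this small the comparison is only directional, but it’s enough to decide whether to widen the rollout or move on to the next hypothesis.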

Based on the results I can either roll it out to more people or test a different hypothesis.

