From Zero to One Hundred: Growth Experiments for Startups

Zain Raza
Published in Dev Genius
Nov 9, 2020 · 9 min read

Image from the Carbon0 Games opening video. Image source: https://youtu.be/jsZ-aOp5SwM

When It’s Time to Launch

“Can you explain what’s going on?”
This is the one text message you dread receiving as a software engineer. It came from my manager, the CTO of Carbon0 Games, at 7:56 PM, and our MVP was set to launch the next morning. Time to start debugging…

Setting aside my last-minute scramble to push bug fixes in time, the Carbon0 MVP has been one of my finest software products to date. I led the development as both a full-stack engineer and product manager, in coordination with three other developers (Cao Mai, Henry Calderon, and Aleia Knight). It took us 10 weeks to build, with over 300,000 lines of code (including the code needed to integrate Google Analytics and Mixpanel). So it should be successful, right?

Right?

Vanity vs. Real Metrics

Wrong!

It’s time we talked about one of the hardest-fought lessons I’ve learned as a product developer: data lies. It has been less than two weeks since we launched Carbon0, and gaining traction has already proven at least ten times as challenging as I expected it to be.

My ideas, beliefs, and hopes about the product all contained unseen biases, and part of why I’m writing this blog post is to explain why we made such poor assumptions and initially focused on the wrong things: vanity metrics.

What’s a Vanity Metric, you may be asking?

Why, I just gave you two examples!

“It took us 10 weeks to build, with over 300,000 lines of code.”

Do you notice how the two numbers in the statement above don’t necessarily have any connection to the product being successful? They just sound like really impressive numbers to throw out there about the project.

Simply put, a vanity metric is just that. In the world of startups, however, it may be a little harder to distinguish these deceptive numbers from what we do want to focus on: real metrics, i.e. numbers that actually indicate your product is gaining traction.

How to Find Your Key Metric?

How will you possibly sift through all your data? Photo by Charles Deluvio on Unsplash

There are a lot of numbers to track in product development — and to be honest, I was vehemently opposed to the idea of learning them all in some unwieldy, cumbersome statistics course.

What helped me see which metrics our product team needed to focus on was realizing that, at the end of the day, a product needs to generate only one of two things for the business to grow:

  1. REVENUE — the amount of money you earn from charging users
  2. USAGE — the number of users, and how much time they spend on the app

Carbon0 Games is currently a free app for everyone, so our team used this framework to focus on numbers related to usage. For the rest of this blog, I will focus on the following two in particular, and on how we used growth experiments to increase them:

  1. number of times the app was shared on Facebook
  2. number of carbon missions Carbon0 users had completed

With that being said, time for some growth experiments!

But What is a Growth Experiment?

Having defined my key metrics, and with a product out in the world, I knew I needed a way to track growth that would be as objective as possible. That’s where the growth experiment comes in.

As the name suggests, a growth experiment is just that: a way for you to test different avenues where you may find new users. I had multiple ideas for how to reach customers in the beginning, and I wasn’t sure how to prioritize our team’s marketing efforts. I found this approach helped me find which channel worked best, so we could double down on it!

How to Set Up a Growth Experiment

There are essentially four variables in every growth experiment, and all of them must be defined beforehand:

  1. Audience — who are the users you’re targeting?
  2. Channel — where will you go/what platform will you use to acquire these new users?
  3. Message — what will you show these users that will interest them?
  4. Metric — what will you measure, in order to gauge the success of the experiment?
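
To make this concrete, here is a minimal Python sketch of how you might pin down all four variables before an experiment runs. The class and field names are my own invention, not from the actual Carbon0 codebase:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GrowthExperiment:
    """All four variables of a growth experiment, defined up front."""
    audience: str        # who you're targeting
    channels: List[str]  # where you'll reach them
    message: str         # what you'll show them
    metric: str          # the single number that decides success

# The Facebook-shares experiment described below, expressed in this structure:
share_experiment = GrowthExperiment(
    audience="high school and college students (ages 18-26) passionate about climate justice",
    channels=["Twitter", "Facebook", "Hacker News", "Product Hunt", "personal DMs/SMS"],
    message="Stuck at home? No time to make an impact? We made an easier way "
            "for you and your friends to make an impact.",
    metric="number of times the app was shared on Facebook",
)
```

Writing all four variables down like this before promoting anything keeps you honest: if you can’t fill in the metric field, the experiment isn’t ready to run.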

We’ve already defined the metrics I wanted to use for my growth experiments. I ran a separate growth experiment for each of the two metrics, so now let’s go over those!

Increasing Facebook Shares — Growth Experiment 1

The first experiment I want to share with you was all about increasing the virality of our product.

Experiment Setup

  • The Audience we decided to target: high school and college students (ages 18–26) who were passionate about climate justice.
  • The Channels we utilized: Twitter, Facebook, Hacker News, Product Hunt, and a handful of personal DMs and SMS messages.
  • The Message: for personal connections of mine, I decided to just reach out casually and ask them to try out the app. For the rest, we went with something along the lines of “Stuck at home? No time to make an impact? We made an easier way for you and your friends to make an impact,” to speak to the presumed desire of young people to make a difference.
  • The Metric, as stated before, was the number of times the app was shared on Facebook.

The Result

The experiment was set to run for a week. Our engineering team had built a Facebook Share button right into the site, so it would be as easy as possible for users to share the app with their friends. For a week I diligently promoted our product on the channels noted above, and I believed we were all set.

I was even somewhat optimistic — I checked Google Analytics during this time, and from the graph it was clear we were acquiring traffic:

Another example of a vanity metric: within a week of launching, Carbon0 increased its number of visitors by 1,400%!

At the time, this graph was exciting to me. We had gone from zero to over a hundred users! On top of that, Carbon0 had increased its number of visitors by 1,400%! What joy!
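
Part of why that 1,400% felt so impressive is simple arithmetic: percentage growth explodes when the starting base is tiny. A quick sketch, using hypothetical visitor counts chosen only to illustrate the math (these are not our real analytics numbers):

```python
def percent_growth(before: int, after: int) -> float:
    """Percent change between two counts; misleading when `before` is tiny."""
    if before == 0:
        return float("inf")  # any growth from zero looks infinite
    return (after - before) / before * 100

# Hypothetical: 8 visitors growing to 120 is modest in absolute terms...
print(percent_growth(8, 120))  # 1400.0 -- "we grew 1,400%!"
# ...which is why a raw percentage, with no absolute numbers attached,
# is a classic vanity metric.
```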

But as you may guess, this blog post is all about the mistakes I have made in growth engineering. In hindsight, the graph above is ultimately another example of a vanity metric. In contrast, here’s the graph that awaited me in Mixpanel, when I checked it on the day after the experiment ended, to see how many shares we had actually earned:

At the end of the experiment, people had only tried to share our app on Facebook a total of 3 times from the site :(

Key Takeaway

Seeing the graph from Mixpanel, I finally realized it’s probably more important for us to focus on product development before virality. The best explanation for why people weren’t sharing was most probably that they didn’t like the product enough themselves in the first place.

This experiment helped me as a developer see how growth experiments can actually inform the direction of engineering. So with that in mind, how about we take a look at another growth experiment we did at Carbon0?

“If your product requires advertising or salespeople to sell it, it’s not good enough: technology is primarily about product development, not distribution.” — Peter Thiel, Zero to One (2014)

Increasing Player Usage — Growth Experiment 2

Our next experiment was intended to discover how we could increase how much players were using the core feature of the site: being able to go on “missions” to reduce their carbon footprint.

Experiment Setup

  • The Audience we decided to target: the same as before. In addition, for this experiment we also decided to go after the parents of young people, particularly children in their pre-teen years. One of the cofounders wanted to use the app to teach her own children about climate change, so we had a hunch there might be more parents who would want to do the same.
  • The Channels we utilized: same as in Experiment 1.
  • The Message: the message was the same as before for the young people. For the parent demographic, we decided to use the following copy: “Wish you could educate your kids about climate change? We’d love to show you our game!”
  • The Metric, as stated before, was the number of times one of the carbon missions was completed.

The Result

This time around, the Mixpanel graph was a lot more interesting:

The number of missions users completed went up and down for the most part — nothing consistent.

Although the graph shows definite activity, I concluded it doesn’t tell us much for two reasons:

  1. The number of missions users completed went up and down for the most part — nothing consistent.
  2. We admittedly made a mistake in data collection, because we forgot to remove Mixpanel tracking from the developers who were testing out the app on their local machines. Therefore, some of the events on the graph above are not from actual users, but from people within Carbon0 (which obviously makes the graph look better than it actually is).
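
For what it’s worth, the contamination in point 2 is easy to guard against at analysis time by filtering exported events against a list of internal tester IDs. Here is a hedged sketch: the dict shape loosely mirrors an analytics export where each event carries a `distinct_id`, but the ID values and field names are illustrative, not from our real data:

```python
# IDs used by developers testing on their local machines (illustrative values).
INTERNAL_IDS = {"dev-laptop-1", "dev-laptop-2", "dev-laptop-3"}

def external_events(events, internal_ids=INTERNAL_IDS):
    """Drop events fired by internal testers, keeping only real users."""
    return [e for e in events if e.get("distinct_id") not in internal_ids]

events = [
    {"distinct_id": "dev-laptop-1", "event": "Mission Completed"},
    {"distinct_id": "user-4821", "event": "Mission Completed"},
    {"distinct_id": "user-9003", "event": "Mission Completed"},
]
print(len(external_events(events)))  # 2 real-user events
```

The cleaner fix is to never send the events in the first place (e.g. disable tracking when the app runs in a development environment), but filtering after the fact at least salvages data you have already collected.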

Key Takeaway

At first glance, this second experiment didn’t quantitatively give us much good news either. However, there is a silver lining to the story, in terms of the qualitative feedback we got.

Doing Things That Don’t Scale
This time around, I made the effort to be much more personalized in the messages I sent out to folks. No more mindless blasting on social media — for example, I reached out to parents I knew in my hometown, to the librarian of the high school I went to, and to people I knew from high school who were now majoring in Environmental Science at different colleges.

While it took more time, people were much more receptive to me, and I was at least able to glean feedback on the UI/UX of the site most of the time. Our team will now go back and iterate on the product, making adjustments so we can gain better traction in the future.

Double Down on What Works
The second key takeaway has to do with traffic — Facebook was our most receptive channel by far, and the cofounders at Carbon0 have picked up on that. Our growth experiments helped reinforce their decision to double down their marketing efforts on the Carbon0 Facebook page. The CTO has even started a social media challenge, in which people can post images of themselves playing the game each day, much like this:

Image source: https://www.facebook.com/MyCarbon0/photos/a.105309610979807/209707053873395/

Conclusion

To give you the background, I initially came into this project as a wide-eyed, inexperienced Program Manager. At the time I joined, there was no one in the Carbon0 organization who even knew what Mixpanel was, let alone had set up KPIs and was tracking weekly goals to make sure that traction was improving. I quickly had to go from being the hesitant, unassuming PM to the one leading our team’s efforts to become a more data-driven, more evidence-based engineering organization.

Taking stock of the mistakes I’ve made at Carbon0, I’ve asked myself many times whether it’d be right to quit. I frequently ask, “Is it really worth pushing forward?”

Everyone may have their own opinion on this situation. I personally take wisdom from the Greek proverb, “A society grows great when old men plant trees whose shade they know they shall never sit in.” While we are nowhere close to where we want to be, I’m grateful to have at least gotten real insights into our product. It’s progress. Startup traction has become just a wee bit less of an art, and more of a science to me — and that’s something I also find comfort in.


Most of all, I know we need to keep our heads focused on the things we can actually control — product development and talking to users. Climate change is not going away anytime soon, so neither will Carbon0!
