The term ‘growth hacking’ implies speed – it’s one of those phrases that sounds like it’s in motion.
So we hack away, like Beethoven writing a masterpiece in the attic.
Rushing to understand which of the 19 growth channels will work for your product. To understand why your visitors aren’t converting. To reach product-market fit.
Sean Ellis’ concept of high-tempo testing suggests that the number of growth experiments a team runs each week is a key indicator of success.
More = better.
But under the hood, what he’s actually advocating is control over your pace – suggesting you launch three tests per week.
The growth marketer is more like a drummer than the frantic composer – keeping a consistent pace of experimentation going.
That steady cadence produces an equally steady stream of ‘growth fuel’: data.
Fueling the growth engine
Data is the running water that keeps our tests flowing – because if you don’t learn anything from the test you just ran, how can you decide what to test next (besides guessing)?
So your ability as a growth marketer to deliver clean, tap-reliable data is mission-critical.
Without it, you throw a wrench in the engine of growth.
Every time you have to ask a developer to pull data for you, you throw a wrench.
Every time you can’t explain test results to your team, you throw a wrench.
Just like a competent drummer can play the bass drum, bongos or timpani without missing a beat, a competent growth marketer can analyze data to draw conclusions, no matter the data source.
Assembling your stick bag
The further away from you this data is kept, the more difficult it’ll be to run tests.
Adam Savage (one half of the Mythbusters team) refers to his workshop organization principle as ‘first-order accessibility’, meaning everything he needs, for any foreseeable current job, is immediately accessible to him, rather than buried under some other tool, or kept in storage, or under the front seat of his car.
A drummer’s stick bag contains all of the sticks, sleighbells, or small cowbells that they’ll need for a drum session.
As growth marketers, we can follow the same principle.
If we keep the core metrics that we’ll need to analyze a test front-and-center, we’ll spend much less time fumbling around for data, which gives us more time to run tests, learn, and repeat.
A virtuous cycle: the best kind.
When to deploy your sticks
Data teardown: tracking a growth experiment’s KPIs
You’ll always want to have a current snapshot of your KPIs no more than one click away.
That way, when you want to run a test, all you need to do is pick which KPI you’re attempting to move, instead of puzzling over how you’ll pull the data.
Let’s walk through a simple test example end-to-end:
- Objective: Increase an app’s activation KPI, by introducing a drip email campaign.
- Hypothesis: Educating users about how to make use of the product will help them activate at higher rates.
- Experiment Design: Implement a 4-part drip email series in Mailchimp, and measure its impact on our activation KPI – which is queried from our app’s Postgres database.
- Results: For the couple of weeks after implementation, track the daily change in your activation KPI as users begin to receive drip messages.
- Analysis + Learnings: Did it move the needle? Since you have activation KPI data close at hand, you can devote time to digging deeper into open/click rates for each email and to coming up with ideas for what to test next.
By picking a KPI and having easy access to it up front, you’ll be able to focus strictly on the test itself (writing and designing an engaging drip campaign), rather than on reporting and communicating progress to your team.
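To make the “queried from our app’s Postgres database” step concrete, here’s a minimal sketch of a daily activation-rate query. The `users` table and its columns are assumptions for illustration, and sqlite3 stands in for Postgres so the sketch runs anywhere – against your real database you’d use a Postgres client (or a Blockspring query) with the same SQL shape:

```python
import sqlite3

# Hypothetical schema -- your app's tables will differ. sqlite3 is used
# here purely so the sketch is self-contained; the SQL pattern is the same
# against Postgres.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, signup_date TEXT, activated INTEGER)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?, ?)",
    [(1, "2016-05-01", 1), (2, "2016-05-01", 0),
     (3, "2016-05-02", 1), (4, "2016-05-02", 1)],
)

# Daily activation rate for signups since the drip campaign launched.
rows = conn.execute("""
    SELECT signup_date,
           ROUND(100.0 * SUM(activated) / COUNT(*), 1) AS activation_pct
    FROM users
    WHERE signup_date >= '2016-05-01'   -- hypothetical campaign launch date
    GROUP BY signup_date
    ORDER BY signup_date
""").fetchall()

for day, pct in rows:
    print(day, pct)
```

One query like this, kept one click away in your sheet, is the whole “current snapshot of your KPIs” idea in miniature.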
Data teardown: tracking ad-hoc or campaign KPIs
Here’s where it gets interesting. When you’re working in a team, growth experiments often happen without anyone specifically planning them.
A designer will rework your blog post layout, a community manager will pin a tweet, or a press outlet will feature you without notice.
Then the questions start rolling in:
- How are signups from that press mention activating so far?
- How has time on page changed since we reformatted our blog pages?
- What’s our conversion rate on referral traffic from Twitter since we pinned that tweet?
A good growth marketer will be able to quickly pull data together, given her knowledge of basic statistics, SQL, spreadsheet data manipulation (especially in Blockspring), and the tools that her team uses.
Let’s walk through that first example above:
How are signups from that press mention activating so far?
We immediately hit a data wall: my activation KPIs are tracked out of my app’s Postgres database, but my per-user referral metrics live in Mixpanel. How can I tie the two together?
Thankfully, in 2016, there’s always a way to pull your own data – my weapon of choice is Blockspring.
So to answer that question, we’d:
- Run a Blockspring SQL query to pull per-user activation metrics for signups since the press mention.
- Run Mixpanel’s Python script to export ‘people’ records (each of which contains a referral source).
- Tie the two together into a table, using your choice of vlookup, index-match, or query spreadsheet formulas.
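The three steps above can be sketched in plain Python – the field names (`email`, `activated`, `referral_source`) are assumptions for illustration, not the actual columns a Blockspring query or Mixpanel’s people export would return:

```python
# Hypothetical rows -- in practice, activation rows come from a Blockspring
# SQL query against your app's database, and referral rows come from
# Mixpanel's 'people' export script. Field names here are made up.
activation = [
    {"email": "a@x.com", "activated": True},
    {"email": "b@x.com", "activated": False},
    {"email": "c@x.com", "activated": True},
]
referrals = [
    {"email": "a@x.com", "referral_source": "techcrunch.com"},
    {"email": "b@x.com", "referral_source": "twitter.com"},
    {"email": "c@x.com", "referral_source": "techcrunch.com"},
]

# The programmatic equivalent of a vlookup: index one table by its key,
# then look each row of the other table up against it.
source_by_email = {r["email"]: r["referral_source"] for r in referrals}

press_signups = [u for u in activation
                 if source_by_email.get(u["email"]) == "techcrunch.com"]
rate = sum(u["activated"] for u in press_signups) / len(press_signups)
print(f"{rate:.0%} of press-mention signups have activated")
```

Whether you do this join in a spreadsheet with vlookup or in a script, the shape is identical: one shared key, two data sources, one combined table.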
Once that question has been asked, and the drumsticks to measure the results have been acquired, you have a new set of sticks stowed in your bag.
I love getting asked these questions, since answering them increases the speed with which we can run future tests.
How your team hears you
You’re not much of a drummer if you’re just playing for yourself – your job is to anchor the rhythm of the entire band!
That means you have to get data on your experiments + KPIs out to your team, on a regular basis.
I’ve found the best way to do that is by sharing (over Slack or email) a weekly message, containing key KPIs + experiment progress.
This lets the steady stream of data you’ve created wash over your entire team, which is key to consistently generating new ideas (in my experience, developers & designers who aren’t too close to the distribution trenches have the most original marketing ideas).
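A weekly message like that can be assembled in a few lines – the KPI names, numbers, and experiments below are purely illustrative, and the resulting string is what you’d paste into Slack or email (or post via a Slack incoming webhook):

```python
# Hypothetical weekly snapshot -- names and numbers are made up.
kpis = {"Signups (wk)": 412, "Activation rate": "31%", "MRR": "$8,540"}
experiments = [
    ("Drip email series", "running", "activation"),
    ("Pinned tweet", "done", "referral conversion"),
]

lines = ["*Weekly growth update*", "", "*KPIs:*"]
lines += [f"• {name}: {value}" for name, value in kpis.items()]
lines += ["", "*Experiments:*"]
lines += [f"• {name} ({status}) → target KPI: {kpi}"
          for name, status, kpi in experiments]

message = "\n".join(lines)
print(message)
```

The point isn’t the formatting – it’s that the message is generated from the same data your stick bag already holds, so sending it costs you nothing each week.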
So…what’s in your bag?
Through much iteration, I’ve built a stick bag that keeps all of my data first-order accessible, and keeps me from wasting any time pulling numbers.
It’s a simple Google Spreadsheet (modeled on Kevin Kaye’s growth engine framework) that helps me answer a couple of key questions:
- What tests are currently running, for which target KPIs? How is that KPI tracking before and since the experiment started?
- How have KPIs evolved over time? By snapping all metrics nightly, and tracking daily / weekly / monthly progress.
It takes some Blockspring setup to query your KPIs, and it’s certainly not the most beautiful KPI dashboard in the world, but it’s worked for me.
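The nightly-snapshot idea can be sketched as a tiny append-and-compare routine – in the real sheet the rows are filled by a scheduled Blockspring query, and the dates and values below are invented for illustration:

```python
from datetime import date

# Hypothetical snapshot history -- one row per nightly snap.
history = [
    {"day": date(2016, 5, 1), "activation_pct": 28.0},
    {"day": date(2016, 5, 8), "activation_pct": 31.5},
]

def snap(history, day, activation_pct):
    """Append tonight's KPI reading to the running history."""
    history.append({"day": day, "activation_pct": activation_pct})

snap(history, date(2016, 5, 15), 33.0)

# Progress over time falls out of the history for free:
# compare tonight's reading against the previous snapshot.
latest, previous = history[-1], history[-2]
delta = latest["activation_pct"] - previous["activation_pct"]
print(f"Activation: {latest['activation_pct']}% ({delta:+.1f} pts since last snap)")
```

Daily, weekly, and monthly views are then just different slices of the same history.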
Let’s build the ultimate drumstick kit
I want to publish a version of this template, but, since KPIs are so personal, I’d like to enlist your help to make it as customizable as possible out of the box (assuming you’re still reading, you must be a growth data nerd like me).
My template connects to only a couple data sources to pull KPIs:
- Postgres database (for platform activation, retention, and revenue KPIs)
- Google Analytics (for acquisition KPIs)
I’m curious to hear: what’s your favorite set of drumsticks for keeping track of growth progress? And when do you take those sticks out of your bag to monitor a KPI?
If possible, I’d love to weave those drumsticks into the KPI dashboard template, so that you could copy the Google Sheet and immediately make use of it.
Feel free to leave your ideas in the comments, or access the CIFL template vault below to take a peek into our bag of tricks.