
Accounting for Growth

You’d be crazy not to measure your active users at the appropriate frequency for your app. That may be daily, weekly, or monthly (DAU / WAU / MAU), or even yearly.

As you monitor these key metrics, you’ll want to understand how and why they’re changing. I am a big fan of how Jonathan Hsu breaks this down in his Accounting for Growth framework, separating out the components that add to or subtract from your top-level numbers. Understanding the dynamics that contribute to your growth (or lack thereof) is critical, and not just for the PM who owns the feature: the whole team needs that context to understand what drives success.

I’ve had conversations with PMs at many companies who know their top-level numbers but don’t have a good handle on why they’re going up or down. As Jonathan outlines in his post, two very different businesses can have the same MAU numbers, but one can be in far better shape than the other.
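To make the decomposition concrete, here’s a minimal Python sketch of growth accounting over made-up monthly active-user sets. The month labels and user IDs are invented, and the definitions follow the standard framing (read Jonathan’s post for his exact formulation):

```python
# Decompose the change in MAU into new, retained, resurrected, and churned
# users. The input is a set of active user IDs per month (made-up data).
active_by_month = {
    "2018-01": {"a", "b", "c"},
    "2018-02": {"a", "b", "d"},       # c went dormant, d signed up
    "2018-03": {"a", "c", "d", "e"},  # b went dormant, c came back, e signed up
}

ever_active = set()  # everyone active in any prior month
prev = set()         # last month's actives
for month in sorted(active_by_month):
    current = active_by_month[month]
    new = current - ever_active                   # first month ever active
    retained = current & prev                     # active last month and this month
    resurrected = (current & ever_active) - prev  # dormant before, back this month
    churned = prev - current                      # active last month, gone this month
    # The components always reconcile with the top-level number:
    # MAU(t) = MAU(t-1) + new + resurrected - churned
    print(f"{month}: MAU={len(current)} new={len(new)} retained={len(retained)} "
          f"resurrected={len(resurrected)} churned={len(churned)}")
    ever_active |= current
    prev = current
```

Two businesses with identical MAU curves can look completely different in this breakdown: one retains its base and adds a trickle of signups, the other churns through its base and masks it with heavy acquisition.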

There are a couple of scenarios I think are important to understand:

  • If your retention rates are poor and new signups are driving your growth:
    • When will your growth rate flatline? (See the projection sketch after this list.)
    • Could signups go down? What would happen if your growth flatlines or your user base shrinks?
  • If your active user numbers go up:
    • Is that because of a one-time increase in new signups? Do you expect that to continue?
    • Is it because of resurrecting users (making dormant users come back)? Is this a sign of a pattern you could invest in making stronger? Why do people stop using it in the first place?
  • If your active user numbers go down:
    • Which component decreased? Was the drop isolated to resurrected users, new signups, or retention of existing users? If so, was there a product issue that contributed to it?
  • Are you soliciting feedback from each of these buckets of users? There is gold to be mined by segmenting and surveying the users with the most valuable feedback.
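For the flatline question above, here’s a back-of-the-envelope projection sketch. It uses my own simplified model (constant signups, a flat monthly retention rate, no resurrection), under which MAU converges to signups / (1 - retention):

```python
# With s signups/month and monthly retention r, MAU(t) = r * MAU(t-1) + s,
# which converges to the steady state s / (1 - r). All numbers are made up.
def project_mau(mau, signups_per_month, monthly_retention, months):
    history = [mau]
    for _ in range(months):
        mau = monthly_retention * mau + signups_per_month
        history.append(mau)
    return history

signups, retention = 10_000, 0.80
print(f"steady-state MAU: {signups / (1 - retention):,.0f}")  # 50,000
for month, mau in enumerate(project_mau(20_000, signups, retention, 12)):
    print(f"month {month:2d}: {mau:9,.0f}")
```

With 80% monthly retention and 10,000 signups a month, growth flatlines at 50,000 MAU no matter where you start; improving retention raises that ceiling far more than a one-time signup bump does.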

In addition to knowing your high-level active user numbers, you should know why you’re growing. It’s helpful to understand what the future looks like if things stay the same or worsen. Even if you know, does the rest of your team know? When everyone understands how your product and business will be successful, they’ll have the context to more effectively make the critical decisions in building tech products.

If you’re looking to get started with this type of analysis, Jonathan Hsu’s 8-ball analysis or the lifecycle feature in Amplitude is extremely helpful.

Using analytics before any code is written

I was very lucky to have worked at HubSpot during a pivotal transition in its evolution. I managed a team of product analysts and data scientists, and we were charged with improving our product and the overall customer experience. We did this by analyzing what our customers did in our product.

This isn’t helpful when you are a brand new startup without a product. At that stage, you should be sitting with your customers or talking to them all the time. You should be taking advantage of your ability to do things that don’t scale.

Once you have more users than you can speak with, behavioral analytics are crucial to being successful. I often have conversations with people who aren’t sure how to apply data-informed methodologies when launching new features. They bring up a good point:

How are you supposed to leverage behavioral analytics when the feature doesn’t exist?

It’s a good question. At this stage of your lifecycle, your most important task is to speak with customers that may be a good fit for what you want to build. While you may not have people using this new feature yet, there are usually groups of people who could use it within your existing user base. You just need to find the right people to speak with.

These are the types of questions I ask:

  • Which of your existing users are most likely to use the new feature?
  • What characteristics do those users have?
  • What actions are they taking?
  • Do you know why they’re taking the actions they are and whether this new feature would be useful to them?

Take advantage of the existing users you have and their patterns of usage. Whether you’re building a new feature or iterating on an existing one, talk to your high-intensity users and your infrequent users: they’re a goldmine of information, and you’d be crazy to ignore their feedback when building something new.

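Here’s a minimal pandas sketch of how you might pick those two groups out of a raw event export; the file name and column names are hypothetical stand-ins for whatever your analytics system gives you:

```python
import pandas as pd

# Find the highest-intensity and most infrequent users over the last 30 days
# so you know who to invite to interviews. Hypothetical columns:
# user_id, event_name, timestamp.
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

cutoff = events["timestamp"].max() - pd.Timedelta(days=30)
last_30d = events[events["timestamp"] >= cutoff]

activity = last_30d.groupby("user_id").agg(
    events=("event_name", "size"),
    active_days=("timestamp", lambda ts: ts.dt.normalize().nunique()),
)

high_intensity = activity[activity["active_days"] >= 20]  # near-daily users
infrequent = activity[activity["active_days"] <= 2]       # barely touched it

print(f"{len(high_intensity)} high-intensity and {len(infrequent)} infrequent "
      f"users to reach out to")
```

The thresholds (20 days, 2 days) are arbitrary; the point is to leave the session with a named list of people to email, not a perfect segmentation.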

I typically push PMs / researchers / marketers to spend 30 minutes looking at their analytics system to pick out a group of people they want to speak with. I think that anyone involved in building new products is already overloaded with too many tasks, and this can easily feel like unnecessary work or an endless process that will take too much time. I think the key is to timebox this type of analysis and have it point you in the right direction.

I agree with this Intercom post that more than half of a PM’s time should be spent understanding customers’ problems, doing research, and thinking about how design can be applied to those problems. In order to spend that time most effectively, I think behavioral analytics systems are crucial to making sure you maximize that time and understand what your existing users are doing.

Snapchat’s Secrecy and DAU Metrics

I was pretty interested to read about Snapchat’s DAU numbers and their culture of secrecy. At first, I was shocked by the stories employees told about the lengths the company goes to in order to keep its information private.

It’s consistent with anecdotal stories I’ve heard about Snap (they’re serious about privacy and keeping their own information a secret), but I always try to take these stories with a grain of salt.

I immediately thought about how we try to do things differently at HubSpot. Two elements of the HubSpot culture code are using metrics and being transparent as an organization. I thought, “we’re totally different from that here at HubSpot,” and I bet many of you thought the same thing when reading the Snapchat article. While I think we strive to be different, we’re far from perfect and are constantly trying to improve. Here are some of the questions I asked myself (and ways I want to hold myself and our teams accountable):

  • Does everyone in the organization have access to data (behavioral analytics, data warehouse) that helps them make better decisions?
  • For those that aren’t technical, is it accessible with non-technical tools?
  • Just because they have access to the data, do they leverage it in their ideas, analysis, and proposals?
  • Do we have sufficient documentation about how to use the data that’s available to all employees?
  • Do we make an effort to train people on using the data so they are as self-sufficient as possible?
  • Do we create a culture of sharing and encouraging others to showcase their findings?
  • Do we enable others to reproduce analysis that has been done in the past?


While I like to think we’re better than the portrayal of Snapchat in the article, I’m not 100% satisfied with the answers to the questions above.

Choosing a behavioral analytics system: our journey to Amplitude

As part of my role at HubSpot, I run a team of analysts and data scientists that leverage quantitative analysis to inform our product development and improve the customer experience. It’s our goal to make teams self-sufficient in answering questions like: “How many people are using this feature?” or “What percentage of signups do X?” or “How sticky is this feature?”. In addition, we perform analysis and build models to identify and act upon areas of opportunity. One of the main tools we use on a daily basis is our behavioral analytics system, which helps us understand what our customers are doing inside our product.
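As a concrete example of the self-serve questions above, here’s a minimal sketch of a stickiness calculation: average DAU divided by MAU for a feature over one month. The file and column names are hypothetical:

```python
import pandas as pd

# Stickiness = average daily actives / monthly actives for one month.
# Hypothetical columns: user_id, timestamp.
events = pd.read_csv("feature_events.csv", parse_dates=["timestamp"])
events["date"] = events["timestamp"].dt.normalize()

march = events[events["date"].dt.strftime("%Y-%m") == "2018-03"]
dau = march.groupby("date")["user_id"].nunique()  # unique users per day
mau = march["user_id"].nunique()                  # unique users in the month
print(f"stickiness (avg DAU / MAU): {dau.mean() / mau:.1%}")
```

A feature its audience uses daily trends toward 100%; a monthly-report feature will sit much lower, and neither is inherently bad.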

I’ve become increasingly obsessed with behavioral analytics over the years. Here’s a brief timeline of my experience with them:

  • 2013:
    • Join HubSpot, start building a new product with Mixpanel
    • Blown away by the type of analysis it enables. Mind. Blown. It revolutionizes how I think about building products
  • 2014:
    • Grow the new product to hundreds of thousands of users, start to get nervous about our Mixpanel bill (this was a huge mistake in hindsight)
  • 2014 – 2015:
    • HubSpot decides to build its own internal behavioral analytics system
    • Rationale:
      • HubSpot is a public company at this stage, it’s a competitive advantage to have complete ownership and control over this system
      • If it costs as much as an engineer’s salary, why not pay someone to build a system customized for us?
      • We could solve our own problem, then turn the solution into a product that could be sold to customers
  • 2016:
    • Perform a vendor assessment of our internal tool vs. a vendor (for a variety of reasons, to be explained in a future post)
    • We choose to go with Amplitude as our new behavioral analytics system
  • 2017:
    • Finish our migration to Amplitude; we currently have 250-300 HubSpotters using Amplitude on a monthly basis

Why did we pick Amplitude? Some key reasons:

  • They allowed us to create charts that count by users or by other arbitrary identifiers. Since HubSpot is a B2B company, we want to track active companies, look at conversion rates for key actions across all users in a company, and measure company retention. Amplitude had the best solution: changing one option in an existing chart toggles between users and organizations. Other vendors could technically solve this, but I found their approaches too cumbersome. (See the sketch after this list.)
  • They had an option to store our data in a SQL database (at the time Redshift, now it’s Snowflake). The important piece is that it allows our business intelligence team to ingest the data at a regular interval so it could be combined with other data sources. We use Looker internally, and we want to take behavioral data and combine it with financial data, CRM data, support data, and any other data loaded into our data warehouse.
  • They were focused on product analytics. We felt that their roadmap aligned perfectly with our priorities and long-term goals.
  • We had a team of three engineers and some of a PM’s time devoted to our internal tool. Amplitude has a much bigger engineering team, and we didn’t think the customizations we would build were worth it. We felt the product team’s efforts were better spent generating value for the company, not building a tool that was (at best, and probably not even that) slightly better than Amplitude.
  • Their dashboard and behavioral cohort features were just what we wanted
  • It was fast. Our internal system had been plagued by slowness and outages (the team that built it had suffered turnover and was then understaffed)
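To illustrate the first point, here’s a minimal sketch (my own illustration of the idea, not Amplitude’s implementation) of answering the same week-over-week retention question at the user level and at the company level just by switching the grouping column; the file and column names are hypothetical:

```python
import pandas as pd

# Week-over-week retention, computed once per unit of analysis.
# Hypothetical columns: user_id, company_id, timestamp.
events = pd.read_csv("events.csv", parse_dates=["timestamp"])
events["week"] = events["timestamp"].dt.to_period("W")

def week_over_week_retention(df, unit):
    """Share of `unit`s active in one week that are active again the next week."""
    active = df.groupby("week")[unit].agg(set).sort_index()
    return {
        str(wk): len(active[prev] & active[wk]) / len(active[prev])
        for prev, wk in zip(active.index[:-1], active.index[1:])
    }

print("user retention:   ", week_over_week_retention(events, "user_id"))
print("company retention:", week_over_week_retention(events, "company_id"))
```

For a B2B product the two can diverge sharply: individual users inside an account may come and go while the account itself stays healthy, which is exactly why we wanted the toggle.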

No solution is a panacea and I won’t say that Amplitude is perfect in every way, but I have been personally very happy with the decision we made. I’m pretty bullish on all of the companies in this space (I think they’re all powerful and worth the money), and unless there’s a fundamental shift in the technology required for these kinds of systems, I don’t want to be involved in building another one from scratch.

Flying blind: not setting or measuring product metric goals

I love building new products. Ever since I built junky web apps as a geeky high schooler, I’ve gotten excited the first time something actually works; it has always felt like magic. Now that I’m older, I increasingly feel the pressure to show my impact, so after the initial euphoria passes, I immediately measure the metrics that represent success. Something that has been bothering me lately is that regardless of your methodology (waterfall, agile, scrum, burndown, Trello anarchy, etc.), I never hear others talk enough about product success metrics.

When I joined HubSpot, I learned about behavioral analytics from many others. Sadly, when I speak with friends in the industry, I constantly find myself pushing back on responses like:

  • “We forgot to add tracking”
  • “We want to ship it and see how it does”
  • “We don’t have any specific goals for this release other than to improve the design”
  • “What should we measure?”
  • “We can’t afford to use behavioral analytics, it’s too expensive”

This is how I want to react every time I hear one of those answers:

[GIF: a guy in a panda suit smashes a computer on someone’s desk]

Just kidding. I am always asking questions to understand the rationale so I can try to help add perspective.

These are the tough questions I want to ask in response:

  • What’s more expensive? A behavioral analytics system or shipping the wrong features / wasting the time of your product and engineering team?
  • If you hear feedback from a couple of customers, is that representative of all users?
  • How do you know that the users are actually doing what they say they’re doing?
  • Do you think you’ll get a team’s best work if the only goal is to release their work?
  • What do you think will garner more resources in the future? “We improved the experience, just look at it!” vs. “I increased conversion rates of signup to value by 10%, with an expected lift in revenue of Y”.

I don’t think you need to spend weeks off in a corner crunching numbers to come up with the answers to these questions. My suggestion is to spend 30 minutes thinking about a goal, why you’re working on something, and then a simple mechanism to measure success.

I push teams to answer these questions:

  1. What represents success for this release/feature?
  2. What is the current baseline?
  3. What is the hypothetical ceiling of improvement?
  4. Given the baseline and ceiling, how much do you think you can improve the metric?
  5. What will be the mechanism to track success/failure?
  6. When should you evaluate progress?

You don’t have to get super fancy and build Excel models, but at least spend 15-30 minutes thinking through the basics for a new feature. Whether you’re building something brand new or iterating on an old one, I always think it’s worth considering the above questions.
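Here’s a worked sketch of those six questions with made-up numbers, for a hypothetical signup-flow change:

```python
# 1. Success = more signups reaching activation.  2. Baseline is 20%.
baseline = 0.20
# 3. Even a perfect flow won't convert everyone; call the ceiling 45%.
ceiling = 0.45
# 4. We think we can close a quarter of the remaining gap.
expected_gap_closed = 0.25
target = baseline + expected_gap_closed * (ceiling - baseline)

# 5. Tracked via an activation event in our analytics system.
signups_per_week = 2_000
extra_activations = signups_per_week * (target - baseline)

# 6. Evaluate once a few weeks of signups have flowed through the new funnel.
print(f"target conversion: {target:.1%}")                               # 26.2%
print(f"expected extra activations per week: {extra_activations:.0f}")  # 125
```

Fifteen minutes of this kind of arithmetic gives you a number to be wrong about, which is far more useful than “we improved the experience.”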

As the saying goes, “if you can’t measure it, you can’t improve it”.
