Data has the power to demonstrate the impact and value of design, a topic of ever-growing interest in the design world. Tools like Amplitude and Optimizely enable designers to understand and measure their impact. As a designer who embraces growth, my team and I have learned a lot about applying a growth mindset to some of our recent initiatives.

Create goals and find focus

Design is far more impactful when you focus on a goal for growth. Knowing your growth metrics will drive business value and help you understand the impact of your experiments. Flying blind makes it difficult to pinpoint what’s happening in the ecosystem, and the problem only grows as the company scales.

Prioritize work that will drive your team towards its goal, and minimize distractions like bugs, tech debt, and other general cleanup cards. As Lex Roman says, “When you’re trying to move signups, retention, revenue, referrals, you have to laser focus on that thing. Now, as a designer, you may not be leading goal setting, but if your team is taking on five goals, you’re not going to hit them. Being able to help your team focus on the high impact stuff is critical to success.”

Define problems and investigate

Work with your product manager and data analyst to investigate problem areas and opportunities using analytics. Where are users dropping off in the journey? Why aren’t organizations becoming engaged? What are the signs of engaged and retained organizations? Early on, I found quantitative data reports difficult to interpret, so don’t be shy about asking for help.

Use Amplitude to understand drop-off rates between each step in the user journey, and if the data indicates the problem is big enough, prioritize finding a solution. Our team uses the Amplitude Extension to confirm event and property names and to create charts for analysis. The Funnel Analysis, Event Segmentation, and Pathfinder chart types are useful for showing user paths. Follow up this data investigation with qualitative research to better understand pain points and growth opportunities along those paths.
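The drop-off math behind a funnel chart is simple enough to sketch. The event names and counts below are invented for illustration, not real CircleCI data:

```python
# Hypothetical funnel counts, of the kind you might export from an
# analytics tool like Amplitude. Names and numbers are made up.
funnel = [
    ("signed-up", 1000),
    ("project-created", 620),
    ("config-added", 410),
    ("first-pipeline-run", 350),
]

def drop_off_rates(steps):
    """Return the percentage of users lost between consecutive funnel steps."""
    rates = []
    for (prev_name, prev_count), (name, count) in zip(steps, steps[1:]):
        lost = (prev_count - count) / prev_count * 100
        rates.append((f"{prev_name} -> {name}", round(lost, 1)))
    return rates

for transition, pct in drop_off_rates(funnel):
    print(f"{transition}: {pct}% drop-off")
```

A transition with an outsized drop-off percentage is a good candidate for the qualitative follow-up described above.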

Qualitative research is also helpful for finding anecdotes and investigating whether they are backed by data. These can form experiment ideas and hypotheses. Some of our qualitative research showed that setup and configuration on CircleCI were problematic. Users were hitting errors on their first build, such as placing their config file in the wrong location in the repo or naming the directory incorrectly. Our data analyst then confirmed that “no config” errors accounted for the majority of failed first Pipeline runs.
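For context, a “no config” error usually means the file isn’t at the path CircleCI expects. A minimal, illustrative `.circleci/config.yml` (not our production config) looks like this; the file must live in a `.circleci` directory at the repository root:

```yaml
# Minimal illustrative CircleCI config. If this file is missing or
# misplaced, the first pipeline fails with a config-related error.
version: 2.1
jobs:
  build:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      - run: echo "running the first build"
workflows:
  build-workflow:
    jobs:
      - build
```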

Understand which levers are impacting metrics and to what degree, so you can amplify impact and drive growth. Paolo Ertreo touches on this in his great presentation.

Credit to Paolo Ertreo for the slide.

Create an experiment board

Create a backlog or board of experiment ideas, with columns based on your growth focus areas such as activation, engagement, and retention. Cards are backed by data from qualitative and quantitative research, and every team member is encouraged to submit ideas. Providing as much background data as possible helps form a hypothesis. Our best experiment cards include:

  • Problem: What problem are you solving and how big of a problem is it?
  • Hypothesis: If you do this one thing, what do you expect to happen as a result?
  • Metrics of Success: How will you know if your experiment is successful? What outcomes are you trying to positively move?
  • Open Questions: What questions will help move this experiment forward?
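To keep cards consistent, the fields above can be captured in a lightweight structure. This dataclass is a hypothetical sketch, not tooling we actually use:

```python
from dataclasses import dataclass, field

@dataclass
class ExperimentCard:
    """Hypothetical card mirroring the fields listed above."""
    problem: str                 # what problem, and how big is it?
    hypothesis: str              # if we do X, we expect Y to happen
    success_metrics: list        # outcomes we want to move positively
    open_questions: list = field(default_factory=list)

card = ExperimentCard(
    problem="Many first builds fail with 'no config' errors",
    hypothesis="Letting users add a config file from CircleCI will "
               "raise first-build success",
    success_metrics=["first Pipeline success rate",
                     "config-related failure rate"],
    open_questions=["How do we handle branch protection?"],
)
```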

Limit the scope and design

Experiments vary in size, so limit the scope of each one. In one of our larger experiments, we enabled users to add a configuration file directly to their VCS from CircleCI. We limited the scope to avoid managing conflicts with existing branches and branch protection. We also applied these other tips:

  • Utilize existing components
  • Copy is important. Work with your copywriter.
  • Don’t design for every edge case (one error modal to rule them all!)
  • Polish later

Measure the impact

Make sure to name user events accurately in Amplitude, and avoid sending redundant events, which adds complexity. Look at user journeys to understand which steps people have and have not completed, and ask specific questions.
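One way to keep event names accurate is to lint them against a naming convention before they ship. The hyphenated object-action convention and the helper below are illustrative assumptions, not an Amplitude requirement or API:

```python
import re

# Assumed team convention: lowercase words joined by hyphens,
# e.g. "pipeline-triggered". Illustrative only.
EVENT_NAME = re.compile(r"^[a-z]+(-[a-z]+)+$")

def is_valid_event_name(name: str) -> bool:
    """Check an event name against the assumed naming convention."""
    return bool(EVENT_NAME.match(name))
```

A check like this can run in code review or CI, catching inconsistent names before they pollute your charts.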

We’ve been using Optimizely for experiments, which lets you define your primary goals and metrics. When an experiment reaches statistical significance, review metric performance as a team. Ask what the data shows and why users are making these choices. Then identify areas for improvement and talk through next actions, such as whether to implement the feature permanently or iterate on the experiment to improve discoverability.
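Optimizely computes significance for you, but the underlying idea can be sketched with a standard two-proportion z-test. The conversion counts below are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 120/1000 control conversions vs. 156/1000 variant.
z = two_proportion_z(120, 1000, 156, 1000)
# |z| > 1.96 corresponds to significance at the conventional 95% level.
significant = abs(z) > 1.96
```

This is why underpowered experiments stay inconclusive: with small samples, the standard error swamps the observed difference.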

Bonus: Stay aligned

Ask yourself if this change could apply to other areas of the product or indicate a helpful change to your design system. Document your learnings for use in future experiment strategies, for example, why component A is a better notification mechanism than component B.

The reward of embracing data

As my team has applied these approaches to our experiment process, we’ve seen positive results. When we enabled people to add a configuration file directly to their VCS from CircleCI, the results inspired this very blog post and delivered these highlights for new users and orgs:

  • Increased new feature adoption by 56%
  • Increased Pipeline activation by 43%
  • Increased the success rate of an org’s first Pipeline by 150%
  • Decreased config-related build failures by 47%

Embracing data lets you back up your design decisions and prove that design itself made an impact. We’d love your feedback on whether you and your team find these tips helpful! Reach out on Twitter and sign up to participate in future research sessions here.