On the Growth engineering team at CircleCI, we regularly run UI experiments that are large and small, complex and simple. Each one has revealed more about how and when our customers use CircleCI, and how we can make their experience better.

Our team has three values that we abide by in designing these experiments: incrementation, curiosity, and adding real value for our users.

Here’s a sampling of how those values have become experiments and what we’ve learned from each.

Incrementation: small UX annoyances make a big impact

When new Growth engineers join our team, we ask them to onboard by getting a project set up on CircleCI and writing down all their points of confusion or frustration.

One engineer noted that when he hovered over the tooltip explaining parallelism on the jobs page, it took several seconds to pop up.

Our team had built that original tooltip as part of a different test. Because we prioritize small, incremental improvements, we used the browser’s out-of-the-box native tooltip at the time (to get faster feedback) rather than building a proper tooltip component of our own. From that experiment, we discovered that giving users more information about parallelism leads more organizations to upgrade their level of parallelism.

If we built an even better tooltip, would we see higher use of parallelism?

We ran an A/B test replacing the browser’s native tooltip (which we were using as a hack) with a true, custom tooltip that pops up instantly.
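To make the difference concrete, here’s a minimal React/TypeScript sketch of the two variants (the copy, class names, and structure are placeholders, not our production component). The key difference is that the browser controls the native tooltip’s hover delay, while a custom component can render the moment the pointer arrives:

```tsx
import React, { useState } from "react";

// Control: the browser's built-in tooltip via the title attribute.
// The hover delay (often a second or more) is set by the browser/OS
// and can't be tuned from CSS or JavaScript.
const NativeTooltip = () => (
  <span title="Parallelism splits your job across multiple executors">
    Parallelism
  </span>
);

// Treatment: a hand-rolled tooltip that appears instantly on hover.
const InstantTooltip = () => {
  const [visible, setVisible] = useState(false);
  return (
    <span
      onMouseEnter={() => setVisible(true)}
      onMouseLeave={() => setVisible(false)}
    >
      Parallelism
      {visible && (
        <span role="tooltip" className="tooltip">
          Parallelism splits your job across multiple executors
        </span>
      )}
    </span>
  );
};
```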

Again, we observed a significant lift in parallelism use among organizations in the treatment group (with the new tooltip) compared to the control group: from 12.5% to 14.9%, with a p-value so small it rounds to 0.00 (about as statistically significant as a result gets!)
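For the curious, here’s a rough sketch of how a lift like that gets checked, using a standard two-proportion z-test. This isn’t our actual analysis tooling, and the sample sizes below are made up for illustration, since we haven’t shared the real counts:

```ts
// Two-proportion z-test: is the gap between two conversion rates real,
// or plausibly just noise? (Textbook formula; not our analysis pipeline.)
function twoProportionZ(
  conversionsA: number,
  totalA: number,
  conversionsB: number,
  totalB: number,
): number {
  const pA = conversionsA / totalA;
  const pB = conversionsB / totalB;
  // Pooled rate under the null hypothesis that both groups behave the same.
  const pooled = (conversionsA + conversionsB) / (totalA + totalB);
  const stderr = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pA - pB) / stderr;
}

// Hypothetical counts: 12.5% vs 14.9% across 5,000 organizations per arm.
const z = twoProportionZ(625, 5000, 745, 5000);
console.log(z.toFixed(2)); // ≈ -3.49; |z| > 1.96 means p < 0.05, here p < 0.001
```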

It was such a small UX adjustment that we originally debated whether to run the test at all. We didn’t think it would change anything, but it did show a serious lift.

By incrementing, we got to learn more about how our users interact with the jobs page. We also learned a broader truth: small UX improvements make a big difference.

Curiosity: learn what users are trying to do and help them do it faster

Over time, we’ve found that some attempts to help users improve their productivity have not worked well. For example, we ran what we thought was a very compelling educational campaign that showed users they could recover faster from failed pipelines by running smaller commits more often.

[Image: the educational campaign encouraging smaller, more frequent commits]

To our surprise, the treatment group that saw the campaign appeared to run fewer pipelines on average than those who had not seen it.

We were curious whether the results had something to do with the fact that we had tried to change the user’s existing workflow rather than accelerate it. If we ran a different experiment that stayed within the user’s existing flow, perhaps the results would improve.

We had learned that most users want to rerun pipelines; they just don’t want to click through multiple extra steps to do so. Our new A/B experiment surfaced the ‘rerun a pipeline’ button on the top-level dashboard, the page users interact with most frequently. This wasn’t new functionality, but we wanted to know whether putting the button in a more central location would prompt users to rerun pipelines more often.
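Mechanically, an experiment like this is little more than a feature flag plus an analytics event on the new button. Here’s a sketch of the shape of that change; the hook, event name, and rerun helper are illustrative stand-ins, not our real internals:

```tsx
import React from "react";

// Illustrative stand-ins; real flagging, analytics, and rerun plumbing
// would come from elsewhere in the app.
declare function useFeatureFlag(name: string): boolean;
declare function trackEvent(name: string, props: Record<string, string>): void;
declare function rerunPipeline(pipelineId: string): Promise<void>;

const DashboardPipelineRow = ({ pipeline }: { pipeline: { id: string } }) => {
  // Control keeps the row as-is; treatment gets a one-click rerun button.
  const showRerun = useFeatureFlag("dashboard-rerun-button");

  const onRerun = () => {
    trackEvent("pipeline-rerun-clicked", { source: "dashboard" });
    void rerunPipeline(pipeline.id); // same action as the existing flow
  };

  return (
    <div>
      {/* ...existing pipeline summary... */}
      {showRerun && <button onClick={onRerun}>Rerun pipeline</button>}
    </div>
  );
};
```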

Our instincts were right — the change resulted in a higher rate of rerunning pipelines.

We also discovered that by putting the rerun button in a central location, more organizations could see that rerunning was an option, and a statistically significant number of organizations started using this functionality for the first time.

The contrast between these two results shows that aligning our experiments to our users’ current workflow is more effective than trying to get them to do something new or different. This is a conclusion we’ve seen reinforced over and over with more experiments.

Add value: reduce ‘time to joy’

We learned from both data and UX research sessions that some new users were taking a long time to get set up. As a result, they weren’t taking full advantage of the engineering resources available to optimize their pipelines, and in some cases they left the platform entirely.

With input from our Developer Relations team, we decided to try adding a “Getting started” checklist to the right-side panel of the UI. The checklist included morale-boosting items like ‘run your first pipeline’ and ‘run your first green workflow,’ as well as a prompt to read more advanced documentation on configuring CircleCI.
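To give a sense of the feature’s shape, here’s a minimal sketch of how a checklist like this could be modeled (item IDs and labels are illustrative, not the production list):

```ts
// A sketch of the "Getting started" checklist model; IDs and labels are
// illustrative only.
interface ChecklistItem {
  id: string;
  label: string;
  done: boolean;
}

const gettingStarted: ChecklistItem[] = [
  { id: "first-pipeline", label: "Run your first pipeline", done: false },
  { id: "first-green-workflow", label: "Run your first green workflow", done: false },
  { id: "advanced-config-docs", label: "Read the advanced config docs", done: false },
];

// The side panel's progress indicator can be driven by completion rate.
const progress = (items: ChecklistItem[]): number =>
  items.filter((item) => item.done).length / items.length;
```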

The goal here was to reduce the time it takes for users to experience the full value of CircleCI, or to “reduce time to joy,” as we like to call it.

[Image: the “Getting started” checklist in the UI side panel]

This experiment resulted in a 27% lift in organizations reaching our threshold for being engaged within three weeks of joining the platform.

We learned that reducing time to joy has a real impact for our users. If we can speed up their onboarding and help them feel less discouraged along the way, they can get to work faster, which adds real value to them, their team, and their larger organization.

And that’s what it’s really about: we run these engagement experiments to understand how to make CircleCI work better for our users. As we continue to run tests, we continue to learn what our users need to make them more productive, successful, and happy.

Do you like the sound of these values? Are you interested in doing cool experiments like this? Our team is hiring! Check out the latest open role on the CircleCI Growth team.