

CircleCI provides useful information about the performance of your jobs and tests. It can tell you when your last run happened, your median job time, your queue time, and your success rate, along with some graphs to help you visualize it all.

Here’s what that looks like for all projects right out of the gate.

[Image: junit1]

How many times have you wondered: Which of my tests in a run have failed? Which one of my tests has failed the most? Which one of my failed tests runs the slowest?

CircleCI can help you with that as well. Let’s take a look at where to find the info!

The failed and slowest tests

Insights gives us a summary of not only which test suites have been failing, but also breaks the failures down by test name and even failure frequency. If you’ve got flaky tests or some heavy repeat offenders, they’ll be here on this list for you to examine and squash. Alongside the frequency data, Insights can also rank which tests take the longest when they fail. This data helps you easily spot tests that may be hitting timeouts.

So where can we find this data? If you’re on the Insights page with me, you’re already there. All we need to do is scroll down:

[Image: junit2]

You may be asking, “Where’s the data, Jayson?” There’s a little setup we need to do before we unlock the full potential of Insights. The steps needed depend on your setup, but for our example, it will only take two. Let’s check it out together.

Unlocking the power

The tests

For this article, I wrote up a demo project that runs some very basic “tests” in Cucumber and Ruby’s RSpec. Each tool has a set of the same three tests:

  • Always pass - Assert that true is true.
  • Frequently pass - Generate a random number and assert that it is less than a threshold, currently 0.95.
  • Less frequently pass - Same as the previous test, but with the threshold set to 0.85.

This gives us some intermittent test failures, providing variability in our test results. If you’re curious, here’s what the Cucumber step definitions look like:

Given('this step passes') {}

Given('this step often passes') do
  expect(Random.new.rand(1.0)).to be < 0.95
end

Given('this step less often passes') do
  expect(Random.new.rand(1.0)).to be < 0.85
end
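
And for completeness, here’s roughly what the RSpec side looks like. This is just a sketch reconstructed from the test names that appear in the JUnit XML later in this post, so the demo project’s actual spec files may differ slightly:

# spec/demo_2_spec.rb -- a sketch of the "frequently pass" spec; the describe,
# context, and example names mirror what shows up in the JUnit XML output below.
RSpec.describe 'demo 2' do
  context 'with tests for demonstration' do
    it 'sometimes passes' do
      # Fails roughly 5% of the time, mirroring the Cucumber step above.
      expect(Random.new.rand(1.0)).to be < 0.95
    end
  end
end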

JUnit

The first thing we need to do is get our testing tools to output their results in the JUnit XML format. What is JUnit XML? Well, JUnit is itself a unit testing framework for the Java programming language, and it outputs its results in the Apache Ant JUnit XML format. If you’re not a Java developer, that’s perfectly all right, too: it’s quite common for tools across languages to implement their own formatter that follows the JUnit XML schema. For our example, I’ll show how to do this for RSpec and Cucumber, but CircleCI provides a list of other common options that you can use, too.

Here’s an example .circleci/config.yml file that we’ll be modifying together to get better Insights results:

version: 2.1
executors:
  default:
    docker:
      - image: circleci/ruby:2.5
commands:
  install_gems:
    description: "install dependencies"
    steps:
      - run: bundle check --path=vendor/bundle || bundle install --path=vendor/bundle --jobs=4 --retry=3
  rspec:
    description: "Run RSpec"
    steps:
      - run:
          command: bundle exec rspec
          when: always
  cucumber:
    description: "Run cucumber"
    steps:
      - run: 
          command: bundle exec cucumber
          when: always
jobs:
  test:
    executor: default
    steps:
      - checkout
      - install_gems
      - rspec
      - cucumber
workflows:
  pr:
    jobs:
      - test

It’s a 2.1 config, but don’t let that scare you: what we’re doing here is available in lower config versions, too.

Right now, our commands for RSpec and Cucumber are quite simple: bundle exec rspec and bundle exec cucumber, respectively. With no extra options, both tools use their default formatters and print their results to stdout. That’s great for running locally, but not so great when you want to easily see what has failed when your workflows run on CircleCI. For our six total tests, reading the console output wouldn’t be too bad, but for larger test suites, it could be a difficult task.

Both of these tools allow for outputting in the JUnit format quite easily.

RSpec

  • Add the rspec_junit_formatter gem to your Gemfile (there’s a sketch of this after the list).
  • Append --format progress --format RspecJunitFormatter -o ~/test-results/rspec/rspec.xml to your RSpec command.
    Note: You don’t need the initial --format progress, but if you drop it, RSpec will only output to the rspec.xml file, not to stdout. The same goes for Cucumber’s -f pretty flag.
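
For reference, the Gemfile change is a one-liner. Here’s a minimal sketch, assuming your test gems live in a :test group (the group is just illustrative):

# Gemfile
group :test do
  gem 'rspec'
  gem 'rspec_junit_formatter' # adds the JUnit XML formatter used above
end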

Cucumber

  • Add -f pretty -f junit -o ~/test-results/cucumber/cucumber.xml to your Cucumber command (an optional profile-based alternative is sketched after this list).
    • While Cucumber will create an XML file for each of the feature files you run, we’ll still want to have cucumber.xml at the end of the path.
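
If you’d rather not type those flags on every run, Cucumber can also read them from a profile in cucumber.yml. This isn’t part of the demo project, just a sketch of that optional alternative:

# cucumber.yml -- select the "ci" profile with: bundle exec cucumber --profile ci
default: -f pretty
ci: -f pretty -f junit -o ~/test-results/cucumber/cucumber.xml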

Here’s our updated .circleci/config.yml:

version: 2.1
executors:
  default:
    docker:
      - image: circleci/ruby:2.5
commands:
  install_gems:
    description: "install dependencies"
    steps:
      - run: bundle check --path=vendor/bundle || bundle install --path=vendor/bundle --jobs=4 --retry=3
  rspec:
    description: "Run RSpec"
    steps:
      - run:
          command: bundle exec rspec --format progress --format RspecJunitFormatter -o ~/test-results/rspec/rspec.xml
          when: always
  cucumber:
    description: "Run cucumber"
    steps:
      - run: 
          command: bundle exec cucumber -f pretty -f junit -o ~/test-results/cucumber/cucumber.xml
          when: always
jobs:
  test:
    executor: default
    steps:
      - checkout
      - install_gems
      - rspec
      - cucumber
workflows:
  pr:
    jobs:
      - test

Here’s an example of the XML output for our RSpec tests showing that the first two tests passed and the third one failed:

<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="rspec" tests="3" skipped="0" failures="1" errors="0" time="0.017001" timestamp="2019-01-24T22:17:57-07:00" hostname="JS-PC">
<properties>
<property name="seed" value="29522"/>
</properties>
<testcase classname="spec.demo_1_spec" name="demo 1 with tests for demonstration always passes" file="./spec/demo_1_spec.rb" time="0.001000"></testcase>
<testcase classname="spec.demo_2_spec" name="demo 2 with tests for demonstration sometimes passes" file="./spec/demo_2_spec.rb" time="0.001000"></testcase>
<testcase classname="spec.demo_3_spec" name="demo 3 with tests for demonstration more often passes" file="./spec/demo_3_spec.rb" time="0.014001"><failure message="expected: &lt; 0.85
     got:   0.9199158781050316" type="RSpec::Expectations::ExpectationNotMetError">Failure/Error: expect(r.rand(1.0)).to be &lt; 0.85
  expected: &lt; 0.85
       got:   0.9199158781050316
./spec/demo_3_spec.rb:8:in `block (3 levels) in &lt;top (required)&gt;&apos;</failure></testcase>
</testsuite>

With those changes made, step 1 is complete! 🎉 On to step 2.

Where are your JUnit test files?

We’ve got our testing tools outputting in the correct format, and they’re saving that output to files in our project. The next step is to put those files to use by telling CircleCI where it can find them. This is done with the built-in store_test_results step.

The store_test_results step takes a path and is our way of saying to CircleCI, “Hey, here’s where my test data is.” We’ll want to make sure this path matches the one where our commands saved the files. The path can be either absolute or relative, but heads up: a relative path is resolved against your current working directory.

Here’s what adding the store_test_results step to our commands looks like in our config:

rspec:
    description: "Run RSpec"
    steps:
      - run:
          command: bundle exec rspec --format progress --format RspecJunitFormatter -o ~/test-results/rspec/rspec.xml
          when: always
      - store_test_results:
          path: ~/test-results/rspec/
  cucumber:
    description: "Run cucumber"
    steps:
      - run: 
          command: bundle exec cucumber -f pretty -f junit -o ~/test-results/cucumber/cucumber.xml
          when: always
      - store_test_results:
          path: ~/test-results/cucumber/
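
As a side note, the path doesn’t have to be absolute. Here’s a quick sketch of the relative-path variant for the RSpec command, assuming the formatter writes into a test-results directory inside the working directory instead of the home directory:

      - run:
          command: bundle exec rspec --format progress --format RspecJunitFormatter -o test-results/rspec/rspec.xml
          when: always
      - store_test_results:
          path: test-results/rspec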

Johnny, show them what they’ve won!

A brand new motor scoot- nope! Not a new scooter, but some useful data that can help clue us in about how things are going for our test suites. Let’s try scrolling down on our Insights page now:

Drum roll, please…

[Image: junit3]

Nice, we did it! Once it has the test result data, CircleCI automatically begins analyzing it and displaying it for us. As mentioned before, Insights can show us the most-failed tests, their suites, and their failure frequency, as well as similar data for our slowest failed tests. The names under Test Suite vary by tool, with Cucumber outputting the feature name and RSpec using a name that includes the file name of the spec file.

Here’s a steady image so you can see more:

[Image: junit4]

But wait, there’s more!

The Insights page isn’t the only area that benefits from storing test data. By default, the Test Summary tab within a job only shows a button linking to the documentation for store_test_results.

[Image: junit5]

With our test results stored, our summaries now have something to show, too. If a test fails in one of your jobs, data about those failures is bubbled up to the top in informational boxes that look like this:

[Image: junit6]

These boxes tell you which tool and which specific test failed, as well as some information about the failed assertion. If a particular tool has multiple failures, they’ll be grouped together in the same box for easier viewing. The blue See Most Failed Tests button will take you straight to the Insights page.

If all your tests pass, you’ll also be provided with some summary data, including notes about how many tests ran and for which tools.

[Image: junit7]

Note: To avoid seeing messages like, “Your job ran 6 tests in unknown,” you’ll want to make sure you create a separate directory for each testing tool you’re working with, like this:

test-results
├── cucumber
│   └── cucumber.xml
└── rspec
    └── rspec.xml

If you’re following the examples above, I’ve already set you up to do this, so you shouldn’t have any issues with unknowns showing up.

Two-step approach

  1. Output test results in JUnit XML format
  2. Tell CircleCI where those results are with the store_test_results step

These two simple steps have enabled CircleCI to provide us with a wealth of information and greater insight into the health and performance of our tests. Now you have the tools to start storing test results in your config.yml files from here on out, and I highly recommend you do.


Gemini Smith is a continuously learning software engineer with a passion for testing, communication, and improving the software development lifecycle. Primarily writing in Go, she also contributes to the software and testing communities via public speaking, advocacy, consulting, and mentorship.
