Collecting Test Metadata
CircleCI collects test metadata from XML files and uses it to provide insights into your job. This document describes how to configure CircleCI to output test metadata as XML for some common test runners and store reports with the store_test_results step.
To see test results as build artifacts, upload them using the store_artifacts step.
After you configure CircleCI to collect test metadata, the tests that fail most often appear in a list on the Insights details page in the application, helping you identify flaky tests and isolate recurring issues.

Enabling Formatters
Test metadata is not automatically collected in CircleCI 2.0 until you enable the JUnit formatters. For RSpec, Minitest, and Django, add the following configuration to enable the formatters:
- RSpec requires the following be added to your Gemfile:
gem 'rspec_junit_formatter'
- Minitest requires the following be added to your Gemfile:
gem 'minitest-ci'
- Django should be configured using the django-nose test runner.
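For Django, a hedged sketch of the corresponding config steps might look like the following. It assumes django-nose is configured as your test runner (TEST_RUNNER = 'django_nose.NoseTestSuiteRunner' in settings.py); the --with-xunit and --xunit-file flags come from nose's xunit plugin, and the test-results path is illustrative:

```yaml
steps:
  - run:
      name: Run tests
      command: |
        mkdir -p test-results
        python manage.py test --with-xunit --xunit-file=test-results/junit.xml
      when: always
  - store_test_results:
      path: test-results
  - store_artifacts:
      path: test-results
```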
Metadata Collection in Custom Test Steps
If you have a custom test step that produces JUnit XML output (most test runners support this in some form), write the XML files to a subdirectory and point the store_test_results step at it, for example:
- store_test_results:
    path: /tmp/test-results
Custom Test Runner Examples
This section provides the following test runner examples:
Cucumber
For custom Cucumber steps, you should generate a file using the JUnit formatter and write it to the ~/cucumber directory. The following is an example of the addition to your .circleci/config.yml file:
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/cucumber
        bundle exec cucumber --format junit --out ~/cucumber/junit.xml
      when: always
  - store_test_results:
      path: ~/cucumber
  - store_artifacts:
      path: ~/cucumber
The path: key specifies the directory where the files are stored; relative paths are resolved against the project's root directory. CircleCI collects and uploads the artifacts to S3 and makes them available in the Artifacts tab of the Job page in the application.
Alternatively, if you want to use Cucumber's JSON formatter, be sure to give the output file a name ending in .cucumber and write it to the ~/cucumber directory. For example:
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/cucumber
        bundle exec cucumber --format pretty --format json --out ~/cucumber/tests.cucumber
      when: always
  - store_test_results:
      path: ~/cucumber
  - store_artifacts:
      path: ~/cucumber
Maven Surefire Plugin for Java JUnit Results
If you are building a Maven-based project, you are more than likely using the Maven Surefire plugin to generate test reports in XML format. CircleCI makes it easy to collect these reports. Add the following to the .circleci/config.yml file in your project.
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/junit/
        find . -type f -regex ".*/target/surefire-reports/.*xml" -exec cp {} ~/junit/ \;
      when: always
  - store_test_results:
      path: ~/junit
  - store_artifacts:
      path: ~/junit
Gradle JUnit Test Results
If you are building a Java or Groovy based project with Gradle,
test reports are automatically generated in XML format. CircleCI makes it easy to collect these
reports. Add the following to the .circleci/config.yml file in your
project.
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/junit/
        find . -type f -regex ".*/build/test-results/.*xml" -exec cp {} ~/junit/ \;
      when: always
  - store_test_results:
      path: ~/junit
  - store_artifacts:
      path: ~/junit
Mocha for Node.js
To output JUnit test results with the Mocha test runner, you can use mocha-junit-reporter.
A working .circleci/config.yml section for testing might look like this:
steps:
  - checkout
  - run: npm install
  - run: mkdir -p ./junit
  - run:
      command: mocha test --reporter mocha-junit-reporter
      environment:
        # MOCHA_FILE is resolved relative to the working directory,
        # so keep it consistent with the mkdir and store paths above/below
        MOCHA_FILE: ./junit/test-results.xml
      when: always
  - store_test_results:
      path: ./junit
  - store_artifacts:
      path: ./junit
AVA for Node.js
To output JUnit tests with the AVA test runner, you can use the TAP reporter with tap-xunit.
A working .circleci/config.yml section for testing might look like the following example:
steps:
  - run:
      command: |
        yarn add ava tap-xunit --dev # or you could use npm
        mkdir -p ~/reports
        ava --tap | tap-xunit > ~/reports/ava.xml
      when: always
  - store_test_results:
      path: ~/reports
  - store_artifacts:
      path: ~/reports
ESLint
To output JUnit results from ESLint, you can use the JUnit formatter.
A working .circleci/config.yml test section might look like this:
steps:
  - run:
      command: |
        mkdir -p ~/reports
        eslint ./src/ --format junit --output-file ~/reports/eslint.xml
      when: always
  - store_test_results:
      path: ~/reports
  - store_artifacts:
      path: ~/reports
PHPUnit
For PHPUnit tests, you should generate a file using the --log-junit command line option and write it to the ~/phpunit directory. Your .circleci/config.yml might be:
steps:
  - run:
      command: |
        mkdir -p ~/phpunit
        phpunit --log-junit ~/phpunit/junit.xml tests
      when: always
  - store_test_results:
      path: ~/phpunit
  - store_artifacts:
      path: ~/phpunit
pytest
To add test metadata to a project that uses pytest, tell it to output JUnit XML with the --junitxml option, and then save the test metadata:
- run:
    name: run tests
    command: |
      . venv/bin/activate
      mkdir test-reports
      pytest --junitxml=test-reports/junit.xml
- store_test_results:
    path: test-reports
- store_artifacts:
    path: test-reports
RSpec
To add test metadata collection to a project that uses a custom RSpec build step, add the following gem to your Gemfile:
gem 'rspec_junit_formatter'
And modify your test command to this:
steps:
  - checkout
  - run: bundle check --path=vendor/bundle || bundle install --path=vendor/bundle --jobs=4 --retry=3
  - run: mkdir ~/rspec
  - run:
      command: bundle exec rspec --format progress --format RspecJunitFormatter -o ~/rspec/rspec.xml
      when: always
  - store_test_results:
      path: ~/rspec
Minitest
To add test metadata collection to a project that uses a custom minitest build step, add the following gem to your Gemfile:
gem 'minitest-ci'
And modify your test command to this:
steps:
  - checkout
  - run: bundle check || bundle install
  - run:
      command: bundle exec rake test
      when: always
  - store_test_results:
      path: test/reports
See the minitest-ci README for more info.
test2junit for Clojure Tests
Use test2junit to convert Clojure test output to XML format. For more details, refer to the sample project.
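As a hedged sketch of what those config steps might look like (the test2junit/xml output directory below is an assumption — the actual destination is controlled by the :test2junit-output-dir setting in your project.clj):

```yaml
steps:
  - run:
      name: Run tests
      command: lein test2junit
      when: always
  - store_test_results:
      path: test2junit/xml
  - store_artifacts:
      path: test2junit/xml
```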
Karma
To output JUnit tests with the Karma test runner you can use karma-junit-reporter.
A working .circleci/config.yml section might look like this:
steps:
  - checkout
  - run: npm install
  - run: mkdir -p ./junit
  - run:
      command: karma start ./karma.conf.js
      environment:
        JUNIT_REPORT_PATH: ./junit/
        JUNIT_REPORT_NAME: test-results.xml
      when: always
  - store_test_results:
      path: ./junit
  - store_artifacts:
      path: ./junit
// karma.conf.js
module.exports = function (config) {
  config.set({
    // additional config...
    reporters: ['junit'],
    junitReporter: {
      outputDir: process.env.JUNIT_REPORT_PATH,
      outputFile: process.env.JUNIT_REPORT_NAME,
      useBrowserName: false
    },
    // additional config...
  });
};
Jest
To collect Jest data, first create a Jest config file called jest.config.js with the following:
// jest.config.js
module.exports = {
  reporters: ["default", "jest-junit"],
};
In your .circleci/config.yml, add the following run steps:
steps:
  - run:
      name: Install JUnit coverage reporter
      command: yarn add --dev jest-junit
  - run:
      name: Run tests with JUnit as reporter
      command: jest --ci --reporters=default --reporters=jest-junit
      environment:
        JEST_JUNIT_OUTPUT: "reports/junit/js-test-results.xml"
For a full walkthrough, refer to this article by Viget: Using JUnit on CircleCI 2.0 with Jest and ESLint.
Note: When running Jest tests, please use the --runInBand flag. Without this flag, Jest will try to allocate the CPU resources of the entire virtual machine in which your job is running. Using --runInBand will force Jest to use only the virtualized build environment within the virtual machine.
For more details on --runInBand, refer to the Jest CLI documentation. For more information on these issues, see Issue 1524 and Issue 5239 of the official Jest repository.
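Combining the note above with the earlier run steps, a hedged sketch might look like the following — the reports/junit path mirrors the JEST_JUNIT_OUTPUT value shown earlier, and the store steps are an assumption following the pattern used elsewhere in this document:

```yaml
steps:
  - run:
      name: Run tests with JUnit as reporter
      command: jest --ci --runInBand --reporters=default --reporters=jest-junit
      environment:
        JEST_JUNIT_OUTPUT: "reports/junit/js-test-results.xml"
  - store_test_results:
      path: reports/junit
  - store_artifacts:
      path: reports/junit
```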
Merging Test Suites Together
If you have multiple JUnit test reports from running more than one test suite or runner, you can merge them using the third-party Node.js CLI tool junit-merge.
This tool combines the reports into a single file that the test summary system can parse, giving you correct test totals.
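As a hedged sketch (the --dir and --out flags follow junit-merge's README, and the directory names are illustrative), a merge step might look like this:

```yaml
steps:
  - run:
      name: Merge JUnit reports
      command: |
        mkdir -p ~/merged
        npx junit-merge --dir ~/junit --out ~/merged/results.xml
  - store_test_results:
      path: ~/merged
```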
API
To access test metadata for a run from the API, refer to the test-metadata API documentation.
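As a minimal sketch of such a request — the project slug and build number below are placeholders you must replace, and the endpoint shape follows the v1.1 test-metadata API:

```shell
# Placeholders for your own project and job
PROJECT="github/my-org/my-repo"
BUILD_NUM=123
# The v1.1 test-metadata endpoint for a single job
URL="https://circleci.com/api/v1.1/project/${PROJECT}/${BUILD_NUM}/tests"
echo "$URL"
# Request it with a personal API token, e.g.:
# curl -H "Circle-Token: $CIRCLE_TOKEN" "$URL"
```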