Collecting Test Metadata
CircleCI collects test metadata from XML files and uses it to provide insights into your job. This document describes how to configure CircleCI to output test metadata as XML for some common test runners and store reports with the `store_test_results` step.
To see test results as artifacts, upload them using the `store_artifacts` step.
The usage of the `store_test_results` key in your config looks like the following:

```yaml
- store_test_results:
    path: test-results
```
Here the `path` key is an absolute path, or a path relative to your `working_directory`, containing subdirectories of JUnit XML or Cucumber JSON test metadata files. Make sure that your `path` value is not a hidden folder (for example, `.my_hidden_directory` would be an invalid format).
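For reference, a JUnit XML report is a plain XML file. The snippet below is illustrative (the file path, suite, and test names are invented, not taken from any specific runner), but it shows the general shape that formatters produce and that the `store_test_results` step parses:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- e.g. test-results/junit/results.xml (illustrative path) -->
<testsuite name="my.test.Suite" tests="2" failures="1" errors="0" time="0.042">
  <testcase classname="my.test.Suite" name="passing_test" time="0.012"/>
  <testcase classname="my.test.Suite" name="failing_test" time="0.030">
    <failure message="expected 1 but got 2">stack trace here</failure>
  </testcase>
</testsuite>
```

Any formatter that writes files in this shape into a subdirectory of your `path` should be collected.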
If you are using CircleCI Server, then after configuring CircleCI to collect your test metadata, the tests that fail most often appear in a list on the Insights page of the CircleCI application, where you can identify flaky tests and isolate recurring issues.
The above screenshot applies to CircleCI Server only.
If you are using CircleCI Cloud, see the API v2 Insights endpoints to find test failure information.
Enabling formatters
Test metadata is not automatically collected in CircleCI 2.0 until you enable the JUnit formatters. For RSpec, Minitest, and Django, add the following configuration to enable the formatters:
- RSpec requires the following to be added to your Gemfile:

  ```ruby
  gem 'rspec_junit_formatter'
  ```

- Minitest requires the following to be added to your Gemfile:

  ```ruby
  gem 'minitest-ci'
  ```

- Django should be configured using the django-nose test runner.
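As a sketch of the Django case (assuming django-nose and nose are installed; the output path below is an assumption, not a requirement), the relevant `settings.py` additions might look like:

```python
# settings.py (illustrative) - route Django tests through django-nose,
# and have nose's xunit plugin write a JUnit-style XML report.
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'

NOSE_ARGS = [
    '--with-xunit',                        # enable nose's xunit (JUnit XML) output
    '--xunit-file=test-results/nose.xml',  # write where store_test_results can find it
]
```

You would then point the `store_test_results` step at the `test-results` directory.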
Note: For detailed information on how to test your iOS applications, refer to the Testing iOS Applications on macOS page.
Metadata collection in custom test steps
If you have a custom test step that produces JUnit XML output (as most test runners support in some form), write the XML files to a subdirectory and store them, for example:

```yaml
- store_test_results:
    path: /tmp/test-results
```
Custom test runner examples
This section provides the following test runner examples:
- Cucumber
- Maven Surefire
- Gradle
- Mocha
- Ava
- ESLint
- PHPUnit
- pytest
- RSpec
- Minitest
- test2junit
- trx2junit
- Karma
- Jest
Cucumber
For custom Cucumber steps, you should generate a file using the JUnit formatter and write it to the `cucumber` directory. Following is an example of the addition to your `.circleci/config.yml` file:
```yaml
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/cucumber
        bundle exec cucumber --format junit --out ~/cucumber/junit.xml
      when: always
  - store_test_results:
      path: ~/cucumber
  - store_artifacts:
      path: ~/cucumber
```
The `path:` value is a directory where the files are stored, relative to the project's root directory. CircleCI collects and uploads the artifacts to S3 and makes them available in the Artifacts tab of the Job page in the application.
Alternatively, if you want to use Cucumber's JSON formatter, be sure to name the output file with a `.cucumber` extension and write it to the `/cucumber` directory. For example:
```yaml
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/cucumber
        bundle exec cucumber --format pretty --format json --out ~/cucumber/tests.cucumber
      when: always
  - store_test_results:
      path: ~/cucumber
  - store_artifacts:
      path: ~/cucumber
```
Maven Surefire Plugin for Java JUnit Results
If you are building a Maven-based project, you are more than likely using the Maven Surefire plugin to generate test reports in XML format. CircleCI makes it easy to collect these reports. Add the following to the `.circleci/config.yml` file in your project.
```yaml
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/test-results/junit/
        find . -type f -regex ".*/target/surefire-reports/.*xml" -exec cp {} ~/test-results/junit/ \;
      when: always
  - store_test_results:
      path: ~/test-results
  - store_artifacts:
      path: ~/test-results/junit
```
Gradle JUnit Test Results
If you are building a Java or Groovy based project with Gradle, test reports are automatically generated in XML format. CircleCI makes it easy to collect these reports. Add the following to the `.circleci/config.yml` file in your project.
```yaml
steps:
  - run:
      name: Save test results
      command: |
        mkdir -p ~/test-results/junit/
        find . -type f -regex ".*/build/test-results/.*xml" -exec cp {} ~/test-results/junit/ \;
      when: always
  - store_test_results:
      path: ~/test-results
  - store_artifacts:
      path: ~/test-results/junit
```
Mocha for Node.js
To output JUnit tests with the Mocha test runner, you can use mocha-junit-reporter.
A working `.circleci/config.yml` section for testing might look like this:
```yaml
steps:
  - checkout
  - run: npm install
  - run: mkdir ~/junit
  - run:
      command: mocha test --reporter mocha-junit-reporter
      environment:
        MOCHA_FILE: ~/junit/test-results.xml
      when: always
  - store_test_results:
      path: ~/junit
  - store_artifacts:
      path: ~/junit
```
Mocha with nyc
Following is a complete example for Mocha with nyc, contributed by marcospgp.
```yaml
version: 2
jobs:
  build:
    environment:
      CC_TEST_REPORTER_ID: code_climate_id_here
      NODE_ENV: development
    docker:
      - image: circleci/node:8
        auth:
          username: mydockerhub-user
          password: $DOCKERHUB_PASSWORD # context / project UI env-var reference
        environment:
          MONGODB_URI: mongodb://admin:password@localhost:27017/db?authSource=admin
      - image: mongo:4.0
        auth:
          username: mydockerhub-user
          password: $DOCKERHUB_PASSWORD # context / project UI env-var reference
        environment:
          MONGO_INITDB_ROOT_USERNAME: admin
          MONGO_INITDB_ROOT_PASSWORD: password
    working_directory: ~/repo
    steps:
      - checkout
      # Update npm
      - run:
          name: update-npm
          command: 'sudo npm install -g npm@latest'
      # Download and cache dependencies
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "package-lock.json" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-
      - run: npm install
      - run: npm install mocha-junit-reporter # just for CircleCI
      - save_cache:
          paths:
            - node_modules
          key: v1-dependencies-{{ checksum "package-lock.json" }}
      - run: mkdir reports
      # Run mocha
      - run:
          name: npm test
          command: ./node_modules/.bin/nyc ./node_modules/.bin/mocha --recursive --timeout=10000 --exit --reporter mocha-junit-reporter --reporter-options mochaFile=reports/mocha/test-results.xml
          when: always
      # Run eslint
      - run:
          name: eslint
          command: |
            ./node_modules/.bin/eslint ./ --format junit --output-file ./reports/eslint/eslint.xml
          when: always
      # Run coverage report for Code Climate
      - run:
          name: Setup Code Climate test-reporter
          command: |
            # download test reporter as a static binary
            curl -L https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64 > ./cc-test-reporter
            chmod +x ./cc-test-reporter
            ./cc-test-reporter before-build
          when: always
      - run:
          name: code-coverage
          command: |
            mkdir coverage
            # nyc report requires that nyc has already been run,
            # which creates the .nyc_output folder containing necessary data
            ./node_modules/.bin/nyc report --reporter=text-lcov > coverage/lcov.info
            ./cc-test-reporter after-build -t lcov
          when: always
      # Upload results
      - store_test_results:
          path: reports
      - store_artifacts:
          path: ./reports/mocha/test-results.xml
      - store_artifacts:
          path: ./reports/eslint/eslint.xml
      - store_artifacts: # upload test coverage as artifact
          path: ./coverage/lcov.info
          destination: tests
```
Ava for Node.js
To output JUnit tests with the Ava test runner, you can use the TAP reporter with tap-xunit.
A working `.circleci/config.yml` section for testing might look like the following example:
```yaml
steps:
  - run:
      command: |
        yarn add ava tap-xunit --dev # or you could use npm
        mkdir -p ~/reports
        ava --tap | tap-xunit > ~/reports/ava.xml
      when: always
  - store_test_results:
      path: ~/reports
  - store_artifacts:
      path: ~/reports
```
ESLint
To output JUnit results from ESLint, you can use the JUnit formatter.
A working `.circleci/config.yml` test section might look like this:
```yaml
steps:
  - run:
      command: |
        mkdir -p ~/reports
        eslint ./src/ --format junit --output-file ~/reports/eslint.xml
      when: always
  - store_test_results:
      path: ~/reports
  - store_artifacts:
      path: ~/reports
```
PHPUnit
For PHPUnit tests, you should generate a file using the `--log-junit` command line option and write it to the `/phpunit` directory. Your `.circleci/config.yml` might be:
```yaml
steps:
  - run:
      command: |
        mkdir -p ~/phpunit
        phpunit --log-junit ~/phpunit/junit.xml tests
      when: always
  - store_test_results:
      path: ~/phpunit
  - store_artifacts:
      path: ~/phpunit
```
pytest
To add test metadata to a project that uses `pytest`, you need to tell it to output JUnit XML, and then save the test metadata:
```yaml
- run:
    name: run tests
    command: |
      . venv/bin/activate
      mkdir test-results
      pytest --junitxml=test-results/junit.xml
- store_test_results:
    path: test-results
- store_artifacts:
    path: test-results
```
RSpec
To add test metadata collection to a project that uses a custom `rspec` build step, add the following gem to your Gemfile:

```ruby
gem 'rspec_junit_formatter'
```
And modify your test command to this:
```yaml
steps:
  - checkout
  - run: bundle check --path=vendor/bundle || bundle install --path=vendor/bundle --jobs=4 --retry=3
  - run: mkdir ~/rspec
  - run:
      command: bundle exec rspec --format progress --format RspecJunitFormatter -o ~/rspec/rspec.xml
      when: always
  - store_test_results:
      path: ~/rspec
```
Minitest
To add test metadata collection to a project that uses a custom `minitest` build step, add the following gem to your Gemfile:

```ruby
gem 'minitest-ci'
```
And modify your test command to this:
```yaml
steps:
  - checkout
  - run: bundle check || bundle install
  - run:
      command: bundle exec rake test
      when: always
  - store_test_results:
      path: test/reports
```
See the minitest-ci README for more info.
test2junit for Clojure Tests
Use test2junit to convert Clojure test output to XML format. For more details, refer to the sample project.
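As a sketch (the plugin version and output directory below are assumptions; check the test2junit README and the sample project for current values), a Leiningen setup might look like:

```clojure
;; project.clj (illustrative) - add the test2junit Leiningen plugin
;; so that running `lein test2junit` writes JUnit XML reports.
(defproject my-app "0.1.0-SNAPSHOT"
  :plugins [[test2junit "1.4.2"]]            ; assumed version
  :test2junit-output-dir "target/test2junit") ; assumed output location
```

You would then invoke `lein test2junit` in a `run` step and point `store_test_results` at the output directory.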
trx2junit for Visual Studio / .NET Core Tests
Use trx2junit to convert Visual Studio / .NET Core trx output to XML format.
A working `.circleci/config.yml` section might look like this:
```yaml
steps:
  - checkout
  - run: dotnet build
  - run: dotnet test --no-build --logger "trx"
  - run:
      name: test results
      when: always
      command: |
        dotnet tool install -g trx2junit
        export PATH="$PATH:/root/.dotnet/tools"
        trx2junit tests/**/TestResults/*.trx
  - store_test_results:
      path: tests/TestResults
  - store_artifacts:
      path: tests/TestResults
      destination: TestResults
```
Karma
To output JUnit tests with the Karma test runner, you can use karma-junit-reporter.
A working `.circleci/config.yml` section might look like this:
```yaml
steps:
  - checkout
  - run: npm install
  - run: mkdir ~/junit
  - run:
      command: karma start ./karma.conf.js
      environment:
        JUNIT_REPORT_PATH: ./junit/
        JUNIT_REPORT_NAME: test-results.xml
      when: always
  - store_test_results:
      path: ./junit
  - store_artifacts:
      path: ./junit
```
```javascript
// karma.conf.js

// additional config...
{
  reporters: ['junit'],
  junitReporter: {
    outputDir: process.env.JUNIT_REPORT_PATH,
    outputFile: process.env.JUNIT_REPORT_NAME,
    useBrowserName: false
  },
}
// additional config...
```
Jest
To output JUnit-compatible test data with Jest you can use jest-junit.
A working `.circleci/config.yml` section might look like this:
```yaml
steps:
  - run:
      name: Install JUnit coverage reporter
      command: yarn add --dev jest-junit
  - run:
      name: Run tests with JUnit as reporter
      command: jest --ci --runInBand --reporters=default --reporters=jest-junit
      environment:
        JEST_JUNIT_OUTPUT_DIR: ./reports/junit/
  - store_test_results:
      path: ./reports/junit/
  - store_artifacts:
      path: ./reports/junit
```
For a full walkthrough, refer to this article by Viget: Using JUnit on CircleCI 2.0 with Jest and ESLint. Note that usage of the Jest CLI argument `--testResultsProcessor` in the article has been superseded by the `--reporters` syntax, and `JEST_JUNIT_OUTPUT` has been replaced with `JEST_JUNIT_OUTPUT_DIR` and `JEST_JUNIT_OUTPUT_NAME`, as demonstrated above.
Note: When running Jest tests, use the `--runInBand` flag. Without this flag, Jest will try to allocate the CPU resources of the entire virtual machine in which your job is running. Using `--runInBand` will force Jest to use only the virtualized build environment within the virtual machine.
For more details on `--runInBand`, refer to the Jest CLI documentation. For more information on these issues, see Issue 1524 and Issue 5239 of the official Jest repository.
API
To access test metadata for a run from the API, refer to the test-metadata API documentation.