Re-run failed tests only (preview)
On This Page
- Motivation and introduction
- Prerequisites
- Quickstart
- Before: Example .circleci/config.yml file
- After: Example .circleci/config.yml file
- Verify the configuration
- Additional examples
- Configure a job running Ruby (rspec) tests
- Configure a job running Ruby (Cucumber) tests
- Configure a job running Cypress tests
- Configure a job running JavaScript/TypeScript (Jest) tests
- Configure a job running Playwright tests
- Known limitations
- FAQs
This feature is in preview; use it at your own risk. This feature is not guaranteed to move to general availability. For questions and/or issues, please comment on our Community forum.
Motivation and introduction
CircleCI is releasing new preview functionality to re-run failed tests only. When you select this option (see image below), only a subset of tests is re-run, instead of re-running the entire test suite when a transient test failure arises.
Historically, when your testing job in a workflow has flaky tests, the only option to get to a successful workflow was to re-run your workflow from failed. This type of re-run executes all tests from your testing job, including tests that passed, which prolongs time-to-feedback and consumes credits unnecessarily.
The re-run failed tests only option re-runs the failed tests against the same commit; it does not pick up new changes.

Prerequisites
- Your testing job in the workflow is configured to upload test results to CircleCI. The file or classname attributes must be present in the JUnit XML output.
- The testing job uses circleci tests run (see below for details) to execute tests. If your current job is using intelligent test splitting, you must change the circleci tests split command to circleci tests run combined with a --split-by= flag (see below for instructions).
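If you are unsure whether your results meet the attribute requirement, a quick check such as the following can help. This is a minimal sketch; it assumes your results file is test-results/junit.xml, so adjust the path to match your job.
  # Minimal sketch: confirm the generated JUnit XML contains file or classname attributes.
  # Assumes the results file is test-results/junit.xml; adjust the path for your project.
  grep -o 'file="[^"]*"' test-results/junit.xml | head -n 3
  grep -o 'classname="[^"]*"' test-results/junit.xml | head -n 3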
Quickstart
Before: Example .circleci/config.yml file
- run:
    name: Run tests
    command: |
      mkdir test-results
      TEST_FILES=$(circleci tests glob "**/test_*.py" | circleci tests split --split-by=timings)
      pytest -o junit_family=legacy --junitxml=test-results/junit.xml $TEST_FILES
- store_test_results:
    path: test-results
This example snippet is from a CircleCI configuration file that:
- Executes Python test files that end in .py,
- Splits tests by previous timing results (you can follow this tutorial on intelligent test splitting),
- Stores the test results in a new directory called test-results, and
- Uploads those test results to CircleCI.
Note: -o junit_family=legacy is present to ensure that the test results being generated contain the file attribute. Not included in this snippet is the key to set parallelism (read the Test splitting and parallelism page for more information).
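For reference, a minimal sketch of where the parallelism key sits in a job is shown below; the image tag and the value of parallelism are illustrative assumptions, not recommendations.
  jobs:
    test:
      docker:
        - image: cimg/python:3.11   # illustrative image
      parallelism: 4                # number of parallel containers/VMs to split tests across
      steps:
        - checkout
        # ...followed by the run and store_test_results steps shown above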
After: Example .circleci/config.yml file
In the snippet below, the example has been updated to use the circleci tests run command to allow for re-running only failed tests.
- run:
    name: Run tests
    command: |
      mkdir test-results
      TEST_FILES=$(circleci tests glob "**/test_*.py")
      # --split-by=timings is optional; only use it if you are using CircleCI's test splitting
      echo $TEST_FILES | circleci tests run --command="xargs pytest -o junit_family=legacy --junitxml=test-results/junit.xml" --verbose --split-by=timings
- store_test_results:
    path: test-results
- TEST_FILES=$(circleci tests glob "**/test_*.py")
  Use CircleCI’s glob command to put together a list of test files. In this case, we are looking for any test file that starts with test_ and ends with .py. Ensure that the glob string is enclosed in quotes.
- echo $TEST_FILES |
  Pass the list of test files to the circleci tests run command as standard input (stdin).
- circleci tests run --command="xargs pytest -o junit_family=legacy --junitxml=test-results/junit.xml" --verbose --split-by=timings
  - Invoke circleci tests run and specify the original command (pytest) used to run tests as part of the --command= parameter. This is required. xargs must be present as well. A conceptual sketch of how this works on a re-run follows this list.
  - --verbose is an optional parameter for circleci tests run which enables more verbose debugging messages.
  - Optional: --split-by=timings enables intelligent test splitting by timing for circleci tests run. Note that this is not required in order to use circleci tests run. If your testing job is not using CircleCI’s test splitting, omit this parameter.
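To build intuition for what circleci tests run does with the --command= parameter, the behavior on a re-run is roughly equivalent to the shell sketch below. This is a simplified illustration only; failed_tests.txt is a hypothetical stand-in for the failed-test data that CircleCI supplies, not a file the CLI actually creates.
  # Hypothetical illustration of the re-run filtering step:
  # 1. The full test list arrives on stdin.
  # 2. It is narrowed down to the files/classnames that previously failed.
  # 3. The filtered list is handed to the original command via xargs.
  echo $TEST_FILES \
    | tr ' ' '\n' \
    | grep -F -f failed_tests.txt \
    | xargs pytest -o junit_family=legacy --junitxml=test-results/junit.xml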
Verify the configuration
After updating your configuration, run the testing job again and make sure that the same number of tests is executed as before the config.yml change.
Then, the next time you encounter a test failure on that job, click the "Re-run failed tests only" button. If the --verbose setting is enabled, you should see output similar to the following in the re-run job on CircleCI:
Installing circleci-tests-plugin-cli plugin.
circleci-tests-plugin-cli plugin Installed. Version: 1.0.5976-439c1fc
DEBU[2023-05-18T22:09:08Z] Attempting to read from stdin. This will hang if no input is provided.
DEBU[2023-05-18T22:09:08Z] rerunning failed tests
INFO[2023-05-18T22:09:08Z] starting execution
DEBU[2023-05-18T22:09:08Z] received test names: ****
If you see rerunning failed tests present in the step’s output, the functionality is configured properly.
When the "Re-run failed tests only" button is clicked, the job should only re-run tests from a classname or file that had at least one test failure. If you are seeing different behavior, comment on this Discuss post for support.
Additional examples
Configure a job running Ruby (rspec) tests
- Add the following gem to your Gemfile:
  gem 'rspec_junit_formatter'
- Modify your test command to use circleci tests run:
  - run: mkdir ~/rspec
  - run:
      command: |
        circleci tests glob "spec/**/*_spec.rb" | circleci tests run --command="xargs bundle exec rspec --format progress --format RspecJunitFormatter -o ~/rspec/rspec.xml" --verbose --split-by=timings
- Update the glob command to match your use case. See the RSpec section in the Collect Test Data document for details on how to output test results in an acceptable format for rspec. If your current job is using CircleCI’s intelligent test splitting, you must change the circleci tests split command to circleci tests run with the --split-by=timings parameter. If you are not using test splitting, --split-by=timings can be omitted.
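As in the Quickstart, the job also needs to upload the results it writes to ~/rspec so that failed tests can be identified. A minimal sketch of that step, assuming the output path used above:
  - store_test_results:
      path: ~/rspec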
Configure a job running Ruby (Cucumber) tests
- Modify your test command to look similar to:
  - run: mkdir -p ~/cucumber
  - run:
      command: |
        circleci tests glob "features/**/*.feature" | circleci tests run --command="xargs bundle exec cucumber --format junit --out ~/cucumber/junit.xml" --verbose --split-by=timings
- Update the glob command to match your use case. See the Cucumber section in the Collect Test Data document for details on how to output test results in an acceptable format for Cucumber. If your current job is using CircleCI’s intelligent test splitting, you must change the circleci tests split command to circleci tests run with the --split-by=timings parameter. If you are not using test splitting, --split-by=timings can be omitted.
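Likewise, remember to upload the generated results so re-runs can identify the failures. A minimal sketch, assuming the ~/cucumber output path used above:
  - store_test_results:
      path: ~/cucumber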
Configure a job running Cypress tests
- Use the cypress-circleci-reporter (note this is a third-party tool that is not maintained by CircleCI). You can install it in your .circleci/config.yml or add it to your package.json. Example for adding to .circleci/config.yml:
  # add required reporters (or add to package.json)
  - run:
      name: Install coverage reporter
      command: |
        npm install --save-dev cypress-circleci-reporter
- Use the cypress-circleci-reporter, use circleci tests run, and upload test results to CircleCI:
  - run:
      name: run tests
      command: |
        mkdir test_results
        cd ./cypress
        npm ci
        npm run start &
        # --split-by=timings is optional; only use it if you are using CircleCI's test splitting
        circleci tests glob "cypress/**/*.cy.js" | circleci tests run --command="xargs npx cypress run --reporter cypress-circleci-reporter --spec" --verbose --split-by=timings
  - store_test_results:
      path: test_results
Remember to modify the glob command for your specific use case. If your current job is using CircleCI’s intelligent test splitting, you must change the circleci tests split command to circleci tests run with the --split-by=timings parameter. If you are not using test splitting, --split-by=timings can be omitted.
Configure a job running JavaScript/TypeScript (Jest) tests
- Install the jest-junit dependency. You can add this step in your .circleci/config.yml:
  - run:
      name: Install JUnit coverage reporter
      command: yarn add --dev jest-junit
  You can also add it to your jest.config.js file by following these usage instructions.
Modify your test command to look something similar to:
- run: command: | npx jest --listTests | circleci tests run --command=“xargs npx jest --config jest.config.js --runInBand --” --verbose --split-by=timings environment: JEST_JUNIT_OUTPUT_DIR: ./reports/ JEST_JUNIT_ADD_FILE_ATTRIBUTE: true - store_test_results: path: ./reports/
- Update the npx jest --listTests command to match your use case. See the Jest section in the Collect Test Data document for details on how to output test results in an acceptable format for jest. If your current job is using CircleCI’s intelligent test splitting, you must change the circleci tests split command to circleci tests run with the --split-by=timings parameter. If you are not using test splitting, --split-by=timings can be omitted. JEST_JUNIT_ADD_FILE_ATTRIBUTE=true is added to ensure that the file attribute is present. JEST_JUNIT_ADD_FILE_ATTRIBUTE=true can also be added to your jest.config.js file instead of including it in .circleci/config.yml, by using the following attribute: addFileAttribute="true".
Configure a job running Playwright tests
- Modify your test command to use circleci tests run:
  - run:
      command: |
        # creating the directory can also be switched out for passing PLAYWRIGHT_JUNIT_OUTPUT_NAME directly to Playwright
        mkdir test-results
        pnpm run serve &
        TESTFILES=$(circleci tests glob "specs/e2e/**/*.spec.ts")
        echo $TESTFILES | circleci tests run --command="xargs pnpm playwright test --config=playwright.config.ci.ts --reporter=junit" --verbose --split-by=timings
- Update the glob command to match your use case. If your current job is using CircleCI’s intelligent test splitting, you must change the circleci tests split command to circleci tests run with the --split-by=timings parameter. If you are not using test splitting, --split-by=timings can be omitted. Note: you may also use Playwright’s built-in environment variable (PLAYWRIGHT_JUNIT_OUTPUT_NAME) to specify where the JUnit XML output is written; a sketch of this alternative follows this list.
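If you prefer Playwright’s built-in environment variable, a minimal sketch of that alternative is shown below; the output path is an illustrative assumption, so adjust it to match where your job stores results.
  - run:
      command: |
        mkdir -p test-results
        pnpm run serve &
        circleci tests glob "specs/e2e/**/*.spec.ts" | circleci tests run --command="xargs pnpm playwright test --config=playwright.config.ci.ts --reporter=junit" --verbose
      environment:
        PLAYWRIGHT_JUNIT_OUTPUT_NAME: test-results/junit.xml   # illustrative output location
  - store_test_results:
      path: test-results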
Known limitations
- When re-running only the failed tests, test splitting by timing may not be as efficient as expected the next time that job runs, as the test results being stored are only from the subset of failed tests that were run.
- Orbs that run tests may not work with this new functionality at this time.
- If a shell script is invoked to run tests, circleci tests run should be placed in the shell script itself, and not in .circleci/config.yml (see the sketch after this list).
- Jobs that are older than the retention period for workspaces for the organization cannot be re-run with "Re-run failed tests only".
- Jobs that upload code coverage reports may see issues during a re-run.
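For the shell-script limitation above, a minimal sketch of the pattern follows; run_tests.sh is a hypothetical script name used only for illustration.
  #!/bin/sh
  # run_tests.sh (hypothetical script, committed to your repository)
  circleci tests glob "**/test_*.py" | circleci tests run --command="xargs pytest -o junit_family=legacy --junitxml=test-results/junit.xml" --verbose
The step in .circleci/config.yml then simply calls the script:
  - run:
      name: Run tests via script
      command: ./run_tests.sh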
FAQs
Question: I have a question or issue, where do I go?
Answer: Leave a comment on the Discuss post.
Question: Will this functionality re-run individual tests?
Answer: No, it will re-run failed test classnames or files that had at least one individual test failure.
Question: What happens if I try to use the functionality but circleci tests run is not used in the .circleci/config.yml file?
Answer: All tests will be executed when the workflow runs again, including failed tests. This is equivalent to selecting "Rerun workflow from failed".
Question: What happens if I try to use the functionality and circleci tests run is used in my .circleci/config.yml file, but I have not configured my job to upload test results to CircleCI?
Answer: The job will fail.
Question: When can I click the "Re-run failed tests only" option?
Answer: Currently, it will be present any time the "Re-run workflow from failed" option is present, and vice versa.
Question: I don’t see my test framework on this page, can I still use the functionality?
Answer: Yes, as long as your job meets the prerequisites outlined above. The re-run failed tests only functionality is test runner- and test framework-agnostic. You can use the methods described in the Collect test data document to ensure that the job is uploading test results. Note that classname and file are not always present by default, so your job may require additional configuration.
From there, follow the Quickstart section to modify your test command to use circleci tests run.
If you run into issues, comment on the Discuss post.
Question: Can I see in the web UI whether a job was re-run using "Re-run failed tests only"?
Answer: Not at this time.
Question: What happens when a job is re-run and it is using parallelism and test splitting?
Answer: The job will spin up the number of containers/virtual machines (VMs) that are specified with the parallelism key. However, the step that runs tests for each of those parallel containers/VMs will only run a subset of tests, or no tests, after the tests are split across the total number of parallel containers/VMs. For example, if parallelism is set to eight, there may only be enough tests after the test splitting occurs to "fill" the first two parallel containers/VMs. The remaining six containers/VMs will still start up, but they will not run any tests when they get to the test execution step.
Question: Why are my Maven Surefire tests failing when I try to set this feature up?
Answer: You may need to add the -DfailIfNoTests=false flag to ensure the testing framework ignores skipped tests instead of reporting a failure when it sees a skipped test on a dependent module.
Question: Can I specify the timing type for test splitting using circleci tests run?
Answer: Yes, you can specify the timing type, similar to circleci tests split --split-by=timings --timings-type=, using a test-selector flag. You can pass filename, classname, or testname.
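A minimal sketch of how that might look on the command line is below. The exact flag spelling (--test-selector=) is an assumption here; verify the syntax with circleci tests run --help in your environment.
  # Assumed flag spelling for illustration only; confirm with: circleci tests run --help
  circleci tests glob "**/test_*.py" | circleci tests run --command="xargs pytest -o junit_family=legacy --junitxml=test-results/junit.xml" --split-by=timings --test-selector=classname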