Fail fast - Open preview
This feature is in open preview; use it at your own risk. It is not guaranteed to move to general availability. For questions or issues, please comment on our Community forum.
Motivation and introduction
For jobs with long-running tests, where even one failed test requires immediate attention, waiting for feedback until all tests have run to completion results in wasted time and unoptimized credit usage.
CircleCI is releasing new open preview functionality to fail tests faster. You can configure your testing jobs to stop executing shortly after finding the first test failure.
Unlike fail-fast functionality built into specific test runners, the configuration settings described below are compatible out of the box with CircleCI's intelligent test splitting and parallelism. Previously, it was non-trivial for a user to instruct CircleCI to terminate a parallel run (task) because a separate parallel run (task) found a test failure. The functionality described below is generic to any test runner.
Use cases
- Testing jobs with flaky tests.
- Long-running (15+ minutes) testing jobs.
- Users who are looking to optimize credit consumption.
- Test suites with interdependent tests (for example, certain types of end-to-end test suites).
- Adding new tests as part of new functionality to a project (for example, on a feature branch).
- A main branch, where even one test failure in CI requires immediate attention.
Quick-start
Before: Example .circleci/config.yml file
- run:
    name: Run tests
    command: |
      mkdir test-results
      pytest --junitxml=test-results/junit.xml
A snippet of a basic CircleCI configuration file that executes Python tests and stores the test results in a new directory so they can be uploaded to CircleCI.
After: Example .circleci/config.yml file
jobs:
  build:
    docker:
      - image: cimg/python:3.11.0
    steps:
      - checkout
      - python/install-packages:
          pkg-manager: pip
      - run:
          name: Run tests
          command: |
            mkdir test-results
            # The circleci tests run command is where the fail-fast functionality is configured
            pytest --collect-only -q | grep -e "\.py" | circleci tests run --command="xargs pytest --junitxml=test-results/junit.xml -v --" --fail-fast --batch-count=3 --verbose --test-results-path="test-results"
      - store_test_results:
          path: test-results
Breakdown of the configuration
- pytest --collect-only -q | grep -e "\.py" |
  - This piece of the command is very similar to a glob command used for CircleCI's test splitting. It provides the list of tests to the circleci tests run command as standard input (stdin). The tests in this example are all in one file for simplicity: --collect-only -q makes pytest output only the test names, and grep then filters for the lines that contain .py. If you have multiple test files, you can instead pass filenames with a command like: pytest --collect-only -qq | cut -d ':' -f 1 |
  - A glob command is also suitable and the most common approach on CircleCI (see the sketch after this breakdown).
- circleci tests run
  - This invokes the fail-fast functionality.
- --command="xargs pytest --junitxml=test-results/junit.xml -v --"
  - This calls pytest to run the tests and tells it to output the results in JUnit format so that the results can be uploaded to CircleCI. The xargs command is critical to include, as it converts the list of tests read from stdin into arguments for pytest.
  - -v adds verbosity to the pytest output; it is not required.
  - -- separates the pytest options from the test arguments that xargs appends from stdin; it is not required.
- --fail-fast
  - Tells CircleCI to fail fast when it encounters a failure within a batch (batching is explained below).
- --batch-count=3
  - A mechanism to divide tests into groups: if one group finds a failed test while executing, subsequent groups will not execute, therefore failing fast. See the Batching section below for details. A basic heuristic for choosing a batch count:
    - If your tests have heavy "pre-initialization logic" (like spinning up a browser/UI or creating and configuring databases), use a small batch count like 2 or 3.
    - Otherwise, use a higher number, such as 5-7.
- --verbose
  - Optional; adds verbosity to the output. Recommended.
- --test-results-path="test-results"
  - Optional, but a best practice for running tests on CircleCI. Needed in conjunction with the store_test_results step below to enable rich test results in the web UI.

Putting all the pieces together results in a job that executes the pytest tests and will terminate and provide feedback shortly after finding the first test failure.
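As noted in the breakdown above, a glob is the most common way to supply the list of test files. A minimal sketch of the same step using the CircleCI CLI's glob helper, assuming your test files live under tests/ and match test_*.py:
      - run:
          name: Run tests (glob variant)
          command: |
            mkdir test-results
            # Assumes test files match tests/test_*.py; adjust the pattern to your project layout
            circleci tests glob "tests/test_*.py" | circleci tests run \
              --command="xargs pytest --junitxml=test-results/junit.xml -v --" \
              --fail-fast --batch-count=3 --verbose --test-results-path="test-results"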
Verify the configuration
If the --verbose setting is enabled, you should see in your step output a description of the number of batches processed. This is an indication that the job has been configured successfully with the fail-fast functionality.
Batching
Batches are groups that the supplied tests are divided into; each batch reports its status after it completes. If a test failure is found while executing the group of tests in a given batch, the batch returns a failed status to CircleCI. With fail-fast enabled, once a batch has returned a failed status, CircleCI prevents any subsequent batches from starting. The batch count is set to 1 by default.
If no test splitting is enabled, batches execute sequentially, as shown in the diagram below ("Plugin Manager" is a CircleCI component that manages state between batches).
[Diagram: batches within a single task executing sequentially, coordinated by the Plugin Manager]
If test splitting is enabled, each parallel run (task) splits its tests into batches, and batches are executed sequentially within that task, as shown in the diagram below.
[Diagram: with test splitting enabled, each parallel task runs its own batches sequentially]
After each batch within a task finishes executing its tests, the task checks with CircleCI to see if it should keep going to the next batch. For example, if batch 1 in task 0 immediately fails its test, it will report that failure to CircleCI. After batch 1 from task 1 finishes executing, task 1 will check to see if it should go on to batch 2. Because there has already been a failure, batch 2 will not execute and the job will terminate.
Additional examples
Run jest (JavaScript/TypeScript) tests in three batches with fail-fast enabled:
npx jest --listTests | circleci tests run \
  --command="xargs yarn tests" \
  --batch-count=3 \
  --fail-fast \
  --test-results-path="test-results"
- --listTests grabs all tests, which get fed into stdin for xargs yarn tests. --listTests can sometimes be too aggressive depending on your setup; you may need to be more specific, with a regex and a glob command, to get only the intended tests.
- CircleCI will run the command yarn tests on the tests fed into stdin via --listTests.
- --batch-count=3 and --fail-fast are enabled. For example, if any of the tests from batch 1 fail, batch 2 will not be executed.
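As a sketch, this command might sit inside a job step like the following. It assumes a Node convenience image, a package.json script named tests that runs jest with a JUnit reporter, and that the reporter writes its output under test-results:
      - run:
          name: Run jest tests
          command: |
            mkdir -p test-results
            # "yarn tests" is an assumed package.json script that runs jest with a JUnit reporter
            npx jest --listTests | circleci tests run \
              --command="xargs yarn tests" \
              --batch-count=3 \
              --fail-fast \
              --test-results-path="test-results"
      - store_test_results:
          path: test-results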
Run Go tests with fail-fast:
go list ./... | circleci tests run \
  --test-results-path=./test-results \
  --command='xargs gotestsum --junitfile ./test-results/junit.xml -- --' \
  --fail-fast --batch-count=2
- go list ./... will find and list all Go testing packages in all subdirectories to pass to xargs gotestsum via stdin.
- CircleCI will run the command gotestsum on the supplied test packages.
- --batch-count=2 and --fail-fast are enabled. If any of the test packages from batch 1 fail, batch 2 will not be executed.
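gotestsum is not part of the standard Go toolchain, so the job needs it available before this command runs. A minimal sketch of the surrounding step, assuming a Go convenience image with the Go bin directory on PATH:
      - run:
          name: Run Go tests
          command: |
            mkdir -p test-results
            # Assumes gotestsum is not preinstalled in the image
            go install gotest.tools/gotestsum@latest
            go list ./... | circleci tests run \
              --test-results-path=./test-results \
              --command='xargs gotestsum --junitfile ./test-results/junit.xml -- --' \
              --fail-fast --batch-count=2
      - store_test_results:
          path: test-results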
Run Kaocha (Clojure) tests in six batches and fail as soon as one of the batches fails:
circleci tests run \
  --command='lein kaocha $(cat test.namespaces.split | xargs -I {} echo " --focus {}")' \
  --batch-count=6 \
  --fail-fast \
  --test-results-path="test/reports" < test.namespaces
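Here test.namespaces is read on stdin (via the < redirect) and is assumed to be a file, generated earlier in the job, that lists the Kaocha test namespaces to run, one per line. A hypothetical example of its contents:
my.app.core-test
my.app.handlers-test
my.app.db-test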
Known limitations
- You will only get test results within the CircleCI UI for the last batch that executed. This is in the process of being resolved. This also means that test splitting by timing may not be perfect until this is resolved.
- If you are running code coverage as part of your testing job, using this new functionality may cause code coverage reports to return unexpected results.
FAQs
Question: Are batching and parallelism the same thing?
Answer: No. See the Batching section above.
Question: What happens if I already have a fail-fast setting enabled at the test runner level?
Answer: The test runner will honor whatever settings you give it, including options like jest's bail. You may experience unexpected results if you use a test runner's fail-fast option in combination with the CircleCI fail-fast configuration.
Question: Does this functionality work with orbs (for example, the Cypress orb)?
Answer: We have internally tested the functionality with the Cypress orb successfully.
Question: How do I use the fail-fast functionality with CircleCI’s intelligent test splitting?
Answer: Follow the same instructions as the example above, and append an additional parameter to your circleci tests run command: --split-by=name to split by filename, or --split-by=timings to split by timing. See the Known limitations section for current constraints with splitting by timing. If you have an existing job that uses test splitting, replace that configuration following the guidance above and the parameters described in this answer (for example, using circleci tests run instead of circleci tests split).
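For example, a simplified sketch of a job that combines parallelism, test splitting by timing, and fail-fast might look like the following (the parallelism value of 4 is illustrative; splitting by timing requires timing data from previously stored test results):
jobs:
  build:
    docker:
      - image: cimg/python:3.11.0
    parallelism: 4
    steps:
      - checkout
      - run:
          name: Run tests
          command: |
            mkdir test-results
            pytest --collect-only -q | grep -e "\.py" | circleci tests run \
              --command="xargs pytest --junitxml=test-results/junit.xml -v --" \
              --fail-fast --batch-count=3 --verbose \
              --split-by=timings --test-results-path="test-results"
      - store_test_results:
          path: test-results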