
Use the CircleCI CLI to split tests

Applies to: Cloud, Server v4+

CircleCI supports automatic test allocation across parallel compute environments. When the parallelism key in your CircleCI configuration is set to a value greater than 1, CircleCI spins up the specified number of identical execution environments in which your job is run.

Test splitting requires the CircleCI CLI together with parallelism. The CLI commands circleci tests glob and circleci tests run are used to define your test suite and allocate tests across multiple environments. The CLI is automatically injected into your job at run-time, so there is no further setup required to use the circleci tests commands.
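
For example, a minimal sketch of a job that runs across four identical containers (the job name and image are placeholders to adapt):

# ~/.circleci/config.yml (sketch)
version: 2.1
jobs:
  test:
    docker:
      - image: cimg/base:2023.11 # placeholder image
    parallelism: 4 # four identical execution environments
    steps:
      - checkout
      - run: echo "Running on node $CIRCLE_NODE_INDEX of $CIRCLE_NODE_TOTAL"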

Prerequisites

Use circleci tests run

Optimize your tests using multiple parallel-running environments. First, glob your test files, then split and run your tests.

1. Glob test files

Use circleci tests glob to define your test suite. To glob test files, pass one or more patterns to the glob command:

circleci tests glob "tests/unit/*.java" "tests/functional/*.java"

The CLI supports globbing test files using the following patterns:

  • * matches any sequence of characters (excluding path separators)

  • ** matches any sequence of characters (including path separators)

  • ? matches any single character (excluding path separators)

  • [abc] matches any character (excluding path separators) against characters in brackets

  • {foo,bar,…​} matches a sequence of characters, if any of the alternatives in braces matches

Ensure that the glob pattern is quoted so that the shell passes it to the CLI unexpanded. To check the results of pattern-matching, use the echo command.

# ~/.circleci/config.yml
version: 2.1
jobs:
  test:
    docker:
      - image: cimg/node:20.3.0
    parallelism: 4
    steps:
      - run:
          command: |
            echo $(circleci tests glob "foo/**/*" "bar/**/*")
            circleci tests glob "foo/**/*" "bar/**/*" | xargs -n 1 echo

2. Split and run tests

Two commands are available for you to split your tests. We recommend using circleci tests run as this enables you to also take advantage of the rerun failed tests feature. The examples in this section use the circleci tests run command.

Alternatively, you can use circleci tests split. Examples are provided in the Using circleci tests split section later on this page.

Test framework examples using circleci tests run

The following set of examples shows how to use the circleci tests run command to split and run your tests across parallel execution environments. Using this command also allows you to use the rerun failed tests feature.

Some notes on the examples below:

  • For a full list of options when using this command, see the test splitting overview page.

  • Each example below includes a parallelism level of four, that is, four identical execution environments will be created to split your tests across. You can choose a parallelism level to suit your project.

  • Each example below shows configuring your tests to be split by timing data. This is the recommended way to split tests, but other options include splitting by name or by file size. Read more about these options in the Test splitting and parallelism overview.

  • On each successful run of a test suite, CircleCI saves timing data from the directory specified by the path in the store_test_results step. This timing data consists of how long each test took to complete per file name or class name. The available timing data will then be analyzed and your tests will be split across your parallel-running containers as evenly as possible.

  • If no timing data is found, you will receive the message: "Error auto-detecting timing type, falling back to weighting by name." The tests will then be split alphabetically by test name.

  • If you do not use store_test_results, there will be no timing data available to split your tests.
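
Putting the notes above together, the framework-specific examples below all follow the same basic shape. Here is a minimal, framework-agnostic sketch (the glob pattern is a placeholder, and echo stands in for a real test command):

steps:
  - run:
      command: |
        # Each parallel node receives a different subset of the globbed files on stdin
        circleci tests glob "tests/**/*_test.*" | circleci tests run --command="xargs -n 1 echo" --verbose --split-by=timings

In the real examples, echo is replaced with your test runner and a store_test_results step points at its JUnit XML output so that timing data is available for future runs.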

Ruby (RSpec)
  1. Add the following gem to your Gemfile:

    gem 'rspec_junit_formatter'
  2. Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

    jobs:
      build:
        docker:
          - image: cimg/ruby:3.2.2
        parallelism: 4
        resource_class: large
        steps:
          - run: mkdir ~/rspec
          - run:
              command: |
                circleci tests glob "spec/**/*_spec.rb" | circleci tests run --command="xargs bundle exec rspec --format progress --format RspecJunitFormatter -o ~/rspec/rspec.xml" --verbose --split-by=timings
    
          - store_test_results:
              path: ~/rspec
    • --format RspecJunitFormatter must come after any other --format RSpec argument

    • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

    • Update the glob command to match your use case. See the RSpec section in the Collect Test Data document for details on how to output test results in an acceptable format for rspec.
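
    As an illustration of what flows through the pipe above (the spec paths are hypothetical):

    # Hypothetical spec paths emitted by circleci tests glob, one per line:
    #   spec/models/user_spec.rb
    #   spec/requests/sessions_spec.rb
    # On one container, circleci tests run would then effectively execute:
    #   bundle exec rspec --format progress --format RspecJunitFormatter -o ~/rspec/rspec.xml spec/models/user_spec.rb
    # while the other containers receive the remaining spec files.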

Ruby (Cucumber)

Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

jobs:
  build:
    docker:
      - image: cimg/ruby:3.2.2
    parallelism: 4
    resource_class: large
    steps:
      - run: mkdir -p ~/cucumber
      - run:
          command: |
          circleci tests glob "features/**/*.feature" | circleci tests run --command="xargs bundle exec cucumber --format junit,fileattribute=true --out ~/cucumber/junit.xml" --verbose --split-by=timings

      - store_test_results:
          path: ~/cucumber
  • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

  • Update the glob command to match your use case. See the Cucumber section in the Collect Test Data document for details on how to output test results in an acceptable format for Cucumber.

Cypress
  1. Use the cypress-circleci-reporter (this is a 3rd party tool that is not maintained by CircleCI). You can install the tool in your .circleci/config.yml or add to your package.json. Example for adding to .circleci/config.yml:

      #add required reporters (or add to package.json)
      - run:
        name: Install coverage reporter
        command: |
          npm install --save-dev cypress-circleci-reporter
  2. Use the cypress-circleci-reporter, specify parallelism, and update your test command to use circleci tests run. Then upload test results to CircleCI:

    jobs:
      build:
        docker:
          - image: cimg/base:2023.11
        parallelism: 4
        resource_class: large
        steps:
          #add required reporters (or add to package.json)
          - run:
              name: Install coverage reporter
              command: |
                npm install --save-dev cypress-circleci-reporter
          - run:
              name: run tests
              command: |
                mkdir test_results
                cd ./cypress
                npm ci
                npm run start &
                circleci tests glob "cypress/**/*.cy.js" | circleci tests run --command="xargs npx cypress run --reporter cypress-circleci-reporter --spec" --verbose --split-by=timings #--split-by=timings is optional, only use if you are using CircleCI's test splitting

          - store_test_results:
              path: test_results
    • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

    • Update the glob command to match your specific use case.

    • Cypress may output a warning: Warning: It looks like you’re passing --spec a space-separated list of arguments:. This can be ignored, but it can be removed by following the guidance from our community forum.

JavaScript/TypeScript (Jest)
  1. Install the jest-junit dependency. You can add this step in your .circleci/config.yml:

      - run:
          name: Install JUnit coverage reporter
          command: yarn add --dev jest-junit

    You can also add it to your jest.config.js file by following these usage instructions.

  2. Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

    jobs:
      build:
        docker:
          - image: cimg/base:2023.11
        parallelism: 4
        resource_class: large
        steps:
          - run:
              name: Install JUnit coverage reporter
              command: yarn add --dev jest-junit
          - run:
              command: |
                npx jest --listTests | circleci tests run --command="JEST_JUNIT_ADD_FILE_ATTRIBUTE=true xargs npx jest --config jest.config.js --runInBand --" --verbose --split-by=timings
              environment:
                JEST_JUNIT_OUTPUT_DIR: ./reports/
          - store_test_results:
              path: ./reports/
    • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

    • Update the npx jest --listTests command to match your use case. See the Jest section in the Collect Test Data document for details on how to output test results in an acceptable format for jest.

    • JEST_JUNIT_ADD_FILE_ATTRIBUTE=true is added to ensure that the file attribute is present. JEST_JUNIT_ADD_FILE_ATTRIBUTE=true can also be added to your jest.config.js file instead of including it in .circleci/config.yml, by using the following attribute: addFileAttribute="true".
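
    As an illustration of what the pipeline above receives (the file paths are hypothetical):

    # npx jest --listTests prints one absolute test file path per line, for example:
    #   /home/circleci/project/src/__tests__/app.test.js
    #   /home/circleci/project/src/__tests__/utils.test.js
    # circleci tests run splits that list across the parallel containers, and xargs appends
    # this container's share after the trailing "--" in the jest command.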

Playwright

Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

jobs:
  build:
    docker:
      - image: cimg/base:2023.11
    parallelism: 4
    resource_class: large
    steps:
      - run:
          command: |
            mkdir test-results #can also be switched out for passing PLAYWRIGHT_JUNIT_OUTPUT_NAME directly to Playwright
            pnpm run serve &
            TESTFILES=$(circleci tests glob "specs/e2e/**/*.spec.ts")
            echo "$TESTFILES" | circleci tests run --command="xargs pnpm playwright test --config=playwright.config.ci.ts --reporter=junit" --verbose --split-by=timings

      - store_test_results:
          path: results.xml
  • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

  • Update the glob command to match your use case.

  • You may also use Playwright’s built-in PLAYWRIGHT_JUNIT_OUTPUT_NAME environment variable to specify the JUnit XML output file, as shown in the sketch after this list.

  • Ensure that you are using version 1.34.2 or later of Playwright. Earlier versions of Playwright may not output JUnit XML in a format that is compatible with this feature.
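
A minimal sketch of that variation, assuming the same serve script and Playwright config as the example above (the output paths are placeholders):

      - run:
          command: |
            mkdir -p results
            pnpm run serve &
            circleci tests glob "specs/e2e/**/*.spec.ts" | circleci tests run --command="xargs pnpm playwright test --config=playwright.config.ci.ts --reporter=junit" --verbose --split-by=timings
          environment:
            PLAYWRIGHT_JUNIT_OUTPUT_NAME: results/results.xml

      - store_test_results:
          path: results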

Kotlin or Gradle
  1. Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

    - run:
      command: |
        cd src/test/java
    
        # Get list of classnames of tests that should run on this node.
        circleci tests glob "**/*.java" | cut -c 1- | sed 's@/@.@g' | sed 's/.\{5\}$//' | circleci tests run --command=">classnames.txt xargs echo" --verbose --split-by=timings --timings-type=classname
    
        #if this is a re-run and it is a parallel run that does not have tests to run, halt execution of this parallel run
        [ -s classnames.txt ] || circleci-agent step halt
    - run:
      command: |
    
        # Format the arguments to "./gradlew test"
    
        GRADLE_ARGS=$(cat src/test/java/classnames.txt | awk '{for (i=1; i<=NF; i++) print "--tests",$i}')
        echo "Prepared arguments for Gradle: $GRADLE_ARGS"
    
        ./gradlew test $GRADLE_ARGS
    
    - store_test_results:
        path: build/test-results/test
  2. Update the glob command to match your use case.
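
    To illustrate the classname transformation in the first run step, applied to a hypothetical test file com/example/FooTest.java (relative to src/test/java):

    # The same pipeline applied to a single hypothetical path:
    echo "com/example/FooTest.java" | cut -c 1- | sed 's@/@.@g' | sed 's/.\{5\}$//'
    # prints: com.example.FooTest
    # The second run step then turns each such line into: ./gradlew test --tests com.example.FooTest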

Go

Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

jobs:
  build:
    docker:
      - image: cimg/go:1.21.4
    parallelism: 4
    resource_class: large
    steps:
      - run:
          command: go list ./... | circleci tests run --command "xargs gotestsum --junitfile junit.xml --format testname --" --split-by=timings --timings-type=name

      - store_test_results:
          path: junit.xml
  • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.
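
As an illustration of what feeds the command above (the package paths are hypothetical):

# go list ./... prints one package import path per line, for example:
#   github.com/example/project/internal/api
#   github.com/example/project/internal/store
# circleci tests run splits these by name, and xargs appends this container's share
# after the trailing "--", so gotestsum runs only those packages on this node.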

Elixir

Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

jobs:
  build:
    docker:
      - image: cimg/base:2023.11
    parallelism: 4
    resource_class: large
    steps:
      - run:
          name: Run tests
          command: |
            circleci tests glob 'lib/**/*_test.exs' | circleci tests run --command='xargs -n1 echo > test_file_paths.txt'

            mix ecto.setup --quiet
            cat test_file_paths.txt | xargs mix test

      - store_test_results:
          path: _build/test/my_app/test-junit-report.xml
          when: always
  • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

  • Update the glob command to match your use case.

PHPUnit

Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run:

# Use phpunit-finder to output list of tests to stdout for a test suite named functional
# Pass those tests as stdin to circleci tests run
jobs:
  build:
    docker:
      - image: cimg/base:2023.11
    parallelism: 4
    resource_class: large
    steps:
      - run:
          name: Run functional tests
          command: |
            TESTS_TO_RUN=$(/data/vendor/bin/phpunit-finder -- functional)
            echo "$TESTS_TO_RUN" | circleci tests run --command="xargs -I{} -d\" \" /data/vendor/bin/phpunit {} --log-junit /data/artifacts/phpunit/phpunit-functional-$(basename {}).xml" --verbose --split-by=timings

      - store_test_results:
          path: artifacts/phpunit
          when: always
  • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

  • Note that this example uses a utility named phpunit-finder, which is a third-party tool that is not supported by CircleCI; use it at your own risk.

Django

Modify your CircleCI configuration file to specify parallelism, and update your test command to use circleci tests run. Note that Django accepts test names in a dotted format ("."), but it outputs JUnit XML that refers to tests using slashes ("/"). To account for this, get the list of test file names first, convert the slashes to dots, and then pass the converted names into the test command.

jobs:
  build:
    docker:
      - image: cimg/python:3.12.0
    parallelism: 4
    resource_class: large
    steps:
      - run:
          name: Run tests
          command: |
            # Get the test file names, write them to files.txt, and split them by historical timing data
            circleci tests glob "**/test*.py" | circleci tests run --command=">files.txt xargs echo" --verbose --split-by=timings #split-by-timings is optional
            # Change filepaths into format Django accepts (replace slashes with dots).  Save the filenames in a TESTFILES variable
            cat files.txt | tr "/" "." | sed "s/\.py//g" | sed "s/tests\.//g" > circleci_test_files.txt
            cat circleci_test_files.txt
            TESTFILES=$(cat circleci_test_files.txt)
            # Run the tests (TESTFILES) with the reformatted test file names
            pipenv run coverage run manage.py test --parallel=8 --verbosity=2 $TESTFILES

      - store_test_results:
          path: test-results
  • Ensure you are using xargs in your circleci tests run command to pass the list of test files/classnames via stdin to --command.

Output test files only

If your testing setup on CircleCI is not compatible with invoking your test runner from the circleci tests run command, you can instead use circleci tests run only to receive this parallel run's share of test file names and write them to a temporary location. You can then invoke your test runner separately, passing it the saved file names.

Example:

jobs:
  build:
    docker:
      - image: cimg/base:2023.11
    parallelism: 4
    resource_class: large
    steps:
      - run:
          command: |
            circleci tests glob "src/**/*js" | circleci tests run --command=">files.txt xargs echo" --verbose --split-by=timings #split-by=timings is optional
            [ -s files.txt ] || circleci-agent step halt #if a re-run and there are no tests to re-run for this parallel run, stop execution

      - run:
          name: Run tests
          command: |
            mkdir test-results
            ... #pass files.txt into your test command

      - store_test_results:
          path: test-results

The snippet above will write the list of test file names to files.txt. On a non-rerun, this list will be all of the test file names. On a "rerun", the list will be a subset of file names (the test file names that had at least 1 test failure in the previous run). You can pass the list of test file names from files.txt into, for example, your custom makefile. If using parallelism, CircleCI spins up the same number of containers/VMs as the parallelism level that is set in .circleci/config.yml. However, not all parallel containers/VMs will execute tests. For the parallel containers/VMs that will not run tests, files.txt may not be created. The halt command ensures that in the case where a parallel run is not executing tests, the parallel run is stopped immediately.
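
For example, a minimal sketch of the second run step, assuming a hypothetical make target named test that accepts the file list (adapt the command to your own runner, and make sure it writes JUnit XML into test-results):

      - run:
          name: Run tests
          command: |
            mkdir test-results
            # Hypothetical: pass this node's share of test files to a custom make target
            make test TEST_FILES="$(cat files.txt)"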

Using circleci tests split

Optimize your tests using multiple parallel-running environments. First, glob your test files, then split your test suite, then run your tests.

1. Glob test files

Use circleci tests glob to define your test suite. To glob test files, pass one or more patterns to the glob command:

circleci tests glob "tests/unit/*.java" "tests/functional/*.java"

The CLI supports globbing test files using the following patterns:

  • * matches any sequence of characters (excluding path separators)

  • ** matches any sequence of characters (including path separators)

  • ? matches any single character (excluding path separators)

  • [abc] matches any character (excluding path separators) against characters in brackets

  • {foo,bar,…​} matches a sequence of characters, if any of the alternatives in braces matches

Ensure that the glob pattern is quoted so that the shell passes it to the CLI unexpanded. To check the results of pattern-matching, use the echo command.

# ~/.circleci/config.yml
version: 2.1
jobs:
  test:
    docker:
      - image: cimg/node:20.3.0
    parallelism: 4
    steps:
      - run:
          command: |
            echo $(circleci tests glob "foo/**/*" "bar/**/*")
            circleci tests glob "foo/**/*" "bar/**/*" | xargs -n 1 echo

2. Split tests

To split your tests, pass in a list of tests to the circleci tests split command.

The following test splitting options are available:

  • Alphabetically by name (default if none specified)

  • Split using timing data --split-by=timings – We recommend this option as it results in the most even split across your parallel execution environments.

  • Split using file size --split-by=filesize

a. Split by name (default)

By default, if you do not specify a method using the --split-by flag, circleci tests split expects a list of file names or class names and splits tests alphabetically by test name. There are a few ways to provide this list:

  • Pipe a glob of test files, as demonstrated in the above section.

circleci tests glob "test/**/*.java" | circleci tests split

  • Create a text file with test filenames.

circleci tests split test_filenames.txt

  • Provide a path to the test files.

circleci tests split < /path/to/items/to/split

b. Split by timing data

The best way to optimize your test suite across a set of parallel executors is to split your tests using timing data. This will ensure the tests are split in the most even way, leading to a shorter test time.

To split by test timing, use the --split-by flag with the timings split type.

circleci tests glob "**/*.go" | circleci tests split --split-by=timings

On each successful run of a test suite, CircleCI saves timing data from the directory specified by the path in the store_test_results step. This timing data consists of how long each test took to complete per file name or class name.

The available timing data will then be analyzed and your tests will be split across your parallel-running containers as evenly as possible.

Set the timing type

The CLI attempts to auto-detect the granularity of the test split (for example, whether to split by file name or down to class name) based on the input to the split command. You may need to choose a different timing type depending on how your test coverage output is formatted, using the --timings-type option. Valid timing types are:

  • filename

  • classname

  • testname

  • autodetect

cat my_java_test_classnames | circleci tests split --split-by=timings --timings-type=classname
Set the default value for missing timing data

If timing data is found for only some of your tests, any tests with missing data are assigned a random small value. You can override this default value with the --time-default flag:

circleci tests glob "**/*.rb" | circleci tests split --split-by=timings --time-default=10s
Download timing data

If you need to manually store and retrieve timing data, add the store_artifacts step to your job.
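
For example, a minimal sketch (the path is a placeholder for wherever your test runner writes its JUnit XML):

- store_test_results:
    path: ./test-results
- store_artifacts:
    path: ./test-results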

c. Split by file size

When provided with file paths, the CLI can also split by file size. Use the --split-by flag with the filesize split type:

circleci tests glob "**/*.go" | circleci tests split --split-by=filesize

3. Run split tests

Globbing and splitting tests does not actually run your tests. To combine test grouping with test execution, consider saving the grouped tests to a file, then passing this file to your test runner.

circleci tests glob "test/**/*.rb" | circleci tests split > /tmp/tests-to-run
bundle exec rspec $(cat /tmp/tests-to-run)

The contents of the file /tmp/tests-to-run will be different in each container, based on $CIRCLE_NODE_INDEX and $CIRCLE_NODE_TOTAL.
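
For example, to see what a given container was assigned, you could add a step like this sketch, which uses CircleCI's built-in environment variables:

echo "Node $CIRCLE_NODE_INDEX of $CIRCLE_NODE_TOTAL was assigned:"
cat /tmp/tests-to-run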

