Bryan Call edited this page Mar 19, 2026 · 2 revisions

Testing

Apache Traffic Server uses several types of automated tests. This page covers how to run them and how to write new ones.

Test Types

Catch2 Unit Tests

Unit tests use the Catch2 framework and live alongside the source code they test. They test individual functions and classes in isolation.

# Run all unit tests
cd build
ctest

# Run a specific test
ctest -R test_name

AuTest End-to-End Tests

AuTests (Automated Usability Tests) are the primary integration tests. They launch real ATS instances, send traffic through them, and verify behavior. Test files live in tests/gold_tests/ and are named *.test.py.

Regression Tests

Legacy integration tests that verify specific behaviors. These are gradually being replaced by AuTests.

Running Tests Locally

Enable AuTests in Your Build

AuTests require a CMake option to be enabled:

cmake -B build -DENABLE_AUTEST=ON -DCMAKE_INSTALL_PREFIX=/tmp/ats-test \
  -DBUILD_EXPERIMENTAL_PLUGINS=ON -DENABLE_EXAMPLE=ON
cmake --build build
cmake --install build

Or use the autest preset:

cmake --preset autest
cmake --build build-autest
cmake --install build-autest

Running All AuTests

cmake --build build -t autest

This builds ATS, installs to a temporary directory, sets up the Python virtual environment, and runs all tests.

Running Specific Tests

After the first full run, a helper script is generated at build/tests/autest.sh:

# Run a single test
./build/tests/autest.sh --filter=cache_generation

# Run tests matching a pattern
./build/tests/autest.sh --filter=tls_*

Running Tests in Parallel

For faster execution on multi-core machines:

python3 tests/autest-parallel.py -j 16 \
  --ats-bin /tmp/ats-test/bin \
  --build-root build \
  --sandbox /tmp/autest-parallel

Key options:

  • -j N — Number of parallel workers (default: number of CPU cores)
  • --ats-bin — Path to the ATS install bin directory
  • --build-root — Path to the build directory (for test plugins)
  • --sandbox — Directory for test sandboxes
  • -v — Verbose output with real-time progress
  • --list — List all tests without running them

Tests that can't run in parallel are listed in tests/serial_tests.txt and run sequentially after the parallel phase.

Running Tests in Docker

Use the CI Docker images to match the CI environment exactly:

docker pull controller.trafficserver.org/ats/centos:8
docker run -it -u 1200:1200 --init --cap-add=SYS_PTRACE \
  --network=host controller.trafficserver.org/ats/centos:8 /bin/bash

Then build and run tests inside the container.
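The in-container workflow mirrors the local one. A minimal sketch, assuming the upstream repository and the `autest` preset shown earlier (adjust the checkout and paths to your setup):

```shell
# Fetch the source (upstream repository; check out your working branch)
git clone https://github.com/apache/trafficserver.git
cd trafficserver

# Configure, build, and install with the autest preset
cmake --preset autest
cmake --build build-autest
cmake --install build-autest

# Run the AuTest suite against the preset's build directory
cmake --build build-autest -t autest
```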

Writing New Tests

Adding Unit Tests

Unit tests use Catch2 and live in the same directory as the code they test. Add your test file to the relevant CMakeLists.txt:

#include "catch.hpp"

TEST_CASE("MyFeature", "[my_tag]")
{
  REQUIRE(my_function() == expected_value);
}

Adding AuTests

AuTest files go in tests/gold_tests/ in an appropriate subdirectory. Each test file is a Python script named *.test.py.

Basic Test Structure

Test.Summary = "Description of what this test verifies"
Test.ContinueOnFail = True

# Create an ATS process
ts = Test.MakeATSProcess("ts")

# Create an origin server
server = Test.MakeOriginServer("server")

# Configure the origin server responses
request_header = {
    "headers": "GET / HTTP/1.1\r\nHost: www.example.com\r\n\r\n",
    "timestamp": "1469733493.993",
    "body": ""
}
response_header = {
    "headers": "HTTP/1.1 200 OK\r\nConnection: close\r\n\r\n",
    "timestamp": "1469733493.993",
    "body": ""
}
server.addResponse("sessionlog.json", request_header, response_header)

# Configure ATS
ts.Disk.remap_config.AddLine(
    'map http://www.example.com http://127.0.0.1:{0}'.format(server.Variables.Port)
)

# Add a test run; start the origin and ATS before the client command runs
tr = Test.AddTestRun()
tr.Processes.Default.StartBefore(server)
tr.Processes.Default.StartBefore(ts)
tr.Processes.Default.Command = (
    'curl -s -o /dev/null -w "%{{http_code}}" -H "Host: www.example.com" '
    'http://127.0.0.1:{0}/'.format(ts.Variables.port)
)
tr.Processes.Default.ReturnCode = 0
tr.Processes.Default.Streams.stdout = Testers.ContainsExpression("200", "Expected 200 OK")
tr.StillRunningAfter = ts
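Instead of a ContainsExpression, a stream can also be compared against a gold file. A sketch, with an illustrative path:

```python
# Compare stdout verbatim against a gold file kept alongside the test
# ("gold/200.gold" is an illustrative path, not a file in the tree).
# In gold files, `` (double backtick) acts as a wildcard for content
# that varies between runs, such as ports and timestamps.
tr.Processes.Default.Streams.stdout = "gold/200.gold"
```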

Using Proxy Verifier (Preferred for New Tests)

For new tests, prefer Test.ATSReplayTest() with Proxy Verifier YAML replay files. This approach is more declarative and easier to maintain:

# replay.yaml
sessions:
- transactions:
  - client-request:
      method: GET
      url: /path
      version: '1.1'
      headers:
        fields:
        - [Host, www.example.com]
    server-response:
      status: 200
      headers:
        fields:
        - [Content-Length, '16']
      content:
        size: 16
    proxy-response:
      status: 200
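If `Test.ATSReplayTest()` is not available in your tree, the same replay file can be wired up with the lower-level Proxy Verifier helpers. A sketch only; the helper names and keyword arguments here follow common usage in `tests/gold_tests/` but may differ between ATS versions, so check an existing `*.test.py` before copying:

```python
# Sketch: exact helper signatures may vary between ATS versions.
ts = Test.MakeATSProcess("ts")

# Proxy Verifier server replays the server-response nodes of replay.yaml
server = Test.MakeVerifierServerProcess("server", "replay.yaml")

ts.Disk.remap_config.AddLine(
    'map / http://127.0.0.1:{0}/'.format(server.Variables.http_port)
)

# Proxy Verifier client replays the client-request nodes through ATS
# and verifies the proxy-response expectations
tr = Test.AddTestRun()
tr.AddVerifierClientProcess("client", "replay.yaml",
                            http_ports=[ts.Variables.port])
tr.Processes.Default.StartBefore(server)
tr.Processes.Default.StartBefore(ts)
```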

Useful AuTest APIs

  • Test.MakeATSProcess(name) — Create an ATS instance
  • Test.MakeOriginServer(name) — Create a mock origin server
  • Test.MakeDNServer(name) — Create a mock DNS server
  • Test.AddTestRun() — Add a test step
  • Test.SkipUnless(Condition.HasATSFeature('feature')) — Conditionally skip tests
  • Test.SkipUnless(Condition.PluginExists('plugin.so')) — Skip if plugin not built
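Skip conditions can be combined in one call; a short sketch using the conditions listed above (the feature and plugin names are placeholders):

```python
# Skip this test unless every condition holds. 'SOME_FEATURE' and
# 'some_plugin.so' are placeholders; substitute real names.
Test.SkipUnless(
    Condition.HasATSFeature('SOME_FEATURE'),
    Condition.PluginExists('some_plugin.so'),
)
```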

Configuration in Tests

ts = Test.MakeATSProcess("ts")

# Modify records.yaml
ts.Disk.records_config.update({
    'proxy.config.http.cache.generation': -1,
    'proxy.config.diags.debug.enabled': 1,
    'proxy.config.diags.debug.tags': 'http',
})

# Add remap rules
ts.Disk.remap_config.AddLine(
    'map / http://127.0.0.1:{0}/'.format(server.Variables.Port)
)

# Add plugin configuration
ts.Disk.plugin_config.AddLine('xdebug.so')

AuTest Sandbox Structure

When a test fails, examine the sandbox directory for debugging. The sandbox contains:

  • ATS configuration files as generated for the test
  • Log files (error.log, diags.log, squid.log)
  • Process stdout/stderr output
  • Any test artifacts

The sandbox location is printed in the test output. In CI, it's available as a build artifact (see Continuous Integration).

Test Coverage

CI generates lcov coverage reports for each supported branch. To view them:

  1. Go to Jenkins
  2. Select the branch tab
  3. Open the coverage project
  4. View "Last Successful Artifacts" → output/index.html

See Also
