# Testing
Apache Traffic Server uses several types of automated tests. This page covers how to run them and how to write new ones.
Unit tests use the Catch2 framework and live alongside the source code they test. They test individual functions and classes in isolation.
```sh
# Run all unit tests
cd build
ctest

# Run a specific test
ctest -R test_name
```

AuTests (Automated Usability Tests) are the primary integration tests. They launch real ATS instances, send traffic through them, and verify behavior. Test files live in tests/gold_tests/ and are named *.test.py.
There are also legacy integration tests that verify specific behaviors; these are gradually being replaced by AuTests.
AuTests require a CMake option to be enabled:
```sh
cmake -B build -DENABLE_AUTEST=ON -DCMAKE_INSTALL_PREFIX=/tmp/ats-test \
    -DBUILD_EXPERIMENTAL_PLUGINS=ON -DENABLE_EXAMPLE=ON
cmake --build build
cmake --install build
```

Or use the autest preset:

```sh
cmake --preset autest
cmake --build build-autest
cmake --install build-autest
```

Then run the full suite via the autest target:

```sh
cmake --build build -t autest
```

This builds ATS, installs it to a temporary directory, sets up the Python virtual environment, and runs all tests.
After the first full run, a helper script is generated at build/tests/autest.sh:

```sh
# Run a single test
./build/tests/autest.sh --filter=cache_generation

# Run tests matching a pattern
./build/tests/autest.sh --filter=tls_*
```

For faster execution on multi-core machines:
```sh
python3 tests/autest-parallel.py -j 16 \
    --ats-bin /tmp/ats-test/bin \
    --build-root build \
    --sandbox /tmp/autest-parallel
```

Key options:
- `-j N` — Number of parallel workers (default: number of CPU cores)
- `--ats-bin` — Path to the ATS install bin directory
- `--build-root` — Path to the build directory (for test plugins)
- `--sandbox` — Directory for test sandboxes
- `-v` — Verbose output with real-time progress
- `--list` — List all tests without running them
Tests that can't run in parallel are listed in tests/serial_tests.txt and run sequentially after the parallel phase.
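The split between the parallel and serial phases can be sketched in plain Python. This helper is illustrative only (it is not part of autest-parallel.py); only the serial_tests.txt convention comes from the text above:

```python
def split_tests(all_tests, serial_tests):
    """Partition test names into a parallel batch and an ordered serial batch.

    serial_tests holds the names read from tests/serial_tests.txt; those
    tests run sequentially after the parallel phase completes.
    """
    serial_set = set(serial_tests)
    parallel = [t for t in all_tests if t not in serial_set]
    serial = [t for t in all_tests if t in serial_set]
    return parallel, serial
```

For example, if serial_tests.txt lists `port_bind`, then `split_tests(["tls_basic", "port_bind"], ["port_bind"])` returns `(["tls_basic"], ["port_bind"])`.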
Use the CI Docker images to match the CI environment exactly:

```sh
docker pull controller.trafficserver.org/ats/centos:8
docker run -it -u 1200:1200 --init --cap-add=SYS_PTRACE \
    --network=host controller.trafficserver.org/ats/centos:8 /bin/bash
```

Then build and run tests inside the container.
Unit tests use Catch2 and live in the same directory as the code they test. Add your test file to the relevant CMakeLists.txt:

```cpp
#include "catch.hpp"

TEST_CASE("MyFeature", "[my_tag]")
{
  REQUIRE(my_function() == expected_value);
}
```

AuTest files go in tests/gold_tests/ in an appropriate subdirectory. Each test file is a Python script named *.test.py.
```python
Test.Summary = "Description of what this test verifies"
Test.ContinueOnFail = True

# Create an ATS process
ts = Test.MakeATSProcess("ts")

# Create an origin server
server = Test.MakeOriginServer("server")

# Configure the origin server responses
request_header = {
    "headers": "GET / HTTP/1.1\r\nHost: www.example.com\r\n\r\n",
    "timestamp": "1469733493.993",
    "body": ""
}
response_header = {
    "headers": "HTTP/1.1 200 OK\r\nConnection: close\r\n\r\n",
    "timestamp": "1469733493.993",
    "body": ""
}
server.addResponse("sessionlog.json", request_header, response_header)

# Configure ATS
ts.Disk.remap_config.AddLine(
    'map http://www.example.com http://127.0.0.1:{0}'.format(server.Variables.Port)
)

# Add a test run
tr = Test.AddTestRun()
tr.Processes.Default.Command = (
    'curl -s -o /dev/null -w "%{{http_code}}" '
    'http://127.0.0.1:{0}/'.format(ts.Variables.port)
)
tr.Processes.Default.ReturnCode = 0
tr.Processes.Default.Streams.stdout = Testers.ContainsExpression("200", "Expected 200 OK")
tr.StillRunningAfter = ts
```

For new tests, prefer Test.ATSReplayTest() with Proxy Verifier YAML replay files. This approach is more declarative and easier to maintain:
```yaml
# replay.yaml
sessions:
- transactions:
  - client-request:
      method: GET
      url: /path
      version: '1.1'
      headers:
        fields:
        - [ Host, www.example.com ]
    server-response:
      status: 200
      headers:
        fields:
        - [ Content-Length, '16' ]
      content:
        size: 16
    proxy-response:
      status: 200
```
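To make the relationships between the replay fields concrete, here is a small sanity check written as a plain Python sketch. It is not part of Proxy Verifier; the dict simply mirrors the YAML above:

```python
def check_transaction(txn):
    """Sanity-check a replay transaction dict shaped like the YAML above."""
    server = txn["server-response"]
    proxy = txn["proxy-response"]
    # proxy-response states what the client should receive back from ATS;
    # for a simple pass-through it mirrors the origin status.
    assert proxy["status"] == server["status"]
    # The Content-Length header should agree with the declared body size.
    fields = dict((name, value) for name, value in server["headers"]["fields"])
    assert int(fields["Content-Length"]) == server["content"]["size"]
    return True

# The same transaction as the YAML, expressed as a Python dict.
txn = {
    "client-request": {"method": "GET", "url": "/path", "version": "1.1"},
    "server-response": {
        "status": 200,
        "headers": {"fields": [["Content-Length", "16"]]},
        "content": {"size": 16},
    },
    "proxy-response": {"status": 200},
}
```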
Commonly used AuTest APIs:

- `Test.MakeATSProcess(name)` — Create an ATS instance
- `Test.MakeOriginServer(name)` — Create a mock origin server
- `Test.MakeDNServer(name)` — Create a mock DNS server
- `Test.AddTestRun()` — Add a test step
- `Test.SkipUnless(Condition.HasATSFeature('feature'))` — Conditionally skip tests
- `Test.SkipUnless(Condition.PluginExists('plugin.so'))` — Skip if the plugin is not built
```python
ts = Test.MakeATSProcess("ts")

# Modify records.yaml
ts.Disk.records_config.update({
    'proxy.config.http.cache.generation': -1,
    'proxy.config.diags.debug.enabled': 1,
    'proxy.config.diags.debug.tags': 'http',
})

# Add remap rules
ts.Disk.remap_config.AddLine(
    'map / http://127.0.0.1:{0}/'.format(server.Variables.Port)
)

# Add plugin configuration
ts.Disk.plugin_config.AddLine('xdebug.so')
```

When a test fails, examine the sandbox directory for debugging. The sandbox contains:
- ATS configuration files as generated for the test
- Log files (error.log, diags.log, squid.log)
- Process stdout/stderr output
- Any test artifacts
The sandbox location is printed in the test output. In CI, it's available as a build artifact (see Continuous Integration).
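A throwaway helper like the following can speed up sandbox triage. It is a hypothetical sketch, not part of AuTest: it greps the sandbox's *.log files (such as error.log and diags.log from the list above) for error markers:

```python
import os

def collect_errors(sandbox_dir, markers=("ERROR", "FATAL")):
    """Scan *.log files under a sandbox for lines containing error markers."""
    hits = []
    for root, _dirs, files in os.walk(sandbox_dir):
        for name in files:
            if not name.endswith(".log"):
                continue
            path = os.path.join(root, name)
            with open(path, errors="replace") as f:
                for lineno, line in enumerate(f, 1):
                    if any(marker in line for marker in markers):
                        hits.append((path, lineno, line.rstrip()))
    return hits
```

Each hit is a `(path, line_number, line)` tuple, so the output points directly at the offending log lines.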
CI generates lcov coverage reports for each supported branch. To view them:

- Go to Jenkins
- Select the branch tab
- Open the `coverage` project
- View "Last Successful Artifacts" → `output/index.html`
- Continuous Integration — How tests run in CI
- Getting Started — Setting up your build environment
- AuTest Documentation — General AuTest framework docs
- tests/README.md — In-repo test documentation
Copyright 2025, dev@trafficserver.apache.org. Apache License, Version 2.0