33 changes: 33 additions & 0 deletions .github/workflows/build-and-test.yml
@@ -17,6 +17,12 @@ on:
permissions:
contents: read

# Keep these defaults in sync with contrib/setup-dashd.py
env:
DASHVERSION: "23.1.0"
TEST_DATA_REPO: "dashpay/regtest-blockchain"
TEST_DATA_VERSION: "v0.0.2"

jobs:
test:
name: ${{ matrix.group }}
@@ -41,8 +47,26 @@ jobs:
uses: taiki-e/install-action@cargo-llvm-cov

- run: pip install pyyaml

# Set up dashd and test data for groups that need it
- name: Cache dashd and test data
if: matrix.group == 'spv' || matrix.group == 'ffi'
uses: actions/cache@v4
with:
path: .rust-dashcore-test
key: rust-dashcore-test-${{ inputs.os }}-${{ env.DASHVERSION }}-${{ env.TEST_DATA_REPO }}-${{ env.TEST_DATA_VERSION }}

- name: Setup dashd for integration tests
if: matrix.group == 'spv' || matrix.group == 'ffi'
env:
CACHE_DIR: ${{ github.workspace }}/.rust-dashcore-test
shell: bash
run: python contrib/setup-dashd.py >> "$GITHUB_ENV"

- name: Run tests
id: tests
env:
DASHD_TEST_RETAIN_DIR: ${{ (matrix.group == 'spv' || matrix.group == 'ffi') && '/tmp/dashd-test-logs' || '' }}
run: >
python .github/scripts/ci_config.py run-group ${{ matrix.group }}
--os ${{ inputs.os }}
@@ -60,3 +84,12 @@ jobs:
flags: ${{ steps.tests.outputs.crate_flags }}
token: ${{ secrets.CODECOV_TOKEN }}
fail_ci_if_error: true

- name: Upload failed dashd test logs
if: failure() && (matrix.group == 'spv' || matrix.group == 'ffi')
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.group }}-test-logs-${{ inputs.os }}
path: /tmp/dashd-test-logs/
retention-days: 7
if-no-files-found: ignore
2 changes: 2 additions & 0 deletions .github/workflows/sanitizer.yml
@@ -40,6 +40,7 @@ jobs:
RUSTFLAGS: "-Zsanitizer=address -Cdebuginfo=2 -Cforce-frame-pointers=yes"
ASAN_OPTIONS: "symbolize=1:allow_addr2line=1"
LSAN_OPTIONS: "fast_unwind_on_malloc=0"
SKIP_DASHD_TESTS: 1
run: |
# FFI crates (C interop)
cargo +nightly test -Zbuild-std --target x86_64-unknown-linux-gnu \
@@ -63,6 +64,7 @@
RUST_BACKTRACE: 1
RUSTFLAGS: "-Zsanitizer=thread -Cdebuginfo=2"
TSAN_OPTIONS: "second_deadlock_stack=1"
SKIP_DASHD_TESTS: 1
run: |
# Async crate with concurrent code
cargo +nightly test -Zbuild-std --target x86_64-unknown-linux-gnu \
28 changes: 28 additions & 0 deletions CLAUDE.md
@@ -101,6 +101,34 @@ DO_LINT=true ./contrib/test.sh
DO_FMT=true ./contrib/test.sh
```

### Integration Tests (dashd)

The `dash-spv` and `dash-spv-ffi` crates include integration tests that run against a real `dashd` regtest node. These tests cover SPV sync, wallet operations, restarts, disconnections, and transactions.

**Setup:** `contrib/setup-dashd.py` downloads the dashd binary and regtest blockchain test data, caching them in `~/.rust-dashcore-test/`. It outputs the required environment variables.

```bash
eval $(python3 contrib/setup-dashd.py)
```

**Running:**
```bash
cargo test -p dash-spv dashd_sync
cargo test -p dash-spv-ffi --test dashd_sync
SKIP_DASHD_TESTS=1 cargo test # skip when dashd is unavailable
```

**Debugging:**
- `DASHD_TEST_LOG=1` — enable per-test console logging (use with `--nocapture`)
- `DASHD_TEST_RETAIN_DIR=<path>` — retain test data directories on failure
- `DASHD_TEST_RETAIN_ALWAYS=1` — retain even on success
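
These switches can be combined when rerunning a failing test. A sketch (the retain path here is an arbitrary example, not a project convention):

```bash
# Rerun one failing test with console logging and retained data dirs.
# /tmp/dashd-debug is an example path; any writable directory works.
DASHD_TEST_LOG=1 \
DASHD_TEST_RETAIN_DIR=/tmp/dashd-debug \
cargo test -p dash-spv dashd_sync -- --nocapture
```

On failure, the node's datadir is kept under the retain path for inspection instead of being cleaned up.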

**Key files:**
- `dash-spv/tests/dashd_sync/` — test modules (basic, restart, disconnect, transaction)
- `dash-spv-ffi/tests/dashd_sync/` — FFI test modules (basic, restart, transaction, callback)
- `dash-spv/src/test_utils/` — shared infrastructure (`DashdTestContext`, `DashCoreNode`)
- `.github/ci-groups.yml` — CI test group definitions (`spv` and `ffi` groups run dashd tests)

## Development Commands

### Linting and Formatting
160 changes: 160 additions & 0 deletions contrib/setup-dashd.py
@@ -0,0 +1,160 @@
#!/usr/bin/env python3
"""Cross-platform setup script for dashd and test blockchain data.

Downloads the Dash Core binary and regtest test data for integration tests.
Outputs DASHD_PATH and DASHD_DATADIR lines suitable for appending to GITHUB_ENV
or evaluating in a shell.

Environment variables:
DASHVERSION - Dash Core version (default: 23.1.0)
TEST_DATA_VERSION - Test data release version (default: v0.0.2)
TEST_DATA_REPO - GitHub repo for test data (default: dashpay/regtest-blockchain)
CACHE_DIR - Cache directory (default: ~/.rust-dashcore-test)
"""

import os
import platform
import sys
import tarfile
import time
import urllib.request
import zipfile

# Keep these defaults in sync with .github/workflows/build-and-test.yml
DASHVERSION = os.environ.get("DASHVERSION", "23.1.0")
TEST_DATA_VERSION = os.environ.get("TEST_DATA_VERSION", "v0.0.2")
TEST_DATA_REPO = os.environ.get("TEST_DATA_REPO", "dashpay/regtest-blockchain")


def get_cache_dir():
if "CACHE_DIR" in os.environ:
return os.environ["CACHE_DIR"]
home = os.environ.get("HOME") or os.environ.get("USERPROFILE")
if not home:
sys.exit("Cannot determine home directory: neither HOME nor USERPROFILE is set")
return os.path.join(home, ".rust-dashcore-test")


def get_asset_info():
"""Return the asset filename for the current platform."""
system = platform.system()
machine = platform.machine()

if system == "Linux":
linux_archs = {"aarch64": "aarch64", "arm64": "aarch64", "x86_64": "x86_64", "amd64": "x86_64"}
arch = linux_archs.get(machine)
if not arch:
sys.exit(f"Unsupported Linux architecture: {machine}")
asset = f"dashcore-{DASHVERSION}-{arch}-linux-gnu.tar.gz"
elif system == "Darwin":
darwin_archs = {"arm64": "arm64", "x86_64": "x86_64"}
arch = darwin_archs.get(machine)
if not arch:
sys.exit(f"Unsupported macOS architecture: {machine}")
asset = f"dashcore-{DASHVERSION}-{arch}-apple-darwin.tar.gz"
elif system == "Windows":
asset = f"dashcore-{DASHVERSION}-win64.zip"
else:
sys.exit(f"Unsupported platform: {system}")

return asset


def log(msg):
print(msg, file=sys.stderr)


def download(url, dest, timeout=300, retries=3):
for attempt in range(1, retries + 1):
try:
log(f"Downloading {url} (attempt {attempt}/{retries})...")
with urllib.request.urlopen(url, timeout=timeout) as response:
with open(dest, "wb") as f:
while chunk := response.read(8192):
f.write(chunk)
return
except Exception as e:
log(f"Download failed: {e}")
if attempt == retries:
sys.exit(f"Failed to download {url} after {retries} attempts")
time.sleep(5 * attempt)


def extract(archive_path, dest_dir):
if archive_path.endswith(".zip"):
with zipfile.ZipFile(archive_path, "r") as zf:
zf.extractall(dest_dir)
else:
with tarfile.open(archive_path, "r:gz") as tf:
tf.extractall(dest_dir, filter="data")


def setup_dashd(cache_dir):
"""Download and extract dashd binary. Returns the path to the dashd binary."""
asset = get_asset_info()
dashd_dir = os.path.join(cache_dir, f"dashcore-{DASHVERSION}")

ext = ".exe" if platform.system() == "Windows" else ""
dashd_bin = os.path.join(dashd_dir, "bin", f"dashd{ext}")

if os.path.isfile(dashd_bin):
log(f"dashd {DASHVERSION} already available")
return dashd_bin

log(f"Downloading dashd {DASHVERSION}...")
archive_path = os.path.join(cache_dir, asset)
url = f"https://github.com/dashpay/dash/releases/download/v{DASHVERSION}/{asset}"
download(url, archive_path)
extract(archive_path, cache_dir)
os.remove(archive_path)
log(f"Downloaded dashd to {dashd_dir}")

if not os.path.isfile(dashd_bin):
sys.exit(f"Expected binary not found after extraction: {dashd_bin}")

return dashd_bin


def setup_test_data(cache_dir):
"""Download and extract test blockchain data. Returns the datadir path."""
test_data_dir = os.path.join(
cache_dir, f"regtest-blockchain-{TEST_DATA_VERSION}", "regtest-40000"
)
blocks_dir = os.path.join(test_data_dir, "regtest", "blocks")

if os.path.isdir(blocks_dir):
log(f"Test blockchain data {TEST_DATA_VERSION} already available")
return test_data_dir

log(f"Downloading test blockchain data {TEST_DATA_VERSION}...")
parent_dir = os.path.join(cache_dir, f"regtest-blockchain-{TEST_DATA_VERSION}")
os.makedirs(parent_dir, exist_ok=True)

archive_path = os.path.join(cache_dir, "regtest-40000.tar.gz")
url = f"https://github.com/{TEST_DATA_REPO}/releases/download/{TEST_DATA_VERSION}/regtest-40000.tar.gz"
download(url, archive_path)
extract(archive_path, parent_dir)
os.remove(archive_path)

if not os.path.isdir(blocks_dir):
sys.exit(f"Expected blocks directory not found after extraction: {blocks_dir}")

log(f"Downloaded test data to {test_data_dir}")

return test_data_dir


def main():
cache_dir = get_cache_dir()
os.makedirs(cache_dir, exist_ok=True)

dashd_path = setup_dashd(cache_dir)
datadir = setup_test_data(cache_dir)

# Output lines for GITHUB_ENV or shell eval
print(f"DASHD_PATH={dashd_path}")
print(f"DASHD_DATADIR={datadir}")


if __name__ == "__main__":
main()
3 changes: 2 additions & 1 deletion dash-spv-ffi/Cargo.toml
Expand Up @@ -33,9 +33,10 @@ rand = "0.8"
clap = { version = "4.5", features = ["derive"] }

[dev-dependencies]
tempfile = "3.8"
dash-spv = { path = "../dash-spv", features = ["test-utils"] }
serial_test = "3.0"
env_logger = "0.10"
tempfile = "3.8"

[build-dependencies]
cbindgen = "0.29"