Merged
95 commits
16ddb98
feat: Add virtual environment support for simple_text_node
pollockjj Nov 14, 2025
aee01de
feat: Add host path preservation for spawn context
pollockjj Nov 14, 2025
4027e0c
fix: Add torch dev/nightly build support
pollockjj Nov 14, 2025
58ba87d
feat: Update logging emojis to enhance clarity and add new tests for …
pollockjj Nov 14, 2025
7f2b455
refactor: Update logging messages to use consistent book emoji for cl…
pollockjj Nov 14, 2025
13e5c72
Merge remote changes, resolve emoji conflicts (keep book emoji)
pollockjj Nov 14, 2025
f045210
feat: Enhance documentation and setup guide; improve error handling a…
pollockjj Nov 16, 2025
458d11a
Enhance logging for path unification and dependency updates; prevent …
pollockjj Nov 21, 2025
846e4cc
feat: Enhance RPC handling with callback support; improve logging and…
pollockjj Nov 22, 2025
ef0a5d7
Add timing and logging for extension loading; ensure 'uv' installatio…
pollockjj Nov 25, 2025
9f97425
refactor: Change log level to debug for cached environment fingerprin…
pollockjj Nov 25, 2025
1a6ef9b
refactor: Remove timing logs for extension launch process
pollockjj Nov 25, 2025
a54aa19
feat: Implement tensor conversion functions for safe IPC serializatio…
pollockjj Nov 27, 2025
d785fa3
interim work - Add global RPC instance management for child processes…
pollockjj Nov 27, 2025
4baa626
feat: Enhance virtual environment creation and dependency installatio…
pollockjj Nov 27, 2025
8699bfa
fix: Update path handling in build_child_sys_path to clarify comfy_ex…
pollockjj Nov 27, 2025
e2ad9e7
Added logging initialization to host logging level ('INFO') for usage…
pollockjj Nov 30, 2025
cd07469
CLIP Stateless RPC Proxy with automatic serialization
pollockjj Nov 30, 2025
8d83a4b
feat: implement .pth-based venv inheritance for torch sharing
pollockjj Dec 1, 2025
716f728
feat: enhance serialization for ModelPatcher and ModelPatcherProxy; s…
pollockjj Dec 1, 2025
dbca9bb
feat(windows): Add Windows compatibility via Manager-based IPC queues
pollockjj Dec 1, 2025
28e7018
feat(serialization): enhance tensor serialization to handle custom di…
pollockjj Dec 1, 2025
e0540f8
refactor(logging): streamline logging in Extension class and remove r…
pollockjj Dec 2, 2025
fcb2489
feat(logging): add filter to suppress noisy ComfyUI startup logs and …
pollockjj Dec 2, 2025
b4cdd9a
feat(process): implement lazy initialization for isolated processes a…
pollockjj Dec 2, 2025
99e742e
feat(config): implement dynamic discovery of torch ecosystem packages…
pollockjj Dec 2, 2025
465c72b
fixing unit tests
pollockjj Dec 3, 2025
029ae2c
docs: update README and client module with environment variable detai…
pollockjj Dec 3, 2025
2b41358
feat(tools): add CLI tool for route extraction and verification
pollockjj Dec 3, 2025
447a0a1
fix(logging): prevent duplicate log prefixes in child process by remo…
pollockjj Dec 3, 2025
1b123f6
fix(multiprocessing): improve process management for cross-platform c…
pollockjj Dec 5, 2025
4875936
adhoc v3 work
pollockjj Dec 5, 2025
809814d
feat(logging): implement logging queue for isolated processes and imp…
pollockjj Dec 7, 2025
04a14e5
fix(extension): preparing for release - pyisolate moved to a standard…
pollockjj Dec 7, 2025
77f7de7
Debug and development documentation and logging clean-up. Clarified a…
pollockjj Dec 7, 2025
03b087b
Remove ComfyUI Hello World example and associated files
pollockjj Dec 7, 2025
0e7b84c
feat(deserialization): enhance async handling in deserialize_from_iso…
pollockjj Dec 9, 2025
1f7e4c9
feat(serialization): enhance serialization and deserialization for Re…
pollockjj Dec 9, 2025
3f8bf6c
feat(logging): implement deduplication filter for log messages and su…
pollockjj Dec 9, 2025
257244c
fix(client): ensure logging handlers are removed only when both envir…
pollockjj Dec 9, 2025
619ce36
feat(serialization): add support for returning reference dicts for ex…
pollockjj Dec 10, 2025
3a8f05e
restore original documentation and ensure documentation meets generic…
pollockjj Dec 10, 2025
1e05fa7
fix(client): rename comfy_root to preferred_root for clarity in path …
pollockjj Dec 10, 2025
1f4214b
feat(container): enhance AttrDict and AttributeContainer with additio…
pollockjj Dec 10, 2025
09613c7
feat(serialization): enhance serialization for ModelPatcher and Model…
pollockjj Dec 10, 2025
c878955
feat(cuda): add support for CUDA IPC in tensor transport and serializ…
pollockjj Dec 10, 2025
2ec9e0e
fix(serialization): remove unnecessary logging in serialize_for_isola…
pollockjj Dec 10, 2025
7fd0267
feat(documentation): enhance README with detailed ComfyUI V3 API supp…
pollockjj Dec 10, 2025
2c55eb7
refactor: Address 38 PR review comments + 96% test coverage + adapter…
pollockjj Dec 16, 2025
2f23230
PR3 Changes
pollockjj Dec 16, 2025
7433c94
PR3 updates - additional type hints to use built-in types and improve…
pollockjj Dec 16, 2025
50a9557
feat: pyisolate decoupling from ComfyUI - part 2 (complete)
pollockjj Dec 17, 2025
93c6936
refactor(tests): replace initial basic tests with contract-based test…
pollockjj Dec 20, 2025
9f4ef60
fix: RPC event loop stale reference on serial workflow execution
pollockjj Dec 21, 2025
48e313f
Replace pickle with JSON-RPC over Unix Domain Sockets
pollockjj Dec 22, 2025
3b47c9e
fix: prevent host serialization corruption from PYISOLATE_CHILD env var
pollockjj Dec 26, 2025
47a6ee0
feat: auto-populate RPC services from adapter in Extension class
pollockjj Dec 27, 2025
aeb4698
fix: change logging level to debug for existing serializer overwrite
pollockjj Dec 27, 2025
a7a5a65
Implemented `CallableProxy` in `pyisolate.shared` to preserve signatu…
pollockjj Dec 29, 2025
80ae924
feat: add generic registry lookup for deserialization in JSONSocketTr…
pollockjj Dec 29, 2025
51dd2f8
fix: switch to untyped storage, removed unnecessary whitespace in sha…
pollockjj Dec 29, 2025
8ca943d
fix: general log spam clean-up
pollockjj Dec 29, 2025
39fab22
fix: update logging format in uds_client and improve tensor serializa…
pollockjj Dec 30, 2025
84ecea1
refactor: Remove `_tensor_to_cuda` and associated error handling from…
pollockjj Dec 30, 2025
04efa92
Refactor shared.py into rpc_protocol, rpc_transports, and rpc_serial…
pollockjj Dec 30, 2025
09d7278
refactoring and linting clean-up
pollockjj Dec 30, 2025
937a9f2
removing debug logging
pollockjj Dec 30, 2025
02feed0
refactor: reorganize imports and remove unused test cases
pollockjj Dec 30, 2025
2c239a2
refactor: implement adapter registry for isolation adapters and updat…
pollockjj Dec 30, 2025
d688f46
fix ruff and mypy linting errors and type annotations
pollockjj Dec 30, 2025
a50a329
feat: implement graceful shutdown for RPC and add corresponding tests
pollockjj Dec 30, 2025
0af7562
refactor: remove unnecessary logging in bootstrap and host modules
pollockjj Dec 31, 2025
74a8fcb
feat: Re-implemented full sandboxing for linux isolated children
pollockjj Dec 31, 2025
9d6fd09
feat: dynamically check `PYISOLATE_ENABLE_CUDA_IPC` for runtime CUDA …
pollockjj Jan 3, 2026
d5e344d
Fixing linting errors, writing integration tests that match current c…
pollockjj Jan 5, 2026
f8a71b8
refactor: adjust tensor keeper logging to debug
pollockjj Jan 5, 2026
83fb88e
chore: update project configuration and dependencies
pollockjj Jan 5, 2026
1a149c1
feat: allow configurable network access for sandbox by conditionally …
pollockjj Jan 6, 2026
cc269ea
fix: Implement eager serialization of RPC arguments to prevent race c…
pollockjj Jan 7, 2026
2c42a56
refactor: return plain dictionary instead of AttrDict after tensor co…
pollockjj Jan 7, 2026
c50377a
Fix for issue uncovered during LayerStyle investigation
pollockjj Jan 7, 2026
cec8eca
general comment clean-up
pollockjj Jan 7, 2026
ec7ef92
fix linting issues
pollockjj Jan 9, 2026
d52a779
correcting allowed characters based on .toml spec
pollockjj Jan 9, 2026
4492d7c
Merge branch 'main' of https://github.com/pollockjj/pyisolate
pollockjj Jan 9, 2026
30dffd5
Refactor IPC handling for Windows and add socket utilities
pollockjj Jan 10, 2026
8b2265b
Fix site-packages path detection in create_venv for Windows
pollockjj Jan 10, 2026
8246ea4
Pure clean-up of auxiliary library files and minor aligning with exis…
pollockjj Jan 11, 2026
9b06722
Relocate rpc.run() call in async UDS entrypoint
pollockjj Jan 20, 2026
f11b651
fix: CI compatibility - ruff format, Python 3.10+, bubblewrap
pollockjj Jan 21, 2026
6afd232
feat: add SandboxMode config (REQUIRED/DISABLED) with security warning
pollockjj Jan 21, 2026
dfe4f23
documentation and testing improvements and minor refactoring to support
pollockjj Jan 22, 2026
54ec576
ci: add build-wheels workflow with guarded PyPI publish
pollockjj Feb 13, 2026
6353304
feat: make torch optional at import boundary with explicit runtime ga…
pollockjj Feb 13, 2026
41e5778
test: stabilize lifecycle provisioning with shared uv cache and timeo…
pollockjj Feb 13, 2026
12 changes: 12 additions & 0 deletions .coveragerc
@@ -0,0 +1,12 @@
[run]
parallel = True
sigterm = True
source = .
omit =
tests/*
.venv/*
branch = True

[report]
show_missing = True
skip_covered = False
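Because `parallel = True` makes each process write its own `.coverage.<machine>.<pid>.<random>` data file, the shards must be merged before any report is produced. A minimal sketch of the resulting workflow — the heredoc is a trimmed stand-in for the repo's `.coveragerc`, and the `coverage`/`pytest` invocations are the standard coverage.py commands, not something this PR adds:

```shell
# Write a trimmed copy of the config for illustration.
cat > /tmp/coveragerc-demo <<'EOF'
[run]
parallel = True
sigterm = True
branch = True
EOF

# Parallel mode => one .coverage.* shard per process; merge before reporting.
if grep -q '^parallel = True' /tmp/coveragerc-demo; then
  echo "coverage run -m pytest   # each process writes a .coverage.* shard"
  echo "coverage combine         # merge shards into a single .coverage"
  echo "coverage report          # honors show_missing / skip_covered from [report]"
fi
```

Without the `combine` step, `coverage report` in parallel mode finds no merged data file and fails.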
152 changes: 152 additions & 0 deletions .github/workflows/build-wheels.yml
@@ -0,0 +1,152 @@
name: Build Wheels

on:
pull_request:
push:
branches: [main]
workflow_dispatch:
release:
types: [published]

permissions:
contents: read

concurrency:
group: build-wheels-${{ github.workflow }}-${{ github.ref || github.run_id }}
cancel-in-progress: false
Comment on lines +14 to +16
🧹 Nitpick | 🔵 Trivial

Consider enabling cancel-in-progress for PR builds.

With cancel-in-progress: false, multiple pushes to the same PR branch will queue builds instead of canceling stale runs. This can waste CI minutes during active development. Consider splitting concurrency or using a conditional:

♻️ Suggested improvement
 concurrency:
   group: build-wheels-${{ github.workflow }}-${{ github.ref || github.run_id }}
-  cancel-in-progress: false
+  cancel-in-progress: ${{ github.event_name == 'pull_request' }}

This preserves non-canceling behavior for main pushes and releases while canceling superseded PR runs.

jobs:
build:
name: Build ${{ matrix.os }} py${{ matrix.python-version }}
runs-on: ${{ matrix.os }}
timeout-minutes: 20
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
python-version: ["3.10", "3.11", "3.12"]
Comment on lines +23 to +27
🧹 Nitpick | 🔵 Trivial

Redundant cross-platform builds for a pure Python package.

Pyisolate has no C extensions and uses a standard setuptools build backend, which means python -m build produces identical py3-none-any wheels on every platform. Running 9 matrix legs that generate identical artifacts wastes CI minutes. The publish job even acknowledges this by deduplicating "byte-identical files produced in multiple matrix legs." Consider building on a single OS and running platform-specific test jobs separately if cross-platform validation is needed.

Additionally, Python 3.13 (released Oct 2024) should be added to the test matrix if the project supports it, as the requires-python = ">=3.10" constraint permits it.
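The "identical artifacts" point follows directly from PEP 427 wheel naming: the last three dash-separated fields of a wheel filename are the python/abi/platform tags, and a pure-Python build always emits `py3-none-any` regardless of build OS. A small sketch — the helper name and example filenames are illustrative, not from this repository:

```shell
# Succeeds iff a wheel filename carries the pure-Python tag triple (PEP 427).
is_pure_python() {
  local stem="${1%.whl}"
  local platform_tag="${stem##*-}"   # last field: platform tag
  stem="${stem%-*}"
  local abi_tag="${stem##*-}"        # second-to-last field: ABI tag
  [ "$abi_tag" = "none" ] && [ "$platform_tag" = "any" ]
}

is_pure_python "pyisolate-0.1.0-py3-none-any.whl" && echo "pure"          # pure
is_pure_python "numpy-2.1.0-cp312-cp312-win_amd64.whl" || echo "native"   # native
```

Every matrix leg of this workflow would produce the same `py3-none-any` filename, which is exactly why the publish job has to deduplicate colliding filenames.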



steps:
- name: Checkout
uses: actions/checkout@v4

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

- name: Install build tooling
run: |
python -m pip install --upgrade pip
python -m pip install build

- name: Build wheel and sdist
run: python -m build

- name: Upload build artifacts
uses: actions/upload-artifact@v4
with:
name: dist-${{ matrix.os }}-py${{ matrix.python-version }}
path: dist/*
if-no-files-found: error
retention-days: 7

smoke-test:
name: Smoke Test Built Artifacts
runs-on: ubuntu-latest
timeout-minutes: 15
needs: [build]

steps:
- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: "3.11"

- name: Download artifacts
uses: actions/download-artifact@v4
with:
name: dist-ubuntu-latest-py3.11
path: artifacts

- name: Install wheel and run smoke test
shell: bash
run: |
python -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip

wheel_path="$(find artifacts -type f -name '*.whl' | head -n 1)"
if [ -z "$wheel_path" ]; then
echo "No wheel artifact found."
exit 1
fi
python -m pip install "$wheel_path"

sdist_path="$(find artifacts -type f -name '*.tar.gz' | head -n 1)"
if [ -z "$sdist_path" ]; then
echo "No sdist artifact found."
exit 1
fi

temp_dir="$(mktemp -d)"
cd "$temp_dir"
python -c "import pyisolate; print(pyisolate.__version__)"
Comment on lines +86 to +94
⚠️ Potential issue | 🟡 Minor

sdist is located but never tested.

Lines 86-90 find the sdist and fail if absent, but the path is never used — the sdist is neither installed nor verified. Either install it in a separate venv to confirm it builds/imports correctly, or remove this dead check (the publish job already validates sdist presence).

♻️ Option A: Remove dead check
          python -m pip install "$wheel_path"
 
-          sdist_path="$(find artifacts -type f -name '*.tar.gz' | head -n 1)"
-          if [ -z "$sdist_path" ]; then
-            echo "No sdist artifact found."
-            exit 1
-          fi
-
           temp_dir="$(mktemp -d)"
           cd "$temp_dir"
           python -c "import pyisolate; print(pyisolate.__version__)"
♻️ Option B: Actually test the sdist
          sdist_path="$(find artifacts -type f -name '*.tar.gz' | head -n 1)"
          if [ -z "$sdist_path" ]; then
            echo "No sdist artifact found."
            exit 1
          fi
 
          temp_dir="$(mktemp -d)"
          cd "$temp_dir"
          python -c "import pyisolate; print(pyisolate.__version__)"
+
+         # Verify sdist builds and installs correctly
+         python -m venv .venv-sdist
+         source .venv-sdist/bin/activate
+         python -m pip install --upgrade pip
+         python -m pip install "$OLDPWD/$sdist_path"
+         python -c "import pyisolate; print(pyisolate.__version__)"


publish:
name: Publish To PyPI (Trusted Publishing)
runs-on: ubuntu-latest
timeout-minutes: 10
needs: [build, smoke-test]
if: >-
github.repository == 'Comfy-Org/pyisolate' &&
github.event_name == 'release' &&
github.event.action == 'published' &&
github.event.release.tag_name != '' &&
startsWith(github.event.release.tag_name, 'v')
permissions:
id-token: write
contents: read
concurrency:
group: publish-pypi-${{ github.event.release.tag_name }}
cancel-in-progress: false

steps:
- name: Download artifacts
uses: actions/download-artifact@v4
with:
pattern: dist-*
merge-multiple: false
path: downloaded

- name: Collect distributions
shell: bash
run: |
mkdir -p dist
find downloaded -type f \( -name "*.whl" -o -name "*.tar.gz" \) -print0 | while IFS= read -r -d '' src; do
base="$(basename "$src")"
dest="dist/$base"
if [ -e "$dest" ]; then
# Deduplicate byte-identical files produced in multiple matrix legs.
if cmp -s "$src" "$dest"; then
continue
fi
echo "Conflicting distribution filename with different content: $base"
exit 1
fi
cp "$src" "$dest"
done

wheel_count="$(find dist -maxdepth 1 -type f -name '*.whl' | wc -l)"
sdist_count="$(find dist -maxdepth 1 -type f -name '*.tar.gz' | wc -l)"
if [ "$wheel_count" -eq 0 ] || [ "$sdist_count" -eq 0 ]; then
echo "Expected at least one wheel and one sdist for publish."
exit 1
fi

ls -l dist

Comment on lines +122 to +148
🧹 Nitpick | 🔵 Trivial

Correction to previous comment: GitHub Actions' bash shell prepends set -eo pipefail by default, so the exit 1 inside the piped while loop will propagate in this context. However, this reliance on implicit shell options is fragile — using process substitution makes the intent explicit and portable:

♻️ More robust alternative using process substitution
-          find downloaded -type f \( -name "*.whl" -o -name "*.tar.gz" \) -print0 | while IFS= read -r -d '' src; do
+          while IFS= read -r -d '' src; do
             base="$(basename "$src")"
             dest="dist/$base"
             if [ -e "$dest" ]; then
               if cmp -s "$src" "$dest"; then
                 continue
               fi
               echo "Conflicting distribution filename with different content: $base"
               exit 1
             fi
             cp "$src" "$dest"
-          done
+          done < <(find downloaded -type f \( -name "*.whl" -o -name "*.tar.gz" \) -print0)

⚠️ Potential issue | 🔴 Critical

exit 1 inside piped while loop won't fail the step.

The find ... | while read ... construct runs the while body in a subshell (in bash). The exit 1 on line 135 exits only that subshell — the outer script continues, potentially publishing conflicting distributions. Use process substitution or set -o pipefail to propagate the failure:

🐛 Proposed fix using process substitution
       - name: Collect distributions
         shell: bash
         run: |
+          set -euo pipefail
           mkdir -p dist
-          find downloaded -type f \( -name "*.whl" -o -name "*.tar.gz" \) -print0 | while IFS= read -r -d '' src; do
+          while IFS= read -r -d '' src; do
             base="$(basename "$src")"
             dest="dist/$base"
             if [ -e "$dest" ]; then
               # Deduplicate byte-identical files produced in multiple matrix legs.
               if cmp -s "$src" "$dest"; then
                 continue
               fi
               echo "Conflicting distribution filename with different content: $base"
               exit 1
             fi
             cp "$src" "$dest"
-          done
+          done < <(find downloaded -type f \( -name "*.whl" -o -name "*.tar.gz" \) -print0)
 
           wheel_count="$(find dist -maxdepth 1 -type f -name '*.whl' | wc -l)"
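The subshell behavior behind this issue is easy to reproduce outside CI. A standalone sketch (plain `bash -c`, with no implicit `set -eo pipefail`, so only the subshell semantics are in play):

```shell
# 1) Pipeline form: `exit 1` only leaves the subshell running the loop.
bash -c '
  printf "a\n" | while read -r line; do
    exit 1                # exits the pipeline subshell, not the script
  done
  echo "still running"    # reached; overall exit status is 0
'

# 2) Process substitution: the loop runs in the main shell, so `exit 1`
#    terminates the whole script -- which is what a CI step needs.
rc=0
bash -c '
  while read -r line; do
    exit 1
  done < <(printf "a\n")
  echo "unreachable"
' || rc=$?
echo "process-substitution exit code: $rc"   # process-substitution exit code: 1
```

With GitHub's `shell: bash` the implicit `pipefail` plus `-e` usually rescues the pipeline form, as the earlier nitpick notes, but the process-substitution form fails correctly without relying on those implicit options.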

- name: Publish to PyPI via OIDC
uses: pypa/gh-action-pypi-publish@release/v1
with:
packages-dir: dist
Comment on lines +149 to +152
⚠️ Potential issue | 🟠 Major

Pin gh-action-pypi-publish to a commit SHA for supply-chain safety.

Using @release/v1 (a branch ref) means the action's contents can change without any update to this workflow. For a publishing step with id-token: write permissions, this is a security-sensitive path. Pin to the current release commit SHA:

🔒 Suggested fix
       - name: Publish to PyPI via OIDC
-        uses: pypa/gh-action-pypi-publish@release/v1
+        uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e  # v1.13.0
         with:
           packages-dir: dist

The same recommendation applies to actions/checkout@v4, actions/setup-python@v5, actions/upload-artifact@v4, and actions/download-artifact@v4 — though the publish action is most critical given its elevated permissions.
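Applied across the workflow, the pattern is a full 40-character commit SHA plus a human-readable version comment. The SHAs below are placeholders to be resolved from each action's official repository, not real values:

```yaml
steps:
  - name: Checkout
    uses: actions/checkout@0000000000000000000000000000000000000000  # v4.x.x (placeholder SHA)
  - name: Setup Python
    uses: actions/setup-python@0000000000000000000000000000000000000000  # v5.x.x (placeholder SHA)
```

Pairing pinned SHAs with Dependabot's `github-actions` ecosystem keeps the pins current without reintroducing mutable refs.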


22 changes: 13 additions & 9 deletions .github/workflows/ci.yml
Expand Up @@ -14,7 +14,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ['3.9', '3.10', '3.11', '3.12']
python-version: ['3.10', '3.11', '3.12']
os: [ubuntu-latest, ubuntu-22.04, ubuntu-24.04]

steps:
Expand All @@ -25,6 +25,9 @@ jobs:
with:
python-version: ${{ matrix.python-version }}

- name: Install bubblewrap
run: sudo apt-get update && sudo apt-get install -y bubblewrap

- name: Install uv
uses: astral-sh/setup-uv@v3

@@ -56,23 +59,23 @@
include:
- container: debian:11
python-install: |
apt-get update && apt-get install -y python3 python3-pip python3-venv git curl
apt-get update && apt-get install -y python3 python3-pip python3-venv git curl bubblewrap
extras: "dev,test"
- container: debian:12
python-install: |
apt-get update && apt-get install -y python3 python3-pip python3-venv git curl
apt-get update && apt-get install -y python3 python3-pip python3-venv git curl bubblewrap
extras: "dev,test"
- container: fedora:38
python-install: |
dnf install -y python3 python3-pip git curl
dnf install -y python3 python3-pip git curl bubblewrap
extras: "dev,test"
- container: fedora:39
python-install: |
dnf install -y python3 python3-pip git curl
dnf install -y python3 python3-pip git curl bubblewrap
extras: "dev,test"
- container: rockylinux:9
python-install: |
dnf install -y python3 python3-pip git
dnf install -y python3 python3-pip git bubblewrap
extras: "dev,test"

container: ${{ matrix.container }}
@@ -130,6 +133,7 @@ jobs:
ruff check pyisolate tests
ruff format --check pyisolate tests

# - name: Run mypy
# run: |
# mypy pyisolate
- name: Run mypy
run: |
source .venv/bin/activate
mypy pyisolate
9 changes: 6 additions & 3 deletions .github/workflows/pytorch.yml
@@ -14,7 +14,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ['3.9', '3.11']
python-version: ['3.10', '3.11']

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check if Python version requirements are documented in pyproject.toml or setup.py
echo "=== Checking pyproject.toml for python-requires ==="
if fd -t f "pyproject.toml" --max-one-result; then
  echo "Found pyproject.toml, checking for python requirement..."
  fd -t f "pyproject.toml" --exec cat {} \; | grep -E "(requires-python|python)" || echo "No explicit python requirement found in pyproject.toml"
fi

echo ""
echo "=== Checking setup.py for python_requires ==="
if fd -t f "setup.py" --max-one-result; then
  echo "Found setup.py, checking for python_requires..."
  fd -t f "setup.py" --exec cat {} \; | grep -i "python_requires" || echo "No python_requires found in setup.py"
fi

echo ""
echo "=== Checking setup.cfg for python version constraints ==="
if fd -t f "setup.cfg" --max-one-result; then
  echo "Found setup.cfg, checking for python version constraints..."
  fd -t f "setup.cfg" --exec cat {} \; | grep -i "python" || echo "No python constraints found"
fi

echo ""
echo "=== Checking README for Python version mentions ==="
if fd -t f "README*" --max-one-result; then
  fd -t f "README*" --exec grep -i "python 3\." {} \; || echo "No Python version mentioned in README"
fi

echo ""
echo "=== Checking workflow file to confirm the change ==="
cat -n .github/workflows/pytorch.yml | sed -n '15,20p'

Repository: Comfy-Org/pyisolate

Length of output: 1474


🏁 Script executed:

#!/bin/bash
# Search for any Python version requirements across the repo
echo "=== Searching for Python version mentions in all text files ==="
rg -i "python.*3\.[0-9]" --type py --type toml --type yaml --type md --type txt -B 2 -A 2 || echo "No Python version mentions found"

echo ""
echo "=== Checking if code uses Python 3.10+ specific features ==="
echo "Checking for match statements (Python 3.10+ feature)..."
rg "^\s*match\s+" --type py -l || echo "No match statements found"

echo ""
echo "=== Searching for any .python-version or pyenv files ==="
fd -t f ".python-version" -o "pyenv" --hidden || echo "No python-version or pyenv files"

echo ""
echo "=== Checking git history for the workflow change ==="
git log --oneline --all -- ".github/workflows/pytorch.yml" | head -20 || echo "Cannot access git history"

echo ""
echo "=== Checking file structure for setup/config files ==="
ls -la | grep -E "setup|pyproject|requirements" || echo "No setup/config files found in root"

Repository: Comfy-Org/pyisolate

Length of output: 4165


Update documentation to reflect Python 3.10+ minimum requirement.

The Python version matrix change from ['3.9', '3.11'] to ['3.10', '3.11'] aligns with the project's documented minimum Python version of >=3.10 in pyproject.toml. However, the following documentation files still incorrectly claim Python 3.9+ support and need to be updated:

  • README.md: Currently states "Python 3.9+"
  • README_COMFYUI.md: Currently states "Python 3.9+"
  • BENCHMARK_INSTRUCTIONS.md: Currently states "Python 3.9 or higher"
🤖 Prompt for AI Agents
In @.github/workflows/pytorch.yml at line 17, update the documentation strings
that still claim Python 3.9 support to reflect the new minimum of Python 3.10:
edit README.md and README_COMFYUI.md to change occurrences of "Python 3.9+" to
"Python 3.10+" and edit BENCHMARK_INSTRUCTIONS.md to change "Python 3.9 or
higher" to "Python 3.10 or higher" (search for those exact phrases to locate the
lines).
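The doc updates the prompt describes can be done in one sed pass. A sketch — the helper name is ours, and the exact phrases are taken from the files quoted above:

```shell
# Hypothetical helper: rewrite the stale minimum-version claims in place
# (GNU sed; the in-place flag differs on BSD/macOS sed).
bump_python_floor() {
  sed -i \
    -e 's/Python 3\.9+/Python 3.10+/g' \
    -e 's/Python 3\.9 or higher/Python 3.10 or higher/g' \
    "$@"
}

# Usage (files named in the review comment):
# bump_python_floor README.md README_COMFYUI.md BENCHMARK_INSTRUCTIONS.md
```

Running the `grep` searches from the review afterwards confirms no "Python 3.9" claims remain.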

pytorch-version: ['2.0.0', '2.1.0', '2.2.0', '2.3.0']

steps:
@@ -25,6 +25,9 @@ jobs:
with:
python-version: ${{ matrix.python-version }}

- name: Install bubblewrap
run: sudo apt-get update && sudo apt-get install -y bubblewrap

- name: Install uv
uses: astral-sh/setup-uv@v3

@@ -71,12 +74,12 @@ jobs:
- name: Install uv
uses: astral-sh/setup-uv@v3

- name: Install NVIDIA GPU drivers
- name: Install NVIDIA GPU drivers and bubblewrap
run: |
# Note: GitHub Actions doesn't have GPU support, but we can still test CUDA builds
# The tests will run on CPU but with CUDA-enabled PyTorch builds
sudo apt-get update
sudo apt-get install -y nvidia-cuda-toolkit
sudo apt-get install -y nvidia-cuda-toolkit bubblewrap

- name: Install PyTorch with CUDA
run: |
2 changes: 1 addition & 1 deletion .github/workflows/windows.yml
@@ -14,7 +14,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ['3.9', '3.10', '3.11', '3.12']
python-version: ['3.10', '3.11', '3.12']

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Verify declared Python support in packaging metadata and docs
rg -n --glob '!**/.venv/**' 'python_requires|Programming Language :: Python :: 3\.9|3\.9' pyproject.toml setup.cfg setup.py README* docs* 2>/dev/null || echo "No matches found or files don't exist"

Repository: Comfy-Org/pyisolate

Length of output: 162


Update README files to reflect minimum Python version change.
Line 17 removes Python 3.9 from the CI matrix, but README.md (line 11) and README_COMFYUI.md (line 19) still claim "Python 3.9+" support. Update these files to reflect the new minimum version (3.10) or restore 3.9 to the CI matrix if it should remain supported.

🤖 Prompt for AI Agents
In @.github/workflows/windows.yml at line 17, the README files still state
Python "3.9+" while the CI matrix in .github/workflows/windows.yml now tests
Python versions ['3.10','3.11','3.12']; update README.md and README_COMFYUI.md
to change any references of "Python 3.9+" (or similar phrasing) to "Python
3.10+" (or the correct minimum) so docs match the workflow, or alternatively
restore '3.9' into the python-version list in .github/workflows/windows.yml if
3.9 should remain supported.


steps:
- uses: actions/checkout@v4
3 changes: 3 additions & 0 deletions .gitignore
@@ -154,3 +154,6 @@ cython_debug/

# UV cache directory (for hardlinking optimization)
.uv_cache/

# Generated demo venvs
comfy_hello_world/node-venvs/
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -12,8 +12,8 @@ repos:
- id: debug-statements

- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.8
rev: v0.14.0
hooks:
- id: ruff
args: [--fix]
args: [--fix, --unsafe-fixes]
- id: ruff-format
Comment on lines 14 to 19

🧹 Nitpick | 🔵 Trivial

🧩 Analysis chain

🌐 Web query:

Ruff v0.14.0 release notes breaking changes

💡 Result:

Ruff 0.14.0 (released 2025-10-07) — breaking changes:

  • Updated the default and "latest" Python versions to 3.14. [1][2]

Sources:
[1] Ruff changelog (0.14.0).
[2] Ruff GitHub release notes (0.14.0).


🌐 Web query:

Ruff --unsafe-fixes pre-commit hook implications semantic changes

💡 Result:

Short answer: enabling --unsafe-fixes in a pre-commit hook can change program semantics or remove comments and therefore can break behavior — treat it as non-automatic and require human review. See official docs: Ruff classifies fixes as safe vs unsafe; unsafe fixes "could lead to a change in runtime behavior, the removal of comments, or both." [1][2].

Key implications and practical recommendations:

  • Risk: unsafe fixes may alter runtime behavior (e.g., rewrite constructs to faster but semantically-different forms) or remove information (comments) that matters for runtime/maintenance. Test suites may miss some regressions. [1][2][5]
  • Visibility: by default Ruff hides/does not apply unsafe fixes; you must opt into them with --unsafe-fixes or unsafe-fixes = true. That design signals extra caution. [2][4]
  • Pre-commit concerns: running --unsafe-fixes automatically on every commit can silently introduce behavioral changes into commits without explicit review — not recommended for unreviewed developer hooks. [2][4]
  • Safer alternatives:
    • Use only safe automatic fixes in pre-commit (omit --unsafe-fixes) and require manual/CI application of unsafe fixes. [1][7]
    • If you must allow unsafe fixes in a hook, restrict them to a narrow set of rules (promote specific rules via lint.extend-safe-fixes or configure fixable/unfixable rules) so the hook only applies well-understood transforms. [1]
    • Run fixes but require a staged/interactive review step (e.g., fail the hook if any unsafe-fix is available and surface the candidate changes for the developer to apply/review). [2][6]
    • Apply unsafe fixes behind CI jobs or feature-branch maintenance runs (with tests) rather than on every developer commit. [2][3]
  • Monitoring: enable Ruff's JSON output to see the applicability/safety metadata for each suggested fix, and keep a changelog or PR checklist when applying unsafe fixes. [1]

If you want, I can:

  • Suggest a concrete pre-commit config snippet that enables only safe fixes and surfaces unsafe-fix hints, or
  • Draft a policy for when and how to accept unsafe fixes (CI-only, rule whitelist, review checklist).

Sources:

  1. Ruff docs — Fixes / Fix safety. [astral.sh docs]. [1]
  2. Ruff release notes / blog explaining unsafe fixes are hidden by default and require --unsafe-fixes. [3][4]
  3. Ruff FAQ ("Ruff tried to fix something — but it broke my code"). [6]
  4. Settings docs (unsafe-fixes, fixable/unfixable, extend-safe-fixes). [7]
  5. Examples and explanation of unsafe transform (RUF015 example). [5]
  6. Calmcode explainer showing the CLI prompt and hidden unsafe fixes. [8]

Confirm Ruff v0.14.0 upgrade + unsafe-fix policy.

Auto-applying --unsafe-fixes can introduce semantic changes and remove comments; Ruff's documentation explicitly warns against this and hides unsafe fixes by default for good reason. Please confirm this is intentional for all commits. Ruff v0.14.0 is compatible with your hooks (main change: default Python version updated to 3.14).

If you want unsafe fixes only on demand, consider gating them behind a manual stage:

🔧 Gate unsafe fixes for manual review
   - repo: https://github.com/astral-sh/ruff-pre-commit
     rev: v0.14.0
     hooks:
       - id: ruff
-        args: [--fix, --unsafe-fixes]
+        args: [--fix]
+      - id: ruff
+        name: ruff-unsafe-fixes
+        args: [--fix, --unsafe-fixes]
+        stages: [manual]
       - id: ruff-format
🤖 Prompt for AI Agents
In @.pre-commit-config.yaml around lines 14-19, the pre-commit hook for Ruff
(repo https://github.com/astral-sh/ruff-pre-commit, rev v0.14.0) currently
passes the --unsafe-fixes flag on the hook with id: ruff; confirm whether you
intend to apply unsafe fixes on every commit — if not, remove --unsafe-fixes
from the args for the ruff hook (leaving ruff-format as-is) or else move the
unsafe option into a separate, opt-in hook (e.g., a new ruff-unsafe entry) or CI
job that runs only on demand and document its usage so unsafe fixes are not
auto-applied on all commits.
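Beyond the manual stage shown above, the `lint.extend-safe-fixes` setting mentioned in the query result offers a middle ground: promote only specific, well-understood rules so the default hook applies their fixes without `--unsafe-fixes`. A sketch in pyproject.toml — the rule codes here are illustrative, not a recommendation for this project:

```toml
# pyproject.toml — promote a narrow set of rules so their fixes are treated
# as safe and applied by the default hook (no --unsafe-fixes needed).
[tool.ruff.lint]
extend-safe-fixes = ["UP006", "UP007"]  # illustrative rule codes
```

With the manual-stage variant from the suggestion, developers would opt in explicitly via `pre-commit run ruff-unsafe-fixes --hook-stage manual --all-files`.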
