
Add CI, contributor infrastructure, and dual licensing #19

Merged
AnExiledDev merged 3 commits into main from repo-quality-checklist
Feb 26, 2026

Conversation


AnExiledDev (Owner) commented Feb 26, 2026

Summary

Addresses the public repo quality audit (REPO-AUDIT.md), raising the score from 22/50 (44%) → 42/50 (84%). Adds CI enforcement, contributor-facing files, issue/PR hygiene, and dual licensing infrastructure.

New files (9)

  • .github/workflows/ci.yml — test + lint jobs (Node 18, Biome)
  • .github/workflows/codeql.yml — CodeQL security analysis (JS)
  • .github/dependabot.yml — weekly updates for npm + GitHub Actions
  • .github/ISSUE_TEMPLATE/bug-report.yml — structured bug report form
  • .github/ISSUE_TEMPLATE/feature-request.yml — structured feature request form
  • .github/ISSUE_TEMPLATE/config.yml — blank issues + commercial licensing contact
  • .github/pull_request_template.md — PR checklist template
  • CONTRIBUTING.md — contribution guidelines, GPL-3.0 + CLA
  • CLA.md — Individual Contributor License Agreement
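
Based on the description above, the CI workflow presumably looks something like the following sketch. The job names `test` and `lint` and the Node 18 / Biome details come from this PR; the triggers and exact step layout are assumptions, not the committed file — see `.github/workflows/ci.yml` for the real thing.

```yaml
# Hypothetical sketch of .github/workflows/ci.yml; triggers and step
# details are assumptions based on the PR description.
name: CI
on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
      - run: npm test
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
      - run: npx @biomejs/biome check setup.js test.js
```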

Modified files (40)

  • README.md — CI badge, Contributing + License sections with commercial notice
  • LICENSE.txt — dual licensing header (GPL-3.0 + commercial)
  • .devcontainer/CHANGELOG.md — changelog entries for all additions
  • setup.js, test.js, 34 shell scripts — SPDX copyright headers

Post-merge actions

After this PR merges, run these gh commands:

# Add topics
gh repo edit --add-topic devcontainer,claude-code,ai-development,developer-tools,vscode,devpod

# Add breaking-change label
gh label create "breaking-change" -c "D93F0B" -d "Introduces a breaking change"

# Enable discussions
gh repo edit --enable-discussions

# Enable vulnerability alerts
gh api -X PUT repos/AnExiledDev/CodeForge/vulnerability-alerts

# Add required_status_checks to ruleset (after CI runs on main)
# Update ruleset 13201702 to require `test` and `lint` job status checks

Manual setup (outside this PR)

  • CLA Assistant — Install from cla-assistant.io, configure to reference CLA.md
  • Commercial license template — Prepare offline, does not go in repo

Test plan

  • npm test passes
  • CI workflow triggers on this PR (test + lint jobs)
  • CodeQL analysis initiates
  • Issue templates visible at Issues → New Issue after merge
  • PR template pre-fills on new PRs
  • CI badge renders in README
  • SPDX headers present in source files (spot-check)
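
The SPDX spot-check can be scripted rather than done by eye. A minimal sketch — the helper name `check_spdx` and the five-line search window are assumptions, not part of this PR:

```shell
# check_spdx: print a status line per file; return non-zero if any file
# is missing an SPDX tag within its first five lines.
check_spdx() {
  status=0
  for f in "$@"; do
    if head -n 5 "$f" | grep -q 'SPDX-License-Identifier:'; then
      echo "ok: $f"
    else
      echo "missing SPDX header: $f"
      status=1
    fi
  done
  return $status
}

# Example: check the JS entry points plus every devcontainer shell script.
# check_spdx setup.js test.js $(find .devcontainer -name '*.sh')
```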

Summary by CodeRabbit

  • Documentation

    • Added contribution guidelines and contributor license agreement
    • Implemented dual licensing (GPL-3.0 and Commercial) and updated README with CI badge and licensing details
    • Added Status Bar Widgets documentation
  • Chores

    • Introduced CI workflows for tests, linting, and security analysis; enabled Dependabot
    • Added issue and pull request templates and other repository metadata
    • Added SPDX license headers across scripts and tooling files

- CI workflow with test + lint jobs (Node 18, Biome)
- CodeQL security analysis (JS, weekly + PR triggers)
- Dependabot for npm and GitHub Actions dependencies
- Bug report and feature request issue templates (YAML forms)
- Pull request template with checklist
- CONTRIBUTING.md with GPL-3.0 and CLA guidelines
- CLA.md (Individual Contributor License Agreement)
- Dual licensing header in LICENSE.txt
- Contributing and License sections in README.md with CI badge
- SPDX copyright headers (GPL-3.0-only) in all 36 source files
- Updated REPO-AUDIT.md scorecard (22/50 → 42/50)

coderabbitai bot commented Feb 26, 2026

📝 Walkthrough

Adds SPDX GPL-3.0-only headers across many scripts and source files, introduces CI (test/lint) and CodeQL workflows, Dependabot, GitHub issue/PR templates, contributing and CLA documents, LICENSE updates, and README/CHANGELOG documentation entries.

Changes

Cohort / File(s) — Summary

  • Devcontainer & feature scripts (.devcontainer/connect-external-terminal.sh, .devcontainer/features/*/install.sh, .devcontainer/features/mcp-qdrant/poststart-hook.sh, .devcontainer/features/.../poststart-hook.sh, .devcontainer/scripts/*) — Inserted SPDX-License-Identifier: GPL-3.0-only and copyright header lines in shell scripts only; no logic or control-flow changes.
  • Source scripts (setup.js, test.js) — Added SPDX/copyright headers; updated Node module paths to the node: namespace and switched some string concatenations to template literals in setup.js; changed test formatting and assertion messaging in test.js.
  • GitHub workflows (.github/workflows/ci.yml, .github/workflows/codeql.yml) — Added a CI workflow (test + lint on Node 18) and a CodeQL analysis workflow with scheduled runs and a JS language matrix.
  • Issue & PR templates (.github/ISSUE_TEMPLATE/bug-report.yml, .github/ISSUE_TEMPLATE/feature-request.yml, .github/ISSUE_TEMPLATE/config.yml, .github/pull_request_template.md) — Added structured bug/feature issue templates and a PR template; configured issue template options and a contact link for commercial licensing.
  • Dependabot (.github/dependabot.yml) — Configured Dependabot updates for npm (root and /docs) and GitHub Actions on a weekly cadence.
  • Licensing & governance docs (LICENSE.txt, CLA.md, CONTRIBUTING.md) — Added a dual-licensing notice, an Individual Contributor License Agreement, and contribution guidelines (CLA requirement, code style, PR process).
  • Docs / meta (README.md, .devcontainer/CHANGELOG.md) — Added a CI badge, contributing/license notes, and unreleased changelog entries documenting templates, CI, and SPDX headers.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes


Poem

🐇 I hopped through files at break of day,
Headers snug in neat array,
CI hums, templates take their place,
CLA and license—order, grace.
A tiny rabbit leaves a trace ✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: docstring coverage is 33.33%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)
  • Description Check — ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title check — ✅ Passed: the PR title accurately summarizes the main changes (CI workflows, contributor infrastructure files, and dual licensing).


@github-advanced-security

This pull request sets up GitHub code scanning for this repository. Once the scans have completed and the checks have passed, the analysis results for this pull request branch will appear on this overview. Once you merge this pull request, the 'Security' tab will show more code scanning analysis results (for example, for the default branch). Depending on your configuration and choice of analysis tool, future pull requests will be annotated with code scanning analysis results. For more information about GitHub code scanning, check out the documentation.


coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
.devcontainer/features/claude-session-dashboard/install.sh (1)

12-13: ⚠️ Potential issue | 🔴 Critical

Validate PORT to prevent command injection.

The PORT variable is used to construct a shell alias (line 117) that gets written to shell rc files (line 132) without validation. If an attacker can control the PORT environment variable during container build, they could inject malicious shell commands that would execute every time the user opens a shell.

For example, setting PORT='7847"; rm -rf / #' would result in a dangerous alias being written to .bashrc/.zshrc.

🛡️ Proposed fix to add PORT validation
 # === IMPORT OPTIONS ===
 DASHBOARD_VERSION="${VERSION:-latest}"
 PORT="${PORT:-7847}"
 SHELLS="${SHELLS:-both}"
 USERNAME="${USERNAME:-automatic}"
+
+# === VALIDATE INPUT ===
+if ! [[ "${PORT}" =~ ^[0-9]+$ ]] || [ "${PORT}" -lt 1 ] || [ "${PORT}" -gt 65535 ]; then
+    echo "[claude-session-dashboard] ERROR: PORT must be a number between 1 and 65535"
+    exit 1
+fi

And move the existing validation section comment at line 43 down, or rename it to avoid duplication.

test.js (1)

5-7: ⚠️ Potential issue | 🟡 Minor

Address CI pipeline failures for linting errors.

The CI pipeline reports errors and warnings on these lines:

  1. Lines 5-6: Use node: protocol for built-in modules
  2. Line 7: copyDirectory and main are imported but never used (the "existence" tests on lines 13/16 don't actually use them)

The unused imports are flagged as errors and will fail CI. Consider either removing them or writing actual tests that exercise these functions.

🔧 Proposed fix for the import issues
-const fs = require('fs');
-const path = require('path');
-const { copyDirectory, main } = require('./setup.js');
+const fs = require('node:fs');
+const path = require('node:path');
+const { copyDirectory, main } = require('./setup.js');

 function runTests() {
   console.log('🧪 Running CodeForge package tests...\n');

-  // Test 1: copyDirectory function exists
-  console.log('✓ Test 1: copyDirectory function exists');
-  
-  // Test 2: main function exists  
-  console.log('✓ Test 2: main function exists');
+  // Test 1: copyDirectory function exists and is callable
+  if (typeof copyDirectory === 'function') {
+    console.log('✓ Test 1: copyDirectory function exists');
+  } else {
+    console.log('❌ Test 1: copyDirectory is not a function');
+  }
+
+  // Test 2: main function exists and is callable
+  if (typeof main === 'function') {
+    console.log('✓ Test 2: main function exists');
+  } else {
+    console.log('❌ Test 2: main is not a function');
+  }
🧹 Nitpick comments (5)
.devcontainer/features/tmux/install.sh (1)

18-18: Consider pinning package versions for reproducible builds.

The apt-get install command doesn't pin versions for tmux and inotify-tools, so rebuilds can drift as new package versions are published. Consider pinning specific versions to ensure reproducibility.

📌 Example: pinning package versions
# First, determine current versions (run once to find versions)
# apt-cache policy tmux inotify-tools

# Then pin them in the install command:
apt-get install -y \
  tmux=3.3a-3 \
  inotify-tools=3.22.6.0-4

Note: Replace version numbers with the actual versions you want to standardize on.

test.js (1)

55-55: Use octal literal instead of parseInt.

The CI pipeline warns that parseInt('111', 8) can be replaced with an octal literal for clarity.

✨ Proposed fix
-  if (setupStat.mode & parseInt('111', 8)) {
+  if (setupStat.mode & 0o111) {
setup.js (2)

84-86: Consider using template literals for string concatenation.

The CI pipeline flags string concatenation on lines 85, 95, and 104. Template literals improve readability.

✨ Proposed fix for line 85
-			const relativePath = relativeBase
-				? relativeBase + "/" + entry.name
-				: entry.name;
+			const relativePath = relativeBase
+				? `${relativeBase}/${entry.name}`
+				: entry.name;

Similar changes would apply to lines 95 and 104:

// Line 95
fs.copyFileSync(destPath, `${destPath}.bak`);

// Line 104
fs.copyFileSync(srcPath, `${destPath}.codeforge-new`);

5-6: Consider addressing CI warnings for node: protocol.

The CI pipeline warns that Node.js built-in modules should use the node: protocol. While these are warnings (not errors), addressing them would align with modern Node.js best practices.

✨ Proposed fix
-const fs = require("fs");
-const path = require("path");
+const fs = require("node:fs");
+const path = require("node:path");
.github/workflows/ci.yml (1)

22-26: Consider installing dependencies before running Biome via npx.

Similar to the test job, if Biome is declared as a devDependency, running npm ci first would ensure consistent versions and faster execution compared to npx downloading it each time.

♻️ Proposed fix with dependency installation and caching
       - uses: actions/setup-node@v4
         with:
           node-version: 18
+          cache: 'npm'
+      - run: npm ci
-      - run: npx @biomejs/biome check setup.js test.js
+      - run: npx biome check setup.js test.js

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 356f7ed and 9dbf0fa.

📒 Files selected for processing (48)
  • .devcontainer/CHANGELOG.md
  • .devcontainer/connect-external-terminal.sh
  • .devcontainer/features/agent-browser/install.sh
  • .devcontainer/features/ast-grep/install.sh
  • .devcontainer/features/biome/install.sh
  • .devcontainer/features/ccburn/install.sh
  • .devcontainer/features/ccms/install.sh
  • .devcontainer/features/ccstatusline/install.sh
  • .devcontainer/features/ccusage/install.sh
  • .devcontainer/features/chromaterm/install.sh
  • .devcontainer/features/claude-code-native/install.sh
  • .devcontainer/features/claude-monitor/install.sh
  • .devcontainer/features/claude-session-dashboard/install.sh
  • .devcontainer/features/dprint/install.sh
  • .devcontainer/features/hadolint/install.sh
  • .devcontainer/features/kitty-terminfo/install.sh
  • .devcontainer/features/lsp-servers/install.sh
  • .devcontainer/features/mcp-qdrant/install.sh
  • .devcontainer/features/mcp-qdrant/poststart-hook.sh
  • .devcontainer/features/notify-hook/install.sh
  • .devcontainer/features/ruff/install.sh
  • .devcontainer/features/shellcheck/install.sh
  • .devcontainer/features/shfmt/install.sh
  • .devcontainer/features/tmux/install.sh
  • .devcontainer/features/tree-sitter/install.sh
  • .devcontainer/scripts/check-setup.sh
  • .devcontainer/scripts/setup-aliases.sh
  • .devcontainer/scripts/setup-auth.sh
  • .devcontainer/scripts/setup-config.sh
  • .devcontainer/scripts/setup-migrate-claude.sh
  • .devcontainer/scripts/setup-plugins.sh
  • .devcontainer/scripts/setup-projects.sh
  • .devcontainer/scripts/setup-terminal.sh
  • .devcontainer/scripts/setup-update-claude.sh
  • .devcontainer/scripts/setup.sh
  • .github/ISSUE_TEMPLATE/bug-report.yml
  • .github/ISSUE_TEMPLATE/config.yml
  • .github/ISSUE_TEMPLATE/feature-request.yml
  • .github/dependabot.yml
  • .github/pull_request_template.md
  • .github/workflows/ci.yml
  • .github/workflows/codeql.yml
  • CLA.md
  • CONTRIBUTING.md
  • LICENSE.txt
  • README.md
  • setup.js
  • test.js

Comment on lines +2 to +3 of .devcontainer/features/ccburn/install.sh
# SPDX-License-Identifier: GPL-3.0-only
# Copyright (c) 2026 Marcus Krueger

⚠️ Potential issue | 🟠 Major

Align SPDX expression with the stated dual-license model.

Line 2 currently declares this file as GPL-only. If this repository is intended to be dual-licensed, scanners and downstream consumers will still treat this file as GPL-only unless the SPDX expression encodes both grants.

Suggested fix
-# SPDX-License-Identifier: GPL-3.0-only
+# SPDX-License-Identifier: GPL-3.0-only OR LicenseRef-CodeForge-Commercial

Also ensure LicenseRef-CodeForge-Commercial is defined in your license documentation.
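
One established way to define a custom LicenseRef is the REUSE convention (an assumption here — this repository may document it differently): add a LICENSES/ directory containing one text file per license identifier, named exactly after the identifier.

```
LICENSES/
├── GPL-3.0-only.txt                      # full GPL-3.0 text
└── LicenseRef-CodeForge-Commercial.txt   # full text of the commercial terms
```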


Comment on lines +3 to +5 of .github/ISSUE_TEMPLATE/config.yml
- name: Commercial Licensing
url: mailto:696222+AnExiledDev@users.noreply.github.com
about: Inquire about commercial licensing options

⚠️ Potential issue | 🟡 Minor

Use a monitored inbox for commercial licensing contact.

Line 4 points to a GitHub noreply address, which is typically non-receiving for inbound licensing inquiries.

Suggested fix
 contact_links:
   - name: Commercial Licensing
-    url: mailto:696222+AnExiledDev@users.noreply.github.com
+    url: mailto:licensing@your-domain.example
     about: Inquire about commercial licensing options

Apply auto-fixes for useNodejsImportProtocol (require("fs") →
require("node:fs")) and useTemplate (string concat → template
literals) in setup.js and test.js. Biome also reformatted test.js
indentation and quotes to match project style.

coderabbitai bot left a comment


🧹 Nitpick comments (4)
test.js (4)

29-40: Use forEach's index parameter instead of repeated indexOf calls.

indexOf is called twice per iteration (lines 32 and 36), which is unnecessary since forEach provides the index as its second parameter.

♻️ Cleaner approach
-	requiredFiles.forEach((file) => {
+	requiredFiles.forEach((file, index) => {
 		if (fs.existsSync(path.join(__dirname, file))) {
 			console.log(
-				`✓ Test 3.${requiredFiles.indexOf(file) + 1}: ${file} exists`,
+				`✓ Test 3.${index + 1}: ${file} exists`,
 			);
 		} else {
 			console.log(
-				`❌ Test 3.${requiredFiles.indexOf(file) + 1}: ${file} missing`,
+				`❌ Test 3.${index + 1}: ${file} missing`,
 			);
 			allFilesExist = false;
 		}
 	});

49-60: Use existence check instead of truthy check; also use forEach index parameter.

if (packageJson[field]) is a truthy check, which would incorrectly fail for valid but falsy values (e.g., "version": ""). Use Object.hasOwn() or the in operator for proper existence checks. Also apply the same forEach index fix as suggested for Test 3.

♻️ Suggested fix
-	requiredFields.forEach((field) => {
-		if (packageJson[field]) {
+	requiredFields.forEach((field, index) => {
+		if (Object.hasOwn(packageJson, field)) {
 			console.log(
-				`✓ Test 4.${requiredFields.indexOf(field) + 1}: package.json has ${field}`,
+				`✓ Test 4.${index + 1}: package.json has ${field}`,
 			);
 		} else {
 			console.log(
-				`❌ Test 4.${requiredFields.indexOf(field) + 1}: package.json missing ${field}`,
+				`❌ Test 4.${index + 1}: package.json missing ${field}`,
 			);
 			packageValid = false;
 		}
 	});

43-45: Add error handling around JSON.parse.

If package.json is malformed, JSON.parse will throw an uncaught exception, crashing the test rather than reporting a clear failure. Consider wrapping in try-catch for graceful error reporting.

🛡️ Suggested fix
-	const packageJson = JSON.parse(
-		fs.readFileSync(path.join(__dirname, "package.json"), "utf8"),
-	);
+	let packageJson;
+	try {
+		packageJson = JSON.parse(
+			fs.readFileSync(path.join(__dirname, "package.json"), "utf8"),
+		);
+	} catch (err) {
+		console.log(`❌ Test 4: package.json is invalid or unreadable: ${err.message}`);
+		process.exit(1);
+	}

12-16: Tests 1-2 are no-ops; they don't actually verify anything.

These tests always pass because the functions are destructured from require("./setup.js") at line 7. If they didn't exist, the import would throw before reaching these lines. Consider adding meaningful assertions (e.g., verifying they are functions with typeof).

♻️ Suggested improvement
 	// Test 1: copyDirectory function exists
-	console.log("✓ Test 1: copyDirectory function exists");
+	if (typeof copyDirectory === "function") {
+		console.log("✓ Test 1: copyDirectory function exists");
+	} else {
+		console.log("❌ Test 1: copyDirectory is not a function");
+		process.exit(1);
+	}

 	// Test 2: main function exists
-	console.log("✓ Test 2: main function exists");
+	if (typeof main === "function") {
+		console.log("✓ Test 2: main function exists");
+	} else {
+		console.log("❌ Test 2: main is not a function");
+		process.exit(1);
+	}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test.js` around lines 12 - 16, The tests are no-ops because destructuring the
exports from require("./setup.js") will throw first if missing; replace the two
console-only checks with real assertions that verify the exported symbols are
functions: assert(typeof copyDirectory === "function", "copyDirectory should be
a function") and assert(typeof main === "function", "main should be a
function"); locate the destructured imports (copyDirectory, main) in test.js and
add these typeof assertions (or use your test framework's equivalent) so the
tests fail when the exports are missing or not functions.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Nitpick comments:
In `@test.js`:
- Around line 29-40: The loop over requiredFiles uses
requiredFiles.forEach((file) => { ... }) but calls requiredFiles.indexOf(file)
twice; change the callback to accept the index parameter
(requiredFiles.forEach((file, idx) => { ... })) and replace both indexOf calls
with idx so the test messages and numbering use the provided index; update uses
inside the block that reference the computed index and leave allFilesExist logic
unchanged.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9dbf0fa and 7956444.

📒 Files selected for processing (2)
  • setup.js
  • test.js
🚧 Files skipped from review as they are similar to previous changes (1)
  • setup.js

@AnExiledDev AnExiledDev merged commit 805e5e4 into main Feb 26, 2026
6 checks passed
@AnExiledDev AnExiledDev deleted the repo-quality-checklist branch February 26, 2026 04:36