From d366106c090ce8a1d23690c3c0947f3e595d437e Mon Sep 17 00:00:00 2001 From: Shahzaib Date: Mon, 16 Mar 2026 16:33:04 -0700 Subject: [PATCH] Feature Orchestrator Plugin: Skills (2/5) Add all specialized skills for the orchestration pipeline: - codebase-researcher: Systematic codebase exploration with SharePoint support - design-author: Design spec creation with multi-source (repo + SharePoint) - design-reviewer: Inline design review comment processing - feature-planner: PBI decomposition with mandatory file paths and API signatures - pbi-creator: ADO work item creation with dependency linking - pbi-dispatcher-github: Dispatch to Copilot coding agent on GitHub repos - pbi-dispatcher-ado: Dispatch via ADO Agency (future) - pbi-dispatcher-ado-swe: Dispatch via ADO Copilot SWE (tag + assign) - pr-validator: Validate agent PRs against PBI acceptance criteria - references/pbi-template.md: Standard PBI description template --- .../skills/codebase-researcher/SKILL.md | 137 +++++++++ .../skills/design-author/SKILL.md | 181 +++++++++++ .../skills/design-reviewer/SKILL.md | 84 ++++++ .../skills/feature-planner/SKILL.md | 282 ++++++++++++++++++ .../references/pbi-template.md | 71 +++++ .../skills/pbi-creator/SKILL.md | 265 ++++++++++++++++ .../skills/pbi-dispatcher-ado-swe/SKILL.md | 185 ++++++++++++ .../skills/pbi-dispatcher-ado/SKILL.md | 134 +++++++++ .../skills/pbi-dispatcher-github/SKILL.md | 199 ++++++++++++ .../skills/pr-validator/SKILL.md | 151 ++++++++++ 10 files changed, 1689 insertions(+) create mode 100644 feature-orchestrator-plugin/skills/codebase-researcher/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/design-author/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/design-reviewer/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/feature-planner/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/feature-planner/references/pbi-template.md create mode 100644 
feature-orchestrator-plugin/skills/pbi-creator/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/pbi-dispatcher-ado-swe/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/pbi-dispatcher-ado/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/pbi-dispatcher-github/SKILL.md create mode 100644 feature-orchestrator-plugin/skills/pr-validator/SKILL.md diff --git a/feature-orchestrator-plugin/skills/codebase-researcher/SKILL.md b/feature-orchestrator-plugin/skills/codebase-researcher/SKILL.md new file mode 100644 index 00000000..d049760b --- /dev/null +++ b/feature-orchestrator-plugin/skills/codebase-researcher/SKILL.md @@ -0,0 +1,137 @@ +--- +name: codebase-researcher +description: Systematically explore codebases to find implementations, patterns, and architecture. Use for "where is X implemented", "how does Y work", "trace the flow of", or any request requiring codebase exploration with evidence-based findings. +--- + +# Codebase Researcher + +Explore this codebase systematically with evidence-based findings. + +## Project Knowledge + +Read `.github/copilot-instructions.md` for project-wide conventions and coding standards. + +## Repository Structure + +Discover the repository structure by exploring the workspace — check for modules, +sub-directories with their own build files, and README files. + +| Module | Purpose | Key Paths | +|--------|---------|-----------| +| *Discover by exploring the workspace* | | | + +**⚠️ CRITICAL: Always search across ALL modules/directories.** Code is often shared or duplicated. + +## Core Principles + +1. **Never guess** — Only report what is actually found in the repo +2. **Always cite sources** — Every finding must include file path and line numbers +3. **Acknowledge gaps** — Explicitly state when something cannot be found +4. **Rate confidence** — Assign HIGH/MEDIUM/LOW to each finding +5. 
**Search all modules** — Check every relevant directory for each query + +## Research Workflow + +### Step 1: Understand the Target + +Clarify what to find: +- Feature/concept name +- Which layer (client, service, shared, etc.) +- Expected patterns (class names, function signatures) + +### Step 2: Search Strategy + +Execute searches in this order, **always searching across all modules**: + +1. **Semantic search** — Start with natural language query +2. **Grep search** — Exact patterns, class names, error codes +3. **File search** — Find by naming convention (e.g., `**/*Operation*.kt`) +4. **Directory exploration** — List relevant directories in each module +5. **Read files** — Confirm findings with actual code + +### Step 3: Trace Call Chains + +For the feature area being researched, trace the complete flow: +- Identify the entry point +- Follow across module boundaries +- Note threading model and error handling at each boundary + +### Step 4: Identify Invariants + +Search for constraints that govern the affected code: +- Threading annotations, synchronization +- Serialization contracts, protocol versions +- Lifecycle dependencies, feature flags + +### Step 5: Validate Findings + +For each potential finding: +- Read the actual code (don't rely only on search snippets) +- Identify which module it belongs to +- Note the exact location (file + line range) +- Assess confidence level + +### Step 6: Report Results + +```markdown +## Research: [Topic] + +### Findings + +#### Finding 1: [Brief description] +- **Module**: [which module] +- **File**: [path/to/file.ext](path/to/file.ext#L10-L25) +- **Confidence**: HIGH | MEDIUM | LOW +- **Evidence**: [What makes this the right code] + +[Code snippet if helpful] + +#### Finding 2: ... 
+ +### Unknowns & Risk Areas + +- [Thing searched for but not found] +- Search attempts: [what was tried] +- [Areas that might be affected but couldn't confirm] + +### Suggested Next Steps + +- [Additional areas to explore] +- [Related code that might be relevant] +``` + +## Confidence Levels + +| Level | Criteria | +|-------|----------| +| **HIGH** | Exact match. Code clearly implements the feature. Names match. | +| **MEDIUM** | Likely match. Code appears related but naming differs or implementation is partial. | +| **LOW** | Possible match. Found tangentially related code, or inference required. | + +## Data Flow Investigation + +When asked about **what data is returned**, **how data flows**, or **what happens to data**: + +1. **Find the Data Structure** — Confirm the field exists, check serialization +2. **Find Construction/Population Code** — Search for Builder/factory methods +3. **Check Conditional Logic** — Search for `if` statements, feature flag checks, version checks +4. **Trace the Complete Flow** — Follow from entry → processing → response → return + +### Flow Investigation Pitfalls + +❌ Don't stop after finding a field definition — check actual behavior +❌ Don't assume data flows unchanged — check for filtering/transformation +❌ Don't ignore version/flag checks — behavior often changes based on these +✅ Search for Builder usage and construction patterns +✅ Look for Adapter/Converter classes in the flow +✅ Check for conditional logic based on configuration or feature flags + +## Anti-Patterns to Avoid + +| Anti-Pattern | Problem | Correct Approach | +|--------------|---------|------------------| +| Searching only one module | Miss cross-module code | Search ALL modules | +| "This is likely in..." 
| Speculation without evidence | Search first, report only found | +| Path without line numbers | Imprecise, hard to verify | Always include line numbers | +| Stopping at definition | Misses conditional logic | Trace to construction/adapter | +| Brief summary | Loses detail for next step | Be thorough and comprehensive | diff --git a/feature-orchestrator-plugin/skills/design-author/SKILL.md b/feature-orchestrator-plugin/skills/design-author/SKILL.md new file mode 100644 index 00000000..2ce531b6 --- /dev/null +++ b/feature-orchestrator-plugin/skills/design-author/SKILL.md @@ -0,0 +1,181 @@ +--- +name: design-author +description: Create detailed design specs for features. Use when asked to design a feature, create a design spec, write a design doc, or create an implementation plan. Triggers include "design this feature", "create a design spec", "write a design doc". +--- + +# Design Author + +Create detailed design specs for features, save them locally, and optionally open PRs for review. + +## Configuration + +Read `.github/orchestrator-config.json` for: +- `design.docsPath` — where to save design docs (e.g., `design-docs/` or `docs/designs/`) +- `design.templatePath` — path to design spec template (optional) +- `design.folderPattern` — folder naming pattern (e.g., `[{platform}] {featureName}`) +- `design.reviewRepo` — repo for design review PRs (optional) + +If no config, save to `docs/designs/` and use the built-in template below. + +## Design Spec Template + +Key sections every design spec should include: + +1. **Title** — Feature name +2. **Components** — Which modules/repos affected +3. **Problem description** — User problem, business context, examples +4. **Requirements** — Functional requirements (must-have) +5. **System Qualities** — Performance, telemetry, security, supportability +6. **Solution options** — At least 2 options with pseudo code, pros/cons +7. **Solution Decision** — Recommended option with reasoning +8. 
**API surface** — Public/internal classes, methods (if applicable)
+9. **Data flow** — Request/response flow across components
+10. **Feature flag** — Flag name and gating strategy (if applicable)
+11. **Telemetry** — Key metrics, span names, success/failure signals
+12. **Testing strategy** — Unit tests, integration tests, E2E coverage
+13. **Rollout plan** — Staged rollout, feature flag configuration
+14. **Cross-repo impact** — Which repos need changes and in what order
+
+If a template file exists at the configured `design.templatePath`, follow that instead.
+
+## Workflow
+
+### Step 1: Understand the Feature
+
+Gather from the developer:
+1. What the feature does and why it's needed
+2. Which components/flows it affects
+3. Scope boundaries (in/out)
+4. Any existing designs to reference
+
+### Step 2: Research the Codebase
+
+Use the `codebase-researcher` skill to:
+- Understand how related functionality currently works
+- Identify which repos/files would be affected
+- Find existing patterns to follow (feature flags, error handling, telemetry)
+- Check for existing design docs on the same topic
+
+### Step 3: Research Existing Designs
+
+If `design.docsPath` is configured, search for related designs:
+```bash
+ls <design.docsPath>/ | grep -i "<feature-keyword>"
+```
+Use existing designs as **style reference and historical context**, not ground truth for behavior.
+
+### Step 4: Write the Design Spec
+
+Create the spec at:
+```
+<design.docsPath>/<folder-name>/<feature-name>.md
+```
+
+For the **Solution options** section:
+- Always provide at least 2 options
+- Include pseudo code / API signatures for each
+- List concrete pros/cons
+- Clear recommendation in Solution Decision
+
+### Agent Implementation Notes
+
+Write the design knowing a coding agent will implement it.
Be explicit about:
+- Class boundaries and responsibilities
+- Threading model
+- Error contracts
+- Integration points with other modules
+
+### Step 5: Present Design for Review
+
+After writing, **STOP and present choices** using `askQuestion`:
+
+```
+askQuestion({
+  question: "Design spec written. What would you like to do?",
+  options: [
+    { label: "📖 Review locally", description: "Open in editor for inline review" },
+    { label: "✅ Approve & plan PBIs", description: "Skip PR, move to work item planning" },
+    { label: "📋 Open draft PR", description: "Push to review repo as draft PR" },
+    { label: "🚀 Open published PR", description: "Push and publish PR for team review" },
+    { label: "✏️ Request changes", description: "Tell me what to revise" }
+  ]
+})
+```
+
+**MANDATORY**: Wait for the developer's explicit choice. Do NOT auto-select.
+
+### Step 5a: Local Review (option 1)
+
+Open the file: `code "<spec-path>"`
+
+Tell the developer:
+> "The spec is open. Here's how to review:
+> 1. Click the **+ icon** in the gutter to add inline comments
+> 2. When done, click the status bar button to submit comments
+> 3. I'll address each comment and present choices again"
+
+### Step 5b: Push and Create PR (options 3 or 4)
+
+**Branch naming**: Discover alias from `git config user.email` (strip @domain):
+```powershell
+$alias = (git config user.email) -replace '@.*', ''
+$branchName = "$alias/design-<feature-name>"
+git checkout -b $branchName
+```
+
+**Git workflow** (from design docs directory):
+```powershell
+cd <design.docsPath>/
+git add "<spec-file>"
+git commit -m "Add design spec: <feature-name>"
+git push origin $branchName
+```
+
+**Create PR**: Use `gh pr create` or ADO MCP tools if available.
+- Set `--draft` for option 3, omit for option 4
+- **PR description**: Use actual line breaks or HTML formatting, NOT literal `\n` escape sequences
+- Target branch: `main` (or the repo's default branch)
+
+Present the PR link:
+```markdown
+### PR Created
+**PR**: [link to PR]
+**Status**: Draft / Published
+
+### How to Review
+1. Open the PR link above
+2.
Use inline commenting to leave feedback +3. When done, say: **"address my design review comments"** +4. I'll read the comments and update the spec + +When the team approves, say: **"design approved, plan the PBIs"** +``` + +### Step 6: Address Review Comments + +When asked to address comments (from PR or local review): +1. Read the feedback (from PR comments or `reviews.json`) +2. For each comment: + - Understand the feedback + - Edit the local design spec to address it + - If on a PR branch, reply to the thread confirming the resolution +3. Commit and push the updates to the same branch +4. Report a summary of changes made +5. Return to Step 5 (present choices again) + +### Step 7: Proceed to Implementation + +When the developer confirms the design is approved: +1. The PR can be completed/merged +2. Hand off to the `feature-planner` skill for PBI decomposition + +## Important Caveats + +- **Existing designs may be outdated** — last-minute PR discussions often cause code to deviate. + Always verify proposed patterns against the **current codebase**, not just existing designs. +- **Use existing designs as style reference**, not as ground truth for current behavior. +- For paths with brackets `[]` or spaces, use PowerShell with `-LiteralPath` + +### Open Questions + +If there are genuine unknowns during design, use `askQuestion` to resolve them interactively, +or list them in the spec for the team to discuss during review. diff --git a/feature-orchestrator-plugin/skills/design-reviewer/SKILL.md b/feature-orchestrator-plugin/skills/design-reviewer/SKILL.md new file mode 100644 index 00000000..d7f14931 --- /dev/null +++ b/feature-orchestrator-plugin/skills/design-reviewer/SKILL.md @@ -0,0 +1,84 @@ +--- +name: design-reviewer +description: Address review comments on design spec files. Use when a developer submits inline review comments and wants them addressed. Triggers include "address review comments", "handle my review", or review comment submission. 
+--- + +# Design Reviewer + +Address review comments on design spec files. + +## How Comments Are Stored + +Comments are stored in: +``` +.github/design-reviews/reviews.json +``` + +Format: +```json +{ + "reviews": { + "path/to/spec.md": [ + { "line": 30, "text": "Why is this needed?", "lineContent": "the line text" } + ] + } +} +``` + +## Workflow + +### Step 1: Read Review Comments + +1. Read `.github/design-reviews/reviews.json` +2. If a specific spec was mentioned, only process that spec's comments +3. If no comments found: + > "No review comments found. Add comments using the gutter icons in the editor." + +### Step 2: Read Spec Context + +For each comment, read ±5 lines around the comment's line number for full context. + +### Step 3: Evaluate Each Comment + +| Comment Type | How to Identify | Action | +|-------------|----------------|--------| +| **Genuine issue** | Points out bug, inaccuracy, missing info | Update the spec | +| **Improvement** | Suggests better approach | Update if it improves clarity | +| **Question** | "why?", "what?", "how?" | Answer clearly; update spec if answer should be documented | +| **Challenge** | "Are you sure?" | Verify against codebase; update if wrong, explain if correct | +| **Acknowledgment** | "nice", "👍" | Acknowledge briefly, no change | + +### Step 4: Apply Changes + +For each comment requiring a spec update: +1. Read the current content around the target line +2. Make the edit using `replace_string_in_file` + +### Step 5: Clean Up reviews.json + +After addressing all comments for a spec, remove that spec's entry from `reviews.json`. +If no reviews remain, delete the file. + +### Step 6: Present Summary + +```markdown +## Review Comments Addressed + +--- + +### Comment 1: Line N — "[short quote]" + +**Type**: Question / Issue / Improvement / Acknowledgment +**Action**: [What was done or why no change was needed] + +--- + +### Comment 2: Line N — "..." +... 
+``` + +**Rules:** +- Use `###` heading for EVERY comment — never a table +- Use `---` separators between comments +- If spec was edited, mention what changed +- If no change needed, explain why diff --git a/feature-orchestrator-plugin/skills/feature-planner/SKILL.md b/feature-orchestrator-plugin/skills/feature-planner/SKILL.md new file mode 100644 index 00000000..2c297852 --- /dev/null +++ b/feature-orchestrator-plugin/skills/feature-planner/SKILL.md @@ -0,0 +1,282 @@ +--- +name: feature-planner +description: Decompose features into detailed, repo-targeted work items. Use when asked to "plan this feature", "break this down into PBIs", "decompose this into tasks". Produces a structured plan for developer review — actual work item creation is handled by pbi-creator. +--- + +# Feature Planner + +Decompose features into detailed, right-sized work items for implementation. + +**This skill does NOT create work items.** It produces a plan for developer review. +Once approved, the `pbi-creator` skill handles creation in your tracking system. + +## Configuration + +Read `.github/orchestrator-config.json` for: +- `repositories` — repo hosting details (slug, host, baseBranch, accountType) +- `modules` — module-to-repo mapping (each module has a `repo` key pointing to a repository) +- `design.docsPath` — where design specs are stored + +## Repository Routing + +Use the `modules` and `repositories` maps from config to route each work item: + +```json +// Example from config: +"repositories": { + "common-repo": { "slug": "org/common-repo", "baseBranch": "dev" }, + "service-repo": { "slug": "org/service-repo", "baseBranch": "main" } +}, +"modules": { + "core": { "repo": "common-repo", "path": "core/", "purpose": "Shared utilities" }, + "service": { "repo": "service-repo", "purpose": "Backend processing" } +} +``` + +Work items target a **module name**. To find the repo: `modules..repo` → `repositories.`. + +**Routing heuristic:** +1. 
Shared contracts/data models/utilities → shared module +2. Client-facing API changes → client module +3. Server/service-side processing → service module +4. Most features span a shared module + one consumer — create separate work items for each + +## Workflow + +### Step 1: Check for Approved Design + +1. Check configured `design.docsPath` for a matching design spec +2. If design exists and is approved, use it as the primary source +3. If no design exists, ask the developer whether to create one first +4. For small, single-repo changes, skip design and proceed directly + +### Step 2: Understand the Feature + +Gather: +1. **What** the feature does +2. **Why** it's needed +3. **Which flows** it affects +4. **Scope boundaries** (in/out) + +### Step 3: Research Current Implementation + +Use the `codebase-researcher` skill to understand: +- How related functionality currently works +- Which repos/files would need changes +- Existing patterns to follow +- Test patterns in affected areas + +### Step 4: Decompose into Work Items + +Rules: +1. **One work item per repo** — never span multiple repos +2. **Dependency ordering** — document dependencies explicitly +3. **Right-sized** — each should be implementable in one agent session (~1-3 files, <500 lines) +4. **Self-contained description** — everything the coding agent needs, inline +5. 
**No local file paths** — the coding agent runs in the cloud with only the target repo cloned + +### Step 5: Write Descriptions + +Each description MUST include: +- **Objective**: What to implement and where +- **Context**: Why this change is needed, how it fits the broader feature +- **Technical Requirements**: Specific implementation guidance — see mandatory rules below +- **Acceptance Criteria**: Concrete, verifiable checklist +- **Dependencies**: Use WI-N references (resolved to AB# later) +- **Files to Modify/Create**: Specific paths extracted from research (see rule below) +- **Testing**: What tests to write + +#### ⚠️ MANDATORY: Preserve Technical Detail from Design Spec + +The coding agent implements ONLY from the PBI description. It does NOT see the design spec, +codebase-context.md, or any other local file. Therefore: + +**Every technical detail the agent needs to write correct code MUST be in the PBI.** + +1. **API signatures**: If the design spec includes method signatures, class interfaces, enum values, + or return types — copy them **verbatim** into the PBI. Do NOT summarize code into prose. 
+
+   **Bad** (prose summary — agent will guess the types wrong):
+   > "Create AuthTabManager that wraps AuthTabIntent.registerActivityResultLauncher() and launch()"
+
+   **Good** (exact signatures from design spec — agent uses correct types):
+   > "Create `AuthTabManager` that wraps the AndroidX Browser 1.9.0 AuthTab API:
+   > ```kotlin
+   > // registerActivityResultLauncher returns ActivityResultLauncher<Intent>, NOT a custom launcher type
+   > // callback receives AuthTabIntent.AuthResult, NOT Uri
+   > fun registerLauncher(activity: ComponentActivity, callback: (AuthTabIntent.AuthResult) -> Unit): ActivityResultLauncher<Intent> {
+   >     return AuthTabIntent.registerActivityResultLauncher(activity, callback)
+   > }
+   >
+   > // launch() takes 3 params: launcher, uri, AND redirectScheme
+   > fun launch(launcher: ActivityResultLauncher<Intent>, uri: Uri, redirectScheme: String) {
+   >     AuthTabIntent.Builder().build().launch(launcher, uri, redirectScheme)
+   > }
+   > ```"
+
+2. **Rationale for changes**: Explain WHY something needs to change, not just what. The agent
+   makes better decisions when it understands the reason.
+
+   **Bad**: "Change browserVersion from 1.7.0 to 1.9.0"
+
+   **Good**: "Change `browserVersion` from `1.7.0` to `1.9.0` because AndroidX Browser 1.9.0
+   introduces the `AuthTabIntent` API (Chrome 137+) which this feature depends on. Note: this
+   version bump changes the `onNewIntent` signature in `ComponentActivity` from
+   `onNewIntent(intent: Intent)` to `onNewIntent(intent: Intent?)` — any override in existing
+   code (e.g., `SwitchBrowserActivity`) must be updated to match."
+
+3. **Breaking side effects**: If a change in this PBI will break other code (even code not in
+   scope for this PBI), document it explicitly so the agent can fix it or the planner can
+   create a separate PBI.
+
+   **Example**: "⚠️ Bumping browserVersion to 1.9.0 will break `SwitchBrowserActivity.onNewIntent()`
+   because the signature changed. Fix the override signature in this same PBI."
+
+4.
**Third-party API details**: When wrapping a new library or API version, include: + - The exact dependency coordinates and version + - Key method signatures the agent needs to call (copied from docs or design spec) + - Any gotchas or differences from the agent's likely assumptions + - What the API returns and what types to expect + +5. **Code snippets from design spec**: If the design spec contains pseudocode, class skeletons, + or implementation patterns, include them in the PBI. The agent benefits enormously from + seeing a code sketch — even if it's pseudocode. + +#### ⚠️ MANDATORY: File Paths Rule + +The **"Files to Modify/Create"** field MUST list specific file paths from the research findings. +This is the single most important factor in coding agent success — it tells the agent WHERE +to look instead of forcing it to search blindly. + +**Good** (specific, extracted from research): +``` +Files to Modify/Create: +- common/common/src/main/java/com/microsoft/identity/common/internal/net/HttpClient.java — add retry logic +- common/common/src/main/java/com/microsoft/identity/common/internal/flight/CommonFlight.java — add RETRY_ENABLED flag +- common/common/src/test/java/com/microsoft/identity/common/internal/net/HttpClientTest.java — new test class +``` + +**Bad** (vague, agent has to guess): +``` +Files to Modify/Create: +- HTTP client module +- Flight definitions +- Tests +``` + +If the research didn't identify specific files for a task, state that explicitly: +``` +Files to Modify/Create: +- Exact paths not identified during research — agent should search for [specific class/pattern] + starting in [module/directory] +``` + +This gives the agent a starting point even when exact paths aren't known. + +### Quality Checklist + +Before finalizing each work item: +- [ ] Could someone unfamiliar implement it from the description alone? +- [ ] Does it explain WHY, not just WHAT? (rationale for every change) +- [ ] Is the scope clear with explicit exclusions? 
+- [ ] Are acceptance criteria concrete and testable? +- [ ] Is it right-sized? (1-3 files = ideal, >6 files = split it) +- [ ] Does "Files to Modify/Create" list specific paths from research? +- [ ] Are API signatures from the design spec included verbatim (not summarized to prose)? +- [ ] Are breaking side effects documented? (e.g., dependency bump breaks existing code) +- [ ] For third-party API wrapping: are exact method signatures and return types included? + +### Step 6: Present Plan for Review + +Use this **exact output format** — the `pbi-creator` skill depends on it. + +**IMPORTANT**: Do NOT use HTML tags (`
`, ``, etc.) — VS Code chat +renders markdown only. HTML tags appear as raw text. + +#### Output Format + +**1. Header:** + +```markdown +## Feature Plan: [Feature Name] + +**Feature flag**: `[flag_name]` (or "N/A") +**Design spec**: [path] (or "N/A") +**Total work items**: [N] +``` + +**2. Dependency graph:** + +```markdown +### Dependency Graph + +WI-1 (common) → WI-2 (service) + WI-3 (client) [parallel after WI-1] +``` + +**3. Summary table:** + +```markdown +### Summary Table + +| # | Title | Repo | Module | Priority | Depends On | +|---|-------|------|--------|----------|------------| +| WI-1 | [title] | common | shared | P1 | None | +| WI-2 | [title] | service | backend | P1 | WI-1 | +``` + +**4. Dispatch order:** + +```markdown +### Dispatch Order + +1. Dispatch **WI-1** first (no blockers) +2. After WI-1 merges → dispatch **WI-2** and **WI-3** in parallel +``` + +**5. Work item details:** + +```markdown +--- + +#### WI-1: [Title] + +| Field | Value | +|-------|-------| +| **Repo** | `[org/repo-name]` | +| **Module** | `[module]` | +| **Priority** | P[1-3] | +| **Depends on** | None / WI-X | +| **Tags** | `ai-generated; copilot-agent-ready; [feature-tag]` | + +##### Description + +[Full description in PLAIN MARKDOWN with: Objective, Context, Technical Requirements, +Acceptance Criteria, Files to Modify, Testing] +``` + +**6. Next step:** + +```markdown +### Next Step + +> Plan approved? Say **"create the PBIs"** to create work items in your tracking system. +``` + +## Common Patterns + +### Single-Repo Feature +One work item. Most bug fixes and small enhancements. + +### Two-Repo Feature (Shared + Consumer) +1. WI-1: Add shared logic/contract +2. WI-2: Consume from client or service + +### Multi-Repo Feature +1. WI-1: Shared contract/data model +2. WI-2: Service-side processing (depends on WI-1) +3. WI-3: Client-side API (depends on WI-1) +4. 
WI-4: (optional) Integration tests + +### Feature Flag Convention +All work items for a feature should use the **same feature flag name** across repos. +Include the flag name in each description. diff --git a/feature-orchestrator-plugin/skills/feature-planner/references/pbi-template.md b/feature-orchestrator-plugin/skills/feature-planner/references/pbi-template.md new file mode 100644 index 00000000..460e19dd --- /dev/null +++ b/feature-orchestrator-plugin/skills/feature-planner/references/pbi-template.md @@ -0,0 +1,71 @@ +# Work Item Template + +Use this structure for every work item description. The description must be +**self-contained** — the coding agent only has this text plus the target repo. + +## Objective + +[1-2 sentences: What to implement and in which module/repo] + +## Context + +[Why this change is needed. How it fits into the larger feature. +Include enough background that someone unfamiliar could understand the motivation.] + +## Technical Requirements + +### What to Build + +[Specific implementation details. Include:] + +- Classes/functions to create or modify +- Method signatures and data structures +- Error handling approach +- Threading/concurrency model (if relevant) +- Integration points with other modules + +### Code Patterns to Follow + +[Reference existing patterns in the repo. Include actual code examples +or file paths within the TARGET REPO (not other repos).] 
+ +``` +// Example of the pattern to follow: +class ExistingPattern { + // show the convention +} +``` + +### What NOT to Do + +[Explicit exclusions to prevent scope creep:] +- Do NOT modify [specific files/features outside scope] +- Do NOT add [unnecessary abstractions] +- This work item does NOT cover [related but separate concern] + +## Acceptance Criteria + +- [ ] [Concrete, testable criterion 1] +- [ ] [Concrete, testable criterion 2] +- [ ] [Concrete, testable criterion 3] +- [ ] All existing tests pass +- [ ] New unit tests cover the happy path and error cases +- [ ] Code follows project conventions (from copilot-instructions.md) + +## Files to Modify + +| File | Change | +|------|--------| +| `path/to/file.ext` | [What to change] | +| `path/to/new-file.ext` | [New file — what it contains] | + +## Dependencies + +- **Depends on**: [WI-N / AB#ID — what must be merged first and why] +- **Depended on by**: [WI-N — who is waiting for this] + +## Testing + +- Unit tests for [specific logic] +- Integration tests for [cross-component interaction] (if applicable) +- Test file location: `path/to/tests/` diff --git a/feature-orchestrator-plugin/skills/pbi-creator/SKILL.md b/feature-orchestrator-plugin/skills/pbi-creator/SKILL.md new file mode 100644 index 00000000..efe1edec --- /dev/null +++ b/feature-orchestrator-plugin/skills/pbi-creator/SKILL.md @@ -0,0 +1,265 @@ +--- +name: pbi-creator +description: Create work items in Azure DevOps from a feature plan. Handles ADO metadata discovery (area path, iteration, assignee), work item creation, and dependency linking. Triggers include "create the PBIs", "create work items", "push PBIs to ADO". +--- + +# PBI Creator + +Create Azure DevOps work items from a feature plan produced by the `feature-planner` skill. 
+ +## Configuration + +Read `.github/orchestrator-config.json` for: +- `ado.project` — ADO project name (e.g., "Engineering") +- `ado.org` — ADO organization name (e.g., "IdentityDivision") +- `ado.workItemType` — work item type (default: "Product Backlog Item") +- `ado.iterationDepth` — depth for iteration discovery (default: 6) + +### ⚠️ ADO Org/Project Parsing + +The `ado.org` and `ado.project` fields should contain **plain names only**, not full URLs. +If the config contains a URL, extract the relevant part: +- `https://dev.azure.com/IdentityDivision/Engineering/_workitems/edit/123` → org: `IdentityDivision`, project: `Engineering` +- `https://msazure.visualstudio.com/One/_git/repo` → org: `msazure`, project: `One` +- `IdentityDivision` → use as-is + +When calling MCP tools, pass only the **org name** (e.g., `IdentityDivision`) and +**project name** (e.g., `Engineering`), never a full URL with `https:`. URLs with colons +cause ADO API errors: "A potentially dangerous Request.Path value was detected." + +## Prerequisites + +- **ADO MCP Server** must be running (configured in `.mcp.json`) +- A **feature plan** in the current chat context (from `feature-planner` skill) + +## Workflow + +### Step 1: Parse the Feature Plan + +Read the plan from chat context. Extract for each work item: +- **Title** — from `#### WI-N: [Title]` header +- **Repo** — from metadata table +- **Module** — from metadata table +- **Priority** — P1→1, P2→2, P3→3 +- **Depends on** — WI-N references +- **Tags** — from metadata table +- **Description** — from `##### Description` section. **Convert to HTML** for ADO: + - `## Heading` → `
<h2>Heading</h2>`
+  - `**bold**` → `<b>bold</b>`
+  - `- item` → `<ul><li>item</li></ul>`
+  - Or wrap in `<pre>` tags if conversion is complex
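A minimal sketch of this markdown-to-HTML conversion (hypothetical helper — a real implementation may prefer a proper markdown library, and only the constructs listed above are handled here):

```javascript
// Convert the limited markdown subset used in plan descriptions to
// HTML suitable for ADO's System.Description field.
function markdownToAdoHtml(md) {
  return md
    .split("\n")
    .map((line) => {
      const h = line.match(/^(#{1,6})\s+(.*)$/);
      if (h) return `<h${h[1].length}>${h[2]}</h${h[1].length}>`;
      const li = line.match(/^-\s+(.*)$/);
      if (li) return `<ul><li>${li[1]}</li></ul>`;
      return line; // plain text passes through unchanged
    })
    .join("\n")
    .replace(/\*\*(.+?)\*\*/g, "<b>$1</b>");
}
```

Anything the helper cannot express cleanly should fall back to wrapping the whole description in `<pre>` tags.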
+
+If no plan found, ask: "Run the `feature-planner` skill first, or paste PBI details."
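The org/project normalization described under "ADO Org/Project Parsing" can be sketched as follows (hypothetical helper, covering only the URL shapes shown in that section):

```javascript
// Normalize a config value that may be a dev.azure.com URL, a
// visualstudio.com URL, or already a plain org name.
function parseAdoOrgProject(value) {
  let m = value.match(/^https:\/\/dev\.azure\.com\/([^/]+)\/([^/]+)/);
  if (m) return { org: m[1], project: m[2] };
  m = value.match(/^https:\/\/([^.]+)\.visualstudio\.com\/([^/]+)/);
  if (m) return { org: m[1], project: m[2] };
  return { org: value, project: null }; // plain name — use as-is
}
```

Whatever the input shape, only the extracted plain names should ever be passed to MCP tools.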
+
+### Step 2: Discover ADO Defaults
+
+**Do this BEFORE asking the developer.** This ensures valid options.
+
+1. Call `mcp_ado_wit_my_work_items` to get recent work items
+2. Call `mcp_ado_wit_get_work_items_batch_by_ids` on 3-5 recent items
+3. Extract:
+   - `System.AreaPath` — all unique paths with frequency counts
+   - `System.IterationPath` — note the pattern
+   - `System.AssignedTo` — default assignee
+4. Call `mcp_ado_work_list_iterations` with `depth` from config (default 6)
+5. **Filter iterations to current month or future only** — discard past iterations
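The frequency counting in step 3 can be sketched as follows (hypothetical helper; assumes work items shaped like the ADO REST response, with a `fields` map):

```javascript
// Count System.AreaPath values across recent work items and return
// paths ordered by frequency — the first entry is the suggested default.
function suggestAreaPaths(workItems) {
  const counts = new Map();
  for (const wi of workItems) {
    const path = wi.fields && wi.fields["System.AreaPath"];
    if (path) counts.set(path, (counts.get(path) || 0) + 1);
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1]) // most frequent first
    .map(([path]) => path);
}
```

The same pattern applies to `System.AssignedTo` for suggesting a default assignee.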
+
+### Step 3: Present Options for Confirmation
+
+## ⛔ HARD STOP — DO NOT SKIP THIS STEP
+
+**You MUST complete Step 2 and Step 3 BEFORE creating any work items.**
+Do NOT proceed to Step 4 until the developer has answered ALL four questions.
+This is not optional. This is not a suggestion. **STOP HERE and ask.**
+
+If you skip this step and auto-select defaults, the work items will be created
+in the wrong area path, wrong iteration, or wrong assignee — and the developer
+will have to manually fix every single one.
+
+**Batch ALL questions into a SINGLE `askQuestion` call:**
+
+```
+askQuestion({
+  questions: [
+    {
+      header: "Area Path",
+      question: "Which area path?",
+      options: [
+        { label: "<most frequent area path>", description: "From your recent work items", recommended: true },
+        { label: "<other discovered area path>" }
+      ],
+      allowFreeformInput: true
+    },
+    {
+      header: "Iteration",
+      question: "Which iteration? (Current date: <today's date>)",
+      options: [
+        { label: "<current iteration>", description: "<iteration date range>", recommended: true },
+        { label: "<next iteration>" }
+      ],
+      allowFreeformInput: true
+    },
+    {
+      header: "Assignee",
+      question: "Who should be assigned?",
+      options: [
+        { label: "<default assignee>", description: "From recent work items", recommended: true }
+      ],
+      allowFreeformInput: true
+    },
+    {
+      header: "Parent",
+      question: "Link to a parent Feature work item?",
+      options: [
+        { label: "Create new Feature", description: "New Feature titled '<feature name>'" },
+        { label: "No parent", description: "Standalone PBIs" }
+      ],
+      allowFreeformInput: true
+    }
+  ]
+})
+```
+
+Wait for ALL answers before proceeding.
+
+### Step 4: Create Work Items
+
+Use `mcp_ado_wit_create_work_item` for each item in **dependency order**.
+
+**CRITICAL parameters** (read project from config):
+```json
+{
+  "project": "<ado.project from config>",
+  "workItemType": "<ado.workItemType from config>",
+  "fields": [
+    {"name": "System.Title", "value": "[title]"},
+    {"name": "System.Description", "value": "[HTML description]", "format": "Html"},
+    {"name": "System.AreaPath", "value": "[confirmed path]"},
+    {"name": "System.IterationPath", "value": "[confirmed iteration]"},
+    {"name": "System.AssignedTo", "value": "[confirmed assignee]"},
+    {"name": "Microsoft.VSTS.Common.Priority", "value": "[number]"},
+    {"name": "System.Tags", "value": "[semicolon-separated tags]"}
+  ]
+}
+```
+
+**Common mistakes to avoid:**
+- Do NOT use top-level `title`, `description`, `areaPath` — they don't exist
+- The param is `workItemType`, NOT `type`
+- Description must be **HTML** with `"format": "Html"`
+- Tags are semicolon-separated
+- Area/iteration paths use backslashes
+- **Never hardcode paths** — use developer-confirmed values
+- **MUST include Area Path AND Iteration Path** — these come from Step 3 confirmations.
+  If you don't have them, you skipped Step 3. Go back.
+
+### ⚠️ Title Sanitization
+
+**Remove colons (`:`) from work item titles.** The ADO REST API encodes titles in the
+URL path, and colons trigger an HTTP 400 error: "A potentially dangerous Request.Path
+value was detected from the client (:)."
+
+Instead of: `WI-1: Add feature flag and ECS flight`
+Use: `WI-1 — Add feature flag and ECS flight` (em-dash) or just `Add feature flag and ECS flight`
+
+Also avoid these characters in titles: `<`, `>`, `#`, `%`, `{`, `}`, `|`, `\`, `^`, `~`, `[`, `]`, `` ` ``
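A sanitization helper matching these rules might look like this (hypothetical sketch; the colon is replaced with an em-dash, as recommended above, and the other risky characters are dropped):

```javascript
// Characters that trigger "dangerous Request.Path" errors in ADO titles.
const FORBIDDEN = /[<>#%{}|\\^~\[\]`]/g;

function sanitizeTitle(title) {
  return title
    .replace(/:/g, " —")      // "WI-1: Foo" -> "WI-1 — Foo"
    .replace(FORBIDDEN, "")   // strip remaining risky characters
    .replace(/\s+/g, " ")     // collapse any doubled whitespace
    .trim();
}
```

Run every title through this before calling `mcp_ado_wit_create_work_item`.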
+
+### ⚠️ NEVER Create Work Items With Minimal Descriptions
+
+**Every work item MUST include the FULL description from the feature plan.** This is the
+entire point of the orchestrator — the coding agent implements from the PBI description alone.
+
+If `mcp_ado_wit_create_work_item` fails:
+1. **Check the error** — is it a title character issue? Sanitize and retry.
+2. **Retry the same tool** with corrected input.
+3. **If the tool keeps failing**, report the error to the developer and ask them to help
+   troubleshoot the MCP server.
+
+**NEVER fall back to a different tool that creates work items without the full description.**
+**NEVER tell the user "descriptions are summaries" or suggest they update them manually.**
+If you can't create work items with full descriptions, STOP and report the failure.
+A PBI without a proper description is worse than no PBI at all.
+
+After each creation, record the returned `id` and map WI-N → AB#[id].
+
+### Step 5: Resolve Dependencies + Parent Links
+
+1. **Update descriptions**: Replace WI-N references with AB#[id] in each description
+2. **Link dependencies**: Use `mcp_ado_wit_work_items_link`:
+   ```json
+   {"updates": [{"id": [dependent], "linkToId": [dependency], "type": "predecessor"}]}
+   ```
+3. **Parent to Feature** (if created): Use `mcp_ado_wit_add_child_work_items`
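The WI-N → AB#[id] substitution in step 1 can be sketched as follows (hypothetical helper; the id map is built from the ids recorded during Step 4):

```javascript
// Replace plan-local WI-N references with the real AB# ids recorded
// during creation, e.g. { "WI-1": 12345, "WI-2": 12346 }.
function resolveReferences(description, idMap) {
  return description.replace(/\bWI-(\d+)\b/g, (match, n) => {
    const id = idMap[`WI-${n}`];
    return id ? `AB#${id}` : match; // leave unknown references untouched
  });
}
```

Unresolved references are kept verbatim so a partially created plan never produces broken AB# links.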
+
+### Step 5.5: Mark as Committed
+
+Update all work items to **Committed** state:
+```json
+{"id": [id], "fields": [{"name": "System.State", "value": "Committed"}]}
+```
+
+### Step 6: Report Summary
+
+```markdown
+## Work Items Created: [Feature Name]
+
+| # | AB# | Title | Repo | Depends On | State | Link |
+|---|-----|-------|------|------------|-------|------|
+| WI-1 | AB#12345 | [title] | common | — | Committed | [link] |
+| WI-2 | AB#12346 | [title] | service | AB#12345 | Committed | [link] |
+
+### Settings Used
+- **Parent Feature**: AB#12340 (or "None")
+- **Area Path**: `[path]`
+- **Iteration**: `[path]`
+- **Assigned to**: `[assignee]`
+
+### Dispatch Order
+1. Dispatch **AB#12345** first
+2. After merge → dispatch **AB#12346** and **AB#12347** in parallel
+
+### Next Step
+> Say **"dispatch"** to send the first work item to Copilot coding agent.
+```
+
+## MCP Server Recovery
+
+If ADO MCP tools fail mid-workflow:
+1. Restart: Command Palette → `MCP: Restart Server` → `ado`
+2. If still broken, try a **new chat session**
+3. **Preserve progress**: Note which items were created (AB# IDs) so the new
+   session can continue without duplicating work items
+4. In the new session, the developer can say:
+   > "Continue creating PBIs for [feature]. WI-1 already created as AB#12345. Create WI-2 onwards."
+
+## Edge Cases
+
+### Plan has a single PBI
+Skip dependency linking. Create one work item and report.
+
+### Developer wants different area paths per PBI
+If PBIs target different teams or modules, ask if they want different area paths.
+Present discovered options for each PBI individually.
+
+### Developer modifies the plan before approving
+If the developer asks for changes (add/remove PBIs, change descriptions), defer back
+to the `feature-planner` skill to regenerate, then return here for creation.
+
+### Creating a Parent Feature Work Item
+
+If the developer wants a parent Feature, create it first:
+```json
+{
+  "project": "<ado.project from config>",
+  "workItemType": "Feature",
+  "fields": [
+    {"name": "System.Title", "value": "[Feature Name]"},
+    {"name": "System.Description", "value": "
<p>[Brief description]</p>
", "format": "Html"}, + {"name": "System.AreaPath", "value": "[confirmed path]"}, + {"name": "System.IterationPath", "value": "[confirmed iteration]"}, + {"name": "System.AssignedTo", "value": "[confirmed assignee]"}, + {"name": "System.Tags", "value": "ai-generated"} + ] +} +``` +Record the Feature ID for parenting PBIs. diff --git a/feature-orchestrator-plugin/skills/pbi-dispatcher-ado-swe/SKILL.md b/feature-orchestrator-plugin/skills/pbi-dispatcher-ado-swe/SKILL.md new file mode 100644 index 00000000..042521b5 --- /dev/null +++ b/feature-orchestrator-plugin/skills/pbi-dispatcher-ado-swe/SKILL.md @@ -0,0 +1,185 @@ +--- +name: pbi-dispatcher-ado-swe +description: Dispatch work items to Copilot SWE agent for ADO-hosted repos. Tags the work item with the target repo and assigns to GitHub Copilot, which creates a draft PR automatically. +--- + +# PBI Dispatcher — ADO (Copilot SWE) + +Dispatch work items to the Copilot SWE agent for ADO-hosted repos. The agent is triggered +by tagging the work item with the target repo and assigning it to **GitHub Copilot**. +**This skill is for ADO repos only.** For GitHub repos, use `pbi-dispatcher-github`. + +## Configuration + +Read `.github/orchestrator-config.json` for: +- `modules` — module-to-repo mapping (each module has a `repo` key) +- `repositories` — repo details: slug (`org/project/repo`), baseBranch, host +- `ado.org` — ADO organization name +- `ado.project` — ADO project name + +To resolve a module to dispatch details: +1. Look up `modules..repo` → get the repo key +2. Look up `repositories.` → get `slug`, `baseBranch` +3. Parse the slug to extract org, project, and repo name + +## Prerequisites + +- **ADO MCP Server** running — for updating work items +- Copilot SWE agent enabled/onboarded for the target ADO repository +- Work items with clear descriptions (from the `pbi-creator` skill) + +## Workflow + +### 1. Read Work Items + +Read PBI details from ADO (via MCP) or from chat context. 
Need: +- AB# ID (work item ID) +- Target repo module +- Full description should already be on the work item (set by `pbi-creator`) + +### 2. Check Dependencies + +For each work item, check if dependencies (other AB# IDs) have merged PRs. Skip blocked items. + +### 2a. Gather Cross-PBI Context for Dependencies + +For each work item that HAS dependencies on already-merged PBIs, enrich the work item +description with context about what those dependencies changed. This helps the Copilot +SWE agent understand what preceding PBIs introduced. + +For each merged dependency, query the linked PR (check the work item's links or search): +``` +Use mcp_ado_wit_get_work_item to read the dependency work item and check for linked PRs. +``` + +If a merged PR is found, **append** to the work item description (via `mcp_ado_wit_update_work_item`): +``` +## Dependency Context + +This work item depends on already-merged changes: + +### AB#: (PR merged) +Key changes introduced: [summary from PR title and description] +Build on these changes. Do NOT duplicate or re-implement what the dependency already added. +``` + +If no linked PR is found, skip — the PBI description is still self-contained. + +### 3. 
Tag Work Item with Target Repository
+
+Add a tag to the work item using the format:
+```
+copilot:repo=<org>/<project>/<repo>@<branch>
+```
+
+Use `mcp_ado_wit_update_work_item` to add the tag:
+
+```json
+{
+  "id": <work item id>,
+  "fields": [
+    {
+      "name": "System.Tags",
+      "value": "<existing tags>; copilot:repo=<org>/<project>/<repo>@<branch>"
+    }
+  ]
+}
+```
+
+**Building the tag value** from config:
+- The repo slug in config is `org/project/repo` format
+- The base branch comes from `repositories.<repo-key>.baseBranch`
+- Example: slug `msazure/One/AD-MFA-phonefactor-phoneApp-android`, branch `working`
+  → tag: `copilot:repo=msazure/One/AD-MFA-phonefactor-phoneApp-android@working`
+
+**⚠️ IMPORTANT:**
+- Use only ONE linking method per work item — the tag OR an artifact link, not both
+- Only one repository can be linked per work item
+- The branch after `@` is required — use the base branch from config
+- **Append** the new tag to existing tags (semicolon-separated), don't overwrite them.
+  Read existing tags first via `mcp_ado_wit_get_work_item`, then append.
+
+### 4. Assign to GitHub Copilot
+
+Use `mcp_ado_wit_update_work_item` to assign the work item to **GitHub Copilot**:
+
+```json
+{
+  "id": <work item id>,
+  "fields": [
+    {
+      "name": "System.AssignedTo",
+      "value": "GitHub Copilot"
+    }
+  ]
+}
+```
+
+**Note**: The display name is `GitHub Copilot`. If this doesn't work, the identity may
+be registered differently in the org. Check with the user.
+
+### 5. What Happens Next
+
+After assignment, the Copilot SWE agent will automatically:
+1. Create a **draft/WIP PR** in the target repo
+2. Add a **comment to the work item** with the PR link
+3. Link the PR to the work item
+4. Begin implementing the solution from the work item description
+
+The agent uses `.github/copilot-instructions.md` in the target repo for coding conventions.
+
+### 6.
Update Orchestrator State + +```powershell +$su = Join-Path $HOME ".feature-orchestrator" "state-utils.js" +node $su set-step "" monitoring +``` + +Note: The PR URL won't be available immediately — the agent takes a few minutes to create +the draft PR. The user can check status later via the Monitor phase. + +### 7. Report Summary + +```markdown +## Dispatch Summary + +| AB# | Repo | Method | Status | +|-----|------|--------|--------| +| AB#12345 | org/project/repo | Copilot SWE | ✅ Tagged & assigned to GitHub Copilot | +| AB#12346 | org/project/repo | Copilot SWE | ⏸ Blocked (waiting on AB#12345) | + +### What to Expect +- The Copilot SWE agent will create a **draft PR** in a few minutes +- It will add a comment on the work item with the PR link +- Once the PR is published, review the changes and add comments to iterate +- Tag `@GitHub Copilot` in PR comments to request changes + +### Next Step +> Check back in a few minutes and say **"status"** to see if the PR has been created. +> Or open the work item in ADO to see the agent's comment with the PR link. +``` + +## Iterating on the PR + +After the agent creates the PR: +- Add comments at the PR level or on specific files +- **Tag `@GitHub Copilot`** in PR comments (the agent won't act without the explicit tag) +- If ADO doesn't auto-complete the @-mention, type the literal text `@` +- The agent will create a new iteration with updates + +## Error Handling + +### "Repository is not yet onboarded" +The target repo needs to be onboarded to the Copilot SWE pilot program. +Guide the user to follow their org's onboarding process. + +### Assignment fails +The `GitHub Copilot` identity may not be available in the org. Check: +- Is Copilot SWE enabled for this ADO organization? +- Is the identity name different? 
(Try searching for "Copilot" in the assignee field) + +### Tag format errors +Ensure the tag follows exactly: `copilot:repo=//@` +- No spaces around `=` or `@` +- Branch name is required +- Org/project/repo must match exactly what's in ADO diff --git a/feature-orchestrator-plugin/skills/pbi-dispatcher-ado/SKILL.md b/feature-orchestrator-plugin/skills/pbi-dispatcher-ado/SKILL.md new file mode 100644 index 00000000..6b9fe62a --- /dev/null +++ b/feature-orchestrator-plugin/skills/pbi-dispatcher-ado/SKILL.md @@ -0,0 +1,134 @@ +--- +name: pbi-dispatcher-ado +description: Dispatch work items to ADO Agency for ADO-hosted repos. Uses the Agency REST API to create coding agent jobs that produce draft PRs. +--- + +# PBI Dispatcher — ADO (Agency) + +Dispatch work items to ADO Agency for ADO-hosted repos. Agency generates a solution +as a draft pull request in Azure DevOps. +**This skill is for ADO repos only.** For GitHub repos, use `pbi-dispatcher-github`. + +## Configuration + +Read `.github/orchestrator-config.json` for: +- `modules` — module-to-repo mapping (each module has a `repo` key) +- `repositories` — repo details: slug (`org/project/repo`), baseBranch, host +- `ado.org` — ADO organization name +- `ado.project` — ADO project name + +To resolve a module to dispatch details: +1. Look up `modules..repo` → get the repo key +2. Look up `repositories.` → get `slug`, `baseBranch`, `host` +3. Parse the slug to extract org, project, and repo name + +## Prerequisites + +- **Azure CLI** (`az`) authenticated — needed to acquire the Agency API token +- Work items in ADO with tag `copilot-agent-ready` +- Agency enabled for the target ADO organization/project + +## Workflow + +### 1. Read Work Items + +Read PBI details from ADO (via MCP) or from the chat context. Need: +- AB# ID +- Full description (Objective, Technical Requirements, Acceptance Criteria) +- Target repo module + +### 2. Check Dependencies + +For each work item, check if dependencies (other AB# IDs) have merged PRs. 
Skip blocked items. + +### 3. Acquire Agency API Token + +Use Azure CLI to get a bearer token for the Agency API: + +```powershell +$token = az account get-access-token --resource "api://81bbac67-d541-4a6d-a48b-b1c0f9a57888" --query accessToken -o tsv +``` + +If this fails: +- Check `az account show` — user may not be authenticated +- Guide: `az login` +- If `az` is not installed, tell the user Agency dispatch requires Azure CLI + +### 4. Dispatch to Agency + +For each ready work item, call the Agency REST API: + +```powershell +$body = @{ + organization = "" + project = "" + repository = "" + targetBranch = "" + prompt = @" +'.> +"@ + options = @{ + pullRequest = @{ + create = $true + publish = $true + } + } +} | ConvertTo-Json -Depth 4 + +$response = Invoke-RestMethod ` + -Uri "https://copilotswe.app.prod.gitops.startclean.microsoft.com/api/agency/jobs" ` + -Method Post ` + -Headers @{ Authorization = "Bearer $token"; "Content-Type" = "application/json" } ` + -Body $body + +Write-Host "Agency job created: $($response | ConvertTo-Json)" +``` + +**Parsing the repo slug** for Agency API parameters: +- Slug format is `org/project/repo` (from config) +- `organization` = first segment (e.g., `msazure`) +- `project` = second segment (e.g., `One`) +- `repository` = third segment (e.g., `AD-MFA-phonefactor-phoneApp-android`) + +**IMPORTANT for prompt content:** +- Include the FULL PBI description (not truncated) +- Include `Fixes AB#` so the PR links to the ADO work item +- Do NOT include local file paths — the agent can't access them + +### 5. Update ADO State + +Mark the ADO work item as `Active`, add tag `agent-dispatched`. + +### 6. Report Summary + +```markdown +## Dispatch Summary + +| AB# | Repo | Method | Status | +|-----|------|--------|--------| +| AB#12345 | org/project/repo | ADO Agency | ✅ Dispatched | +| AB#12346 | org/project/repo | ADO Agency | ⏸ Blocked (waiting on AB#12345) | + +### Next Step +> Say **"status"** to check agent PR progress. 
+> Agency will create a draft PR in ADO when implementation is ready. +``` + +## Error Handling + +### Token acquisition fails +``` +az account get-access-token --resource "api://81bbac67-d541-4a6d-a48b-b1c0f9a57888" +``` +If this returns an error: +- "AADSTS..." → user may not have access to Agency. They need to request access. +- "Please run 'az login'" → guide the user to authenticate + +### Agency API returns 403 +The user's account may not have Agency enabled for the target repo/org. +Tell the user to check their Agency access at their org's Agency administration page. + +### Agency API returns 400 +Check the request body — ensure org, project, and repository match exactly what's in ADO. +The repository name must match the ADO repo name, not a slug or URL. diff --git a/feature-orchestrator-plugin/skills/pbi-dispatcher-github/SKILL.md b/feature-orchestrator-plugin/skills/pbi-dispatcher-github/SKILL.md new file mode 100644 index 00000000..0798771e --- /dev/null +++ b/feature-orchestrator-plugin/skills/pbi-dispatcher-github/SKILL.md @@ -0,0 +1,199 @@ +--- +name: pbi-dispatcher-github +description: Dispatch work items to GitHub Copilot coding agent for GitHub-hosted repos. Uses `gh agent-task create` to create agent tasks. +--- + +# PBI Dispatcher — GitHub + +Dispatch work items to GitHub Copilot coding agent by creating agent tasks in GitHub-hosted repos. +**This skill is for GitHub repos only.** For ADO repos, use `pbi-dispatcher-ado`. 
+ +## Configuration + +Read `.github/orchestrator-config.json` for: +- `modules` — module-to-repo mapping (each module has a `repo` key) +- `repositories` — repo details: slug, baseBranch, host +- `github.configFile` — per-developer config path (default: `.github/developer-local.json`) + +Read the developer-local config file for GitHub account mapping: +```json +// .github/developer-local.json +{ + "github_accounts": { + "org/common-repo": "johndoe", + "enterprise-org/service-repo": "johndoe_microsoft" + } +} +``` + +To resolve a module to dispatch details: +1. Look up `modules..repo` → get the repo key +2. Look up `repositories.` → get `slug`, `baseBranch`, `host` +3. Look up `developer-local.github_accounts.` → get the GitHub username +4. Run `gh auth switch --user ` before dispatching + +## Prerequisites + +- **GitHub CLI** (`gh`) authenticated +- Work items in ADO with tag `copilot-agent-ready` +- Copilot coding agent enabled on target repos + +## GitHub Account Discovery + +**CRITICAL**: Determine which `gh` CLI accounts to use. **Never hardcode usernames.** + +### Discovery Sequence (stop at first success) + +**Step 0: Verify `gh` CLI is installed:** +```powershell +gh --version +``` +If not found, offer to install: +- Windows: `winget install --id GitHub.cli -e` +- macOS: `brew install gh` + +**Step 1: Check developer config file** (from `github.configFile` in config): +```powershell +$config = Get-Content "" -Raw -ErrorAction SilentlyContinue | ConvertFrom-Json +``` + +**Step 2: Discover from `gh auth status`:** +```powershell +$ghStatus = gh auth status 2>&1 +``` +Map accounts to types: +- Non-EMU account (no `_` suffix) → `public` repos +- EMU account (`_microsoft` suffix) → `emu` repos + +**Step 3: Prompt the developer** (fallback): +> "I need your GitHub usernames: +> 1. **Public GitHub** (for public org repos): ___ +> 2. **GitHub EMU** (for enterprise repos, if applicable): ___" + +Offer to save to the developer config file. 
+ +**Step 4: Not signed in at all:** +> "Please run: `gh auth login --hostname github.com`" + +## Repo Routing + +Use `modules` → `repositories` → `developer-local.json` to resolve dispatch details: + +```json +// orchestrator-config.json (committed, shared): +"repositories": { + "common-repo": { "slug": "org/common-repo", "host": "github", "baseBranch": "main" }, + "service-repo": { "slug": "enterprise-org/service-repo", "host": "github", "baseBranch": "dev" } +}, +"modules": { + "core": { "repo": "common-repo" }, + "service": { "repo": "service-repo" } +} + +// developer-local.json (per-developer, gitignored): +"github_accounts": { + "org/common-repo": "johndoe", + "enterprise-org/service-repo": "johndoe_microsoft" +} + +// Resolution: module "core" → repo "common-repo" → slug "org/common-repo" +// → gh account "johndoe" (from developer-local) → gh auth switch --user johndoe +``` + +## Workflow + +### 1. Read Work Items + +Read PBI details from ADO (via MCP) or from the chat context. Need: +- AB# ID +- Full description (Objective, Technical Requirements, Acceptance Criteria) +- Target repo module + +### 2. Check Dependencies + +For each work item, check if dependencies (other AB# IDs) have merged PRs. Skip blocked items. + +### 2a. Gather Cross-PBI Context for Dependencies + +For each work item that HAS dependencies on already-merged PBIs, enrich the dispatch +prompt with context about what those dependencies changed. This is critical — the coding +agent implementing PBI #3 needs to know what PBI #1 introduced. 
+ +For each merged dependency: +```powershell +gh pr list --repo "" --search "AB#" --state merged --json number,title,files --jq '.[0]' +``` + +If a merged PR is found, add this block to the dispatch prompt: +``` +## Dependency Context + +This work item depends on already-merged changes: + +### AB#: (PR #, merged) +Files changed: +- (added/modified) +- (added/modified) + +Key APIs introduced: [extract from PR title/files — e.g., new classes, interfaces] +Build on these changes. Do NOT duplicate or re-implement what the dependency PR already added. +``` + +If the PR can't be found (no AB# match), skip gracefully — the PBI description is still self-contained. + +### 3. Switch Account + Dispatch + +For each ready work item: + +**Switch to correct account** (based on repo's `accountType` from config): +```powershell +gh auth switch --user +``` + +**Dispatch via `gh agent-task create`** (preferred, requires gh v2.80+): + +Write the full PBI description to a temp file to avoid shell escaping issues: +```powershell +$prompt = @" + +"@ +$prompt | Set-Content -Path "$env:TEMP\pbi-prompt.txt" +gh agent-task create (Get-Content "$env:TEMP\pbi-prompt.txt" -Raw) --repo "" --base +``` + +**IMPORTANT prompt content:** +- Include FULL PBI description (not truncated) +- Include `Fixes AB#` so PR links to ADO +- Include `Follow .github/copilot-instructions.md strictly` +- Do NOT include local file paths — agent can't access them + +**Fallback** (if `gh agent-task create` fails): Create a GitHub Issue and assign to Copilot. + +### 4. Update ADO State + +Mark the ADO work item as `Active`, add tag `agent-dispatched`. + +### 5. 
Report Summary + +```markdown +## Dispatch Summary + +| AB# | Repo | Method | Status | +|-----|------|--------|--------| +| AB#12345 | org/common-repo | agent-task | ✅ Dispatched | +| AB#12346 | org/service-repo | agent-task | ✅ Dispatched | +| AB#12347 | org/client-repo | ⏸ Blocked | Waiting on AB#12345 | + +### Next Step +> Say **"status"** to check agent PR progress. +> Use `@copilot` in PR comments to iterate with the coding agent. +``` + +## Review Feedback Loop + +After PRs are created, use `@copilot` in PR comments to iterate: +``` +@copilot Please add unit tests for the error case. +@copilot Use the Logger class instead of direct logging. +``` diff --git a/feature-orchestrator-plugin/skills/pr-validator/SKILL.md b/feature-orchestrator-plugin/skills/pr-validator/SKILL.md new file mode 100644 index 00000000..d6a02791 --- /dev/null +++ b/feature-orchestrator-plugin/skills/pr-validator/SKILL.md @@ -0,0 +1,151 @@ +--- +name: pr-validator +description: Validate an agent-created PR against its PBI acceptance criteria. Use during the Monitor phase to check whether a PR satisfies what was requested before human review. Triggers include "validate PR", "check PR quality", "does this PR match the spec". +--- + +# PR Validator + +Validate whether an agent-created PR satisfies its originating PBI's acceptance criteria +and follows project conventions. This runs during the Monitor phase — after the coding +agent creates a PR but before the human reviews it. + +## Purpose + +Save human review time by catching obvious gaps: +- Missing acceptance criteria +- Missing tests +- Convention violations that the agent should have followed +- Scope creep (changes beyond what was requested) + +**This is NOT a full code review.** It's a structured checklist that flags what to look at. 
+ +## Inputs + +- PR number and repo slug +- The PBI that originated the PR (AB# ID or from feature state) + +## Process + +### Step 1: Gather PR Data + +```powershell +gh pr view --repo "" --json title,body,files,additions,deletions,commits,reviews,statusCheckRollup +``` + +Also get the diff stat: +```powershell +gh pr diff --repo "" --stat +``` + +### Step 2: Gather PBI Data + +Read the originating PBI's description from feature state or ADO: + +```powershell +$su = Join-Path $HOME ".feature-orchestrator" "state-utils.js" +node $su get-feature "" +``` + +Find the PBI that matches this PR (by repo + AB# reference in PR title/body). +Extract: +- **Acceptance Criteria** — the checklist from the PBI description +- **Files to Modify/Create** — expected file paths +- **Technical Requirements** — specific implementation guidance +- **Testing** — expected test coverage + +### Step 3: Acceptance Criteria Check + +For each acceptance criterion in the PBI: +1. Search the PR diff for evidence that it's addressed +2. 
Mark as: ✅ Addressed | ⚠️ Partially | ❌ Not found | ❓ Can't determine + +**How to check:** +- If the criterion mentions a specific behavior → look for code implementing it +- If it mentions a specific file → check if that file is in the PR's changed files +- If it mentions tests → check if test files are included +- If it's too abstract to verify from diff alone → mark ❓ + +### Step 4: File Coverage Check + +Compare the PBI's "Files to Modify/Create" against the PR's actual changed files: +- **Expected but not changed** → flag as potential gap +- **Changed but not expected** → flag as potential scope creep (may be fine — dependencies, imports) +- **New files created** → check naming conventions match the repo's patterns + +### Step 5: Convention Check + +Based on the repo's `.github/copilot-instructions.md` (which the agent should have followed), +spot-check: +- **Tests included?** If the PBI specified tests and no test files are in the diff → flag +- **Telemetry?** If the PBI mentioned telemetry/spans and no span-related code is visible → flag +- **Feature flag?** If the PBI mentioned a feature flag and none is visible → flag +- **License headers?** If new files were created, check for headers (don't read every file — just note if new files exist) + +**Do NOT** do a full code review. Don't check variable naming, code style, or logic correctness. +The human reviewer does that. Focus only on structural completeness. 
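The file-coverage comparison in Step 4 can be sketched as follows (hypothetical helper; `expected` comes from the PBI's "Files to Modify" table, `changed` from the PR's file list):

```javascript
// Compare the PBI's expected files against the PR's actual changed files.
// "missing" flags potential gaps; "unexpected" flags potential scope creep
// (which may still be fine — e.g. imports or dependency updates).
function fileCoverage(expected, changed) {
  const changedSet = new Set(changed);
  return {
    missing: expected.filter((f) => !changedSet.has(f)),
    unexpected: changed.filter((f) => !expected.includes(f)),
  };
}
```

Each entry in `missing` or `unexpected` should become a row in the report tables below, never a silent pass.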
+ +### Step 6: CI Status Check + +```powershell +gh pr checks --repo "" +``` + +Report: +- All passing → ✅ +- Some failing → list which checks failed +- Pending → note that CI is still running + +### Step 7: Present Report + +```markdown +## 🔍 PR Validation: # + +**PBI**: AB# +**Repo**: <slug> +**Changes**: +<additions> -<deletions> across <N> files + +### Acceptance Criteria + +| # | Criterion | Status | Evidence | +|---|-----------|--------|----------| +| 1 | [criterion text] | ✅ Addressed | [file or code reference] | +| 2 | [criterion text] | ⚠️ Partial | [what's missing] | +| 3 | [criterion text] | ❌ Not found | — | + +### File Coverage + +| Expected (from PBI) | In PR? | Notes | +|---------------------|--------|-------| +| path/to/File.java | ✅ | Modified | +| path/to/Test.java | ❌ | Not in diff — tests may be missing | + +**Unexpected changes**: [list files changed that weren't in the PBI, if any] + +### Convention Checks + +| Check | Status | +|-------|--------| +| Tests included | ✅ / ❌ | +| Telemetry spans | ✅ / ❌ / N/A | +| Feature flag gating | ✅ / ❌ / N/A | +| CI status | ✅ All passing / ❌ [failures] | + +### Summary + +**Overall**: 🟢 Looks good / 🟡 Review these gaps / 🔴 Significant gaps + +[1-2 sentence summary: what the human reviewer should focus on] +``` + +## When to Run + +- **Automatically**: When the Monitor phase detects a new PR from the coding agent +- **Manually**: When the user says "validate PR" or "check this PR" +- **On refresh**: When the dashboard refreshes PR status and a new open PR is found + +## Important Guidelines + +- **Speed over depth**: This should take <30 seconds. Don't read every line of code. +- **No false confidence**: If you can't verify a criterion from the diff, say ❓ not ✅ +- **Actionable output**: Every ❌ or ⚠️ should tell the human what to look for +- **Don't block**: This is informational. Even if gaps exist, the human decides whether to approve