librenotes/.wave/pipelines/doc-loop.yaml
Michael Czechowski fc24f9a8ab Add Wave general-purpose pipelines
ADR, changelog, code-review, debug, doc-sync, explain, feature,
hotfix, improve, onboard, plan, prototype, refactor, security-scan,
smoke-test, speckit-flow, supervise, test-gen, and more.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 17:02:36 +01:00


kind: WavePipeline
metadata:
name: doc-loop
description: Pre-PR documentation consistency gate — scans changes, cross-references docs, and creates a GitHub issue with inconsistencies
release: false
input:
source: cli
example: "full"
schema:
type: string
description: "Scan scope: empty for branch diff, 'full' for all files, or a git ref"
steps:
- id: scan-changes
persona: navigator
workspace:
type: worktree
branch: "{{ pipeline_id }}"
exec:
type: prompt
source: |
Scan the repository to identify changed files and capture the current documentation state.
## Determine Scan Scope
Input: {{ input }}
- If the input is empty or blank: use `git log --name-status main...HEAD` to find files changed on the current branch vs main.
- If the input is "full": skip the diff — treat ALL files as in-scope and scan all documentation.
- Otherwise, treat the input as a git ref and use `git log --name-status <input>...HEAD`.
Run `git log --oneline --name-status` with the appropriate range to get the list of changed files.
If no commits are found (e.g. on main with no branch divergence), fall back to `git status --porcelain` for uncommitted changes.
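For example, the scope resolution above can be sketched in shell (a sketch only; `SCAN_INPUT` is a stand-in for the rendered `{{ input }}` value, and the snippet prints the command it would run rather than executing it):

```shell
#!/usr/bin/env bash
# Resolve the git range for the scan (sketch; SCAN_INPUT stands in for {{ input }}).
resolve_range() {
  local input="$1"
  if [ -z "$input" ]; then
    echo "main...HEAD"            # default: branch diff against main
  elif [ "$input" = "full" ]; then
    echo ""                       # empty range signals a full scan
  else
    echo "${input}...HEAD"        # treat the input as a git ref
  fi
}

range="$(resolve_range "${SCAN_INPUT:-}")"
if [ -n "$range" ]; then
  echo "git log --oneline --name-status $range"
else
  echo "git ls-files"             # full scan: every tracked file is in scope
fi
```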
## Categorize Changed Files
Sort each changed file into one of these categories:
- **source_code**: source files matching the project language (excluding test files)
- **tests**: test files (files with test/spec in name or in test directories)
- **documentation**: markdown files, doc directories, README, CONTRIBUTING, CHANGELOG
- **configuration**: config files, schema files, environment configs
- **build**: build scripts, CI/CD configs, Makefiles, Dockerfiles
- **other**: everything else
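A rough shell categorizer for the buckets above (the glob patterns are illustrative, not a fixed rule set; order matters, so test patterns come before source patterns):

```shell
# Sort a file path into one of the categories above (sketch; patterns are illustrative).
categorize() {
  case "$1" in
    *_test.*|*.test.*|*spec*|test/*|tests/*)            echo "tests" ;;
    *.md|docs/*|doc/*|README*|CONTRIBUTING*|CHANGELOG*) echo "documentation" ;;
    Makefile|Dockerfile*|.github/*|*.mk|Jenkinsfile)    echo "build" ;;
    *.yaml|*.yml|*.toml|*.ini|.env*)                    echo "configuration" ;;
    *.go|*.py|*.rs|*.ts|*.js|*.java)                    echo "source_code" ;;
    *)                                                  echo "other" ;;
  esac
}
```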
## Read Documentation Surface Area
Discover and read key documentation files. Common locations include:
- Project root: README.md, CONTRIBUTING.md, CHANGELOG.md
- Documentation directories: docs/, doc/, wiki/
- Configuration docs: any files documenting config options or environment variables
- CLI/API docs: any files documenting commands, endpoints, or public interfaces
Adapt your scan to the actual project structure — do not assume a fixed layout.
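One way to sketch the discovery step (depth limit and name patterns are assumptions to adapt per repository):

```shell
# List candidate documentation files under a root directory (sketch).
list_docs() {
  find "$1" -maxdepth 3 -type f \
    \( -iname 'readme*' -o -iname 'contributing*' -o -iname 'changelog*' \
       -o -path '*/docs/*' -o -path '*/doc/*' -o -path '*/wiki/*' \) \
    2>/dev/null | sort
}
```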
## Output
Write your findings as structured JSON.
Include:
- scan_scope: mode ("diff" or "full"), range used, base_ref
- changed_files: total_count + categories object with arrays of file paths
- documentation_snapshot: array of {path, exists, summary} for each doc file
- timestamp: current ISO 8601 timestamp
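A hypothetical shape for `scan-results.json` (field names follow the list above; all values and paths are illustrative, and the authoritative contract lives in `.wave/contracts/doc-scan-results.schema.json`):

```json
{
  "scan_scope": { "mode": "diff", "range": "main...HEAD", "base_ref": "main" },
  "changed_files": {
    "total_count": 3,
    "categories": {
      "source_code": ["cmd/root.go"],
      "documentation": ["README.md"],
      "configuration": ["config.yaml"]
    }
  },
  "documentation_snapshot": [
    { "path": "README.md", "exists": true, "summary": "Project overview and CLI usage" }
  ],
  "timestamp": "2026-02-25T17:02:36+01:00"
}
```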
output_artifacts:
- name: scan-results
path: .wave/output/scan-results.json
type: json
handover:
contract:
type: json_schema
source: .wave/output/scan-results.json
schema_path: .wave/contracts/doc-scan-results.schema.json
on_failure: retry
max_retries: 2
- id: analyze-consistency
persona: reviewer
dependencies: [scan-changes]
memory:
inject_artifacts:
- step: scan-changes
artifact: scan-results
as: scan
workspace:
type: worktree
branch: "{{ pipeline_id }}"
exec:
type: prompt
source: |
Analyze documentation consistency by cross-referencing code changes with documentation.
## Cross-Reference Checks
For each category of changed files, perform these checks:
**CLI/API surface** (changed command or endpoint files):
- Compare command definitions, endpoints, or public interfaces against documentation
- Check for new, removed, or changed options/parameters
- Verify documented examples still work
**Configuration** (changed config schemas or parsers):
- Compare documented options against actual config structure
- Check for undocumented settings or environment variables
**Source code** (changed source files):
- Check for new exported functions/types that might need API docs
- Look for stale code comments referencing removed features
- Verify public API descriptions in docs match actual behavior
**Environment variables**:
- Scan source code for environment variable access patterns
- Compare against documentation
- Flag undocumented environment variables
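A sketch of the environment-variable scan for a Go codebase (`os.Getenv` is one access pattern; adapt the regex to the project's language):

```shell
# Extract environment variable names read via os.Getenv in a file (sketch).
extract_env_vars() {
  grep -oE 'os\.Getenv\("[A-Z_]+"\)' "$1" | grep -oE '"[A-Z_]+"' | tr -d '"' | sort -u
}
```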
## Severity Rating
Rate each inconsistency:
- **CRITICAL**: Feature exists in code but completely missing from docs, or docs describe non-existent feature
- **HIGH**: Incorrect information in docs (wrong flag name, wrong description, wrong behavior)
- **MEDIUM**: Outdated information (stale counts, missing new options, incomplete lists)
- **LOW**: Minor style issues, slightly imprecise wording
## Output
Write your analysis as structured JSON.
Include:
- summary: total_count, by_severity counts, clean (true if zero inconsistencies)
- inconsistencies: array of {id (DOC-001 format), severity, category, title, description, source_location, doc_location, fix_description}
- timestamp: current ISO 8601 timestamp
If no inconsistencies are found, output an empty array with clean=true.
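The `by_severity` counts can be sanity-checked with a quick grep over the report file (this assumes pretty-printed JSON with one `"severity": "LEVEL"` pair per line, which is a formatting assumption):

```shell
# Tally findings per severity level in the report JSON (sketch).
tally_severity() {
  local file="$1" level
  for level in CRITICAL HIGH MEDIUM LOW; do
    printf '%s %s\n' "$level" "$(grep -c "\"severity\": \"$level\"" "$file" || true)"
  done
}
```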
output_artifacts:
- name: consistency-report
path: .wave/output/consistency-report.json
type: json
handover:
contract:
type: json_schema
source: .wave/output/consistency-report.json
schema_path: .wave/contracts/doc-consistency-report.schema.json
on_failure: retry
max_retries: 2
- id: compose-report
persona: navigator
dependencies: [analyze-consistency]
memory:
inject_artifacts:
- step: analyze-consistency
artifact: consistency-report
as: report
workspace:
type: worktree
branch: "{{ pipeline_id }}"
exec:
type: prompt
source: |
Compose a documentation consistency report as a GitHub-ready markdown file.
## Check for Inconsistencies
If the consistency report has `summary.clean == true` (zero inconsistencies):
- Write a short "No inconsistencies found" message as the report
- Write the issue result with skipped=true and reason="clean"
## Compose the Report
Write the report as markdown:
```
## Documentation Consistency Report
**Scan date**: <timestamp from report>
**Inconsistencies found**: <total_count>
### Summary by Severity
| Severity | Count |
|----------|-------|
| Critical | N |
| High | N |
| Medium | N |
| Low | N |
### Task List
For each inconsistency (sorted by severity, critical first):
- [ ] **[DOC-001]** (CRITICAL) Title here — `doc_location`
Fix: fix_description
---
*Generated by [Wave](https://github.com/re-cinq/wave) doc-loop pipeline*
```
output_artifacts:
- name: report
path: .wave/output/report.md
type: markdown
- id: publish
persona: craftsman
dependencies: [compose-report]
memory:
inject_artifacts:
- step: compose-report
artifact: report
as: report
workspace:
type: worktree
branch: "{{ pipeline_id }}"
exec:
type: prompt
source: |
PUBLISH — create a GitHub issue from the documentation report.
If the report says "No inconsistencies found", skip issue creation and exit.
## Detect Repository
Run: `gh repo view --json nameWithOwner --jq .nameWithOwner`
## Create the Issue
```bash
gh issue create \
--title "docs: documentation consistency report" \
--body-file .wave/artifacts/report \
--label "documentation"
```
If the `documentation` label doesn't exist, retry the same command without the `--label` flag.
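The label fallback can be made explicit as a sketch (flags mirror the command above; the issue URL in the usage comment is hypothetical):

```shell
# Create the issue, retrying without the label if gh rejects it (sketch).
create_issue() {
  gh issue create --title "$1" --body-file "$2" --label documentation \
    || gh issue create --title "$1" --body-file "$2"
}
```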
If any `gh` command fails, log the error and continue.
## Capture Result
Write a JSON status report. Include `issue_url` (the pipeline outcome extracts it via `.issue_url`), plus `skipped` and `reason` when issue creation was skipped.
output_artifacts:
- name: issue-result
path: .wave/output/issue-result.json
type: json
handover:
contract:
type: json_schema
source: .wave/output/issue-result.json
schema_path: .wave/contracts/doc-issue-result.schema.json
must_pass: true
on_failure: retry
max_retries: 2
outcomes:
- type: issue
extract_from: .wave/output/issue-result.json
json_path: .issue_url
label: "Documentation Issue"