| Crates.io | prodigy |
| lib.rs | prodigy |
| version | 0.4.4 |
| created_at | 2025-09-01 19:20:40.965681+00 |
| updated_at | 2025-12-27 00:17:20.016437+00 |
| description | Turn ad-hoc Claude sessions into reproducible development pipelines with parallel AI agents |
| homepage | |
| repository | https://github.com/iepathos/prodigy |
| max_upload_size | |
| id | 1820145 |
| size | 13,597,626 |
Transform ad-hoc Claude sessions into reproducible development pipelines with parallel execution, automatic retry, and full state management.
cargo install prodigy
# Clone the repository
git clone https://github.com/iepathos/prodigy
cd prodigy
# Build and install
cargo build --release
cargo install --path .
# Optional: Install man pages
./scripts/install-man-pages.sh
Get up and running in under 5 minutes with these simple examples.
prodigy init
Create a workflow file (fix-tests.yml):
name: fix-failing-tests
steps:
- shell: "cargo test"
on_failure:
claude: "/fix-test-failures"
max_attempts: 3
prodigy run fix-tests.yml
Process multiple files simultaneously with MapReduce:
name: add-documentation
mode: mapreduce
setup:
- shell: "find src -name '*.rs' -type f > files.json"
map:
input: files.json
agent_template:
- claude: "/add-rust-docs ${item}"
max_parallel: 10
reduce:
- claude: "/summarize Documentation added to ${map.successful} files"
Run with:
prodigy run add-documentation.yml
# Run a workflow
prodigy run workflow.yml
# Execute a single command with retries
prodigy exec "claude: /refactor main.rs" --retry 3
# Process files in parallel
prodigy batch "*.py" --command "claude: /add-types" --parallel 5
# Resume an interrupted workflow
prodigy resume workflow-123
# View analytics and costs
prodigy analytics --session abc123
# Manage worktrees (all workflow executions use isolated git worktrees by default)
prodigy worktree ls # List active worktrees
prodigy worktree ls --detailed # Show enhanced session information
prodigy worktree ls --json # Output in JSON format
prodigy worktree ls --detailed --json # Combine detailed info with JSON output
prodigy worktree clean # Clean up inactive worktrees
Prodigy supports reusable workflow templates that can be registered, shared, and invoked with parameters. This enables building a library of common workflows and patterns.
# Initialize a template directory
prodigy template init templates/
# Register a template from a file
prodigy template register my-workflow.yml \
--name refactor-pipeline \
--description "Refactoring workflow with tests" \
--version 1.0.0 \
--tags refactor,testing \
--author "Your Name"
# List all registered templates
prodigy template list
prodigy template list --long # Show detailed information
prodigy template list --tag refactor # Filter by tag
# Show template details
prodigy template show refactor-pipeline
# Search for templates
prodigy template search "refactor"
prodigy template search testing --by-tag
# Validate a template file
prodigy template validate my-workflow.yml
# Delete a template
prodigy template delete refactor-pipeline --force
Templates are standard workflow files with parameter definitions:
name: refactor-with-tests
description: Refactor code and ensure tests pass
version: 1.0.0
parameters:
target:
description: File or directory to refactor
type: string
required: true
test_command:
description: Command to run tests
type: string
default: "cargo test"
commands:
- claude: "/refactor ${target}"
- shell: "${test_command}"
on_failure:
claude: "/fix-test-failures"
max_attempts: 3
Pass parameters via CLI flags or parameter files:
# Pass parameters individually
prodigy run refactor-pipeline.yml \
--param target=src/main.rs \
--param test_command="cargo test --all"
# Pass parameters from a JSON file
prodigy run refactor-pipeline.yml \
--param-file params.json
# Pass parameters from a YAML file
prodigy run refactor-pipeline.yml \
--param-file params.yaml
Parameter file example (params.json):
{
"target": "src/main.rs",
"test_command": "cargo test --all",
"timeout": 300
}
Parameter file example (params.yaml):
target: src/main.rs
test_command: cargo test --all
timeout: 300
Parameter Type Inference:
Parameter values are inferred from their literal form; booleans (true/false) are parsed correctly rather than treated as raw strings.
Parameter Substitution:
Parameters are available in commands using ${parameter_name} syntax:
- ${target} - Direct parameter reference
- ${test_command} - Parameter with a default value
Configure retry behavior with workflow-wide defaults and per-step overrides:
retry_defaults:
attempts: 3
backoff: exponential
initial_delay: 2s
max_delay: 30s
jitter: true
steps:
- shell: "deploy.sh"
retry:
attempts: 5
backoff:
fibonacci:
initial: 1s
retry_on: [network, timeout]
retry_budget: 5m
Define environment variables, dynamically computed values, and secrets at the workflow and step level:
env:
NODE_ENV: production
WORKERS:
command: "nproc"
cache: true
secrets:
API_KEY: ${vault:api/keys/production}
steps:
- shell: "npm run build"
env:
BUILD_TARGET: production
working_dir: ./frontend
Compose workflows from shared imports and reusable templates:
imports:
- path: ./common/base.yml
alias: base
templates:
test-suite:
parameters:
- name: language
type: string
steps:
- shell: "${language} test"
workflows:
main:
extends: base.default
steps:
- use: test-suite
with:
language: cargo
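For context, the main workflow above extends base.default from the imported ./common/base.yml. That file is not shown in this README; a minimal sketch of what it might contain (the exact schema here is an assumption, not confirmed by the Prodigy docs):

```yaml
# ./common/base.yml — hypothetical content (schema assumed for illustration)
workflows:
  default:
    steps:
      - shell: "cargo fmt -- --check"
      - shell: "cargo clippy -- -D warnings"
```

With a base file like this, base.default supplies shared steps and main layers the test-suite template on top.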
Prodigy automatically tracks git changes during workflow execution and provides context variables for accessing file changes, commits, and statistics:
- ${step.files_added} - Files added in the current step
- ${step.files_modified} - Files modified in the current step
- ${step.files_deleted} - Files deleted in the current step
- ${step.files_changed} - All files changed (added + modified + deleted)
- ${step.commits} - Commit hashes created in the current step
- ${step.commit_count} - Number of commits in the current step
- ${step.insertions} - Lines inserted in the current step
- ${step.deletions} - Lines deleted in the current step
- ${workflow.files_added} - All files added across the workflow
- ${workflow.files_modified} - All files modified across the workflow
- ${workflow.files_deleted} - All files deleted across the workflow
- ${workflow.files_changed} - All files changed across the workflow
- ${workflow.commits} - All commit hashes across the workflow
- ${workflow.commit_count} - Total commits across the workflow
- ${workflow.insertions} - Total lines inserted across the workflow
- ${workflow.deletions} - Total lines deleted across the workflow
Variables support pattern filtering using glob patterns:
# Get only markdown files added
- shell: "echo '${step.files_added:*.md}'"
# Get only Rust source files modified
- claude: "/review ${step.files_modified:*.rs}"
# Get specific directory changes
- shell: "echo '${workflow.files_changed:src/*}'"
Control output format with modifiers:
# JSON array format
- shell: "echo '${step.files_added:json}'" # ["file1.rs", "file2.rs"]
# Newline-separated (for scripts)
- shell: "echo '${step.files_added:lines}'" # file1.rs\nfile2.rs
# Comma-separated
- shell: "echo '${step.files_added:csv}'" # file1.rs,file2.rs
# Space-separated (default)
- shell: "echo '${step.files_added}'" # file1.rs file2.rs
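Pattern filters and format modifiers can be chained with colons, as the code-review example below does with ${step.files_added:*.md:lines}. For instance:

```yaml
# Glob filter plus output modifier, chained with colons:
# only modified Rust files, emitted as a JSON array
- shell: "echo '${step.files_modified:*.rs:json}'"
```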
name: code-review-workflow
steps:
# Make changes
- claude: "/implement feature X"
commit_required: true
# Review only the changed Rust files
- claude: "/review-code ${step.files_modified:*.rs}"
# Generate changelog for markdown files
- shell: "echo 'Changed docs:' && echo '${step.files_added:*.md:lines}'"
# Conditional execution based on changes
- shell: "cargo test"
when: "${step.files_modified:*.rs}" # Only run if Rust files changed
# Summary at the end
- claude: |
/summarize-changes
Total files changed: ${workflow.files_changed:json}
Commits created: ${workflow.commit_count}
Lines added: ${workflow.insertions}
Lines removed: ${workflow.deletions}
The write_file command allows workflows to create files with content, supporting multiple formats with validation and automatic formatting.
Basic Syntax:
- write_file:
path: "output/results.txt"
content: "Processing complete!"
format: text # text, json, or yaml
mode: "0644" # Unix permissions (default: 0644)
create_dirs: false # Create parent directories (default: false)
Supported Formats:
- write_file:
path: "logs/build.log"
content: "Build started at ${timestamp}"
format: text
- write_file:
path: "output/results.json"
content: '{"status": "success", "items_processed": ${map.total}}'
format: json
create_dirs: true
- write_file:
path: "config/settings.yml"
content: |
environment: production
server:
port: 8080
host: localhost
format: yaml
Variable Interpolation:
All fields support variable interpolation:
# In MapReduce map phase
- write_file:
path: "output/${item.name}.json"
content: '{"id": "${item.id}", "processed": true}'
format: json
create_dirs: true
# In reduce phase
- write_file:
path: "summary.txt"
content: "Processed ${map.total} items, ${map.successful} successful"
format: text
Security Features:
- Paths are validated to prevent directory traversal (..)
Common Use Cases:
reduce:
- write_file:
path: "results/summary.json"
content: '{"total": ${map.total}, "successful": ${map.successful}, "failed": ${map.failed}}'
format: json
- write_file:
path: ".config/app.yml"
content: |
name: ${PROJECT_NAME}
version: ${VERSION}
features:
- authentication
- caching
format: yaml
- write_file:
path: "scripts/deploy.sh"
content: |
#!/bin/bash
echo "Deploying ${APP_NAME}"
./deploy.sh --env production
mode: "0755"
create_dirs: true
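Because the script above is written with mode 0755 (executable), a subsequent step could run it directly; a sketch reusing the same path as the example:

```yaml
# Run the script generated by the preceding write_file step
- shell: "./scripts/deploy.sh"
```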
Prodigy supports multi-step validation and error recovery with two formats:
Array Format (for simple command sequences):
validate:
- shell: "prep-command-1"
- shell: "prep-command-2"
- claude: "/validate-result"
Object Format (when you need metadata like threshold, max_attempts, etc.):
validate:
commands:
- shell: "prep-command-1"
- shell: "prep-command-2"
- claude: "/validate-result"
result_file: "validation-results.json"
threshold: 75 # Validation must score at least 75/100
on_incomplete:
commands:
- claude: "/fix-gaps --gaps ${validation.gaps}"
- shell: "rebuild-and-revalidate.sh"
max_attempts: 3
fail_workflow: false
Key Points:
- Use the array format for simple command sequences; use the object format when you need threshold, result_file, max_attempts, or fail_workflow
- threshold and max_attempts belong at the config level, not on individual commands
- on_incomplete supports the same two formats (array, or object with commands:)
Example: Multi-step validation workflow
- claude: "/implement-feature spec.md"
commit_required: true
validate:
commands:
- shell: "cargo test"
- shell: "cargo clippy"
- claude: "/validate-implementation spec.md"
result_file: ".prodigy/validation.json"
threshold: 90
on_incomplete:
commands:
- claude: "/fix-issues --gaps ${validation.gaps}"
- shell: "cargo test"
max_attempts: 5
fail_workflow: true
Prodigy looks for configuration in these locations (in order):
- .prodigy/config.yml - Project-specific configuration
- ~/.config/prodigy/config.yml - User configuration
- /etc/prodigy/config.yml - System-wide configuration
Example configuration:
# .prodigy/config.yml
claude:
model: claude-3-opus
max_tokens: 4096
worktree:
max_parallel: 20
cleanup_policy:
idle_timeout: 300
max_age: 3600
retry:
default_attempts: 3
default_backoff: exponential
storage:
events_dir: ~/.prodigy/events
state_dir: ~/.prodigy/state
Fix all test failures automatically with intelligent retry:
name: test-pipeline
steps:
- shell: "cargo test"
on_failure:
- claude: "/analyze-test-failure ${shell.output}"
- claude: "/fix-test-failure"
- shell: "cargo test"
retry:
attempts: 3
backoff: exponential
- shell: "cargo fmt -- --check"
on_failure: "cargo fmt"
- shell: "cargo clippy -- -D warnings"
on_failure:
claude: "/fix-clippy-warnings"
Analyze and improve multiple files concurrently:
name: parallel-analysis
mode: mapreduce
setup:
- shell: |
find . -name "*.rs" -exec wc -l {} + |
sort -rn |
head -20 |
awk '{print $2}' > complex-files.json
map:
input: complex-files.json
agent_template:
- claude: "/analyze-complexity ${item}"
- claude: "/suggest-refactoring ${item}"
- shell: "cargo test --lib $(basename ${item} .rs)"
max_parallel: 10
reduce:
- claude: "/generate-refactoring-report ${map.results}"
- shell: "echo 'Analyzed ${map.total} files, ${map.successful} successful'"
Full documentation is available at https://iepathos.github.io/prodigy
To build the documentation locally:
# Install mdBook
cargo install mdbook
# Serve with live reload
mdbook serve book --open
| Command | Description |
|---|---|
| prodigy run <workflow> | Execute a workflow |
| prodigy exec <command> | Run a single command |
| prodigy batch <pattern> | Process files in parallel |
| prodigy resume <id> | Resume an interrupted workflow |
| prodigy analytics | View session analytics |
| prodigy worktree | Manage git worktrees |
| prodigy init | Initialize Prodigy in a project |
prodigy run workflow.yml --max-parallel 20
prodigy run workflow.yml -v
Note: The -v flag also enables Claude streaming JSON output for debugging Claude interactions.
prodigy analytics --session <session-id>
Prodigy automatically creates checkpoints. To resume:
# List available checkpoints
prodigy checkpoints list
# Resume from latest checkpoint
prodigy resume
# Resume specific workflow
prodigy resume workflow-abc123
Review and reprocess failed items:
# View failed items
prodigy dlq view <job-id>
# Reprocess failed items
prodigy dlq retry <job-id> --max-parallel 5
Check configuration precedence:
# Show effective configuration
prodigy config show
# Validate configuration
prodigy config validate
Install man pages manually:
cd prodigy
./scripts/install-man-pages.sh
# Or install to user directory
./scripts/install-man-pages.sh --user
Enable debug logging:
# Set log level
export RUST_LOG=debug
prodigy run workflow.yml -vv
# View detailed events
prodigy events --job-id <job-id> --verbose
Prodigy provides fine-grained control over Claude interaction visibility:
Default behavior (no flags):
prodigy run workflow.yml
# Shows progress and results, but no Claude JSON streaming output
Verbose mode (-v):
prodigy run workflow.yml -v
# Shows Claude streaming JSON output for debugging interactions
Debug mode (-vv) and trace mode (-vvv):
prodigy run workflow.yml -vv
prodigy run workflow.yml -vvv
# Also shows Claude streaming output plus additional internal logs
Force Claude output (environment override):
PRODIGY_CLAUDE_CONSOLE_OUTPUT=true prodigy run workflow.yml
# Shows Claude streaming output regardless of verbosity level
This allows you to keep normal runs clean while enabling detailed debugging when needed.
We welcome contributions! Please see our Contributing Guide for details.
# Fork and clone the repository
git clone https://github.com/YOUR-USERNAME/prodigy
cd prodigy
# Set up development environment
cargo build
cargo test
# Run with verbose output
RUST_LOG=debug cargo run -- run test.yml
# Before submitting PR
cargo fmt
cargo clippy -- -D warnings
cargo test
Prodigy is licensed under MIT. See LICENSE for details.
Prodigy builds on the shoulders of giants:
Special thanks to all contributors who have helped make Prodigy better!
Made with ❤️ by developers, for developers
Features • Quick Start • Docs • Contributing