# SCUD CLI (Rust)
Fast, simple task master for AI-driven development - Rust implementation.
## Overview
This is a high-performance Rust rewrite of the SCUD task management system. It replaces the external task-master CLI with a fast, single-binary solution that:
- ⚡ 50x faster startup time (~10ms vs ~500ms)
- 🎯 42x token reduction (~500 tokens vs ~21k tokens per operation)
- 📦 Simple distribution - single binary, no dependencies
- 🔧 Direct LLM integration - no MCP overhead
## Architecture

```
scud (Rust Binary)
├── Core Commands (No AI - Instant)
│   ├── init                 # Initialize .scud/ and install agents
│   ├── tags                 # List tags
│   ├── use-tag              # Switch active tag
│   ├── list                 # List tasks with filters
│   ├── view                 # Open interactive HTML viewer in browser
│   ├── show                 # Show task details
│   ├── set-status           # Update task status
│   ├── next                 # Find next available task (--claim for dynamic-wave)
│   ├── stats                # Show statistics
│   └── doctor               # [EXPERIMENTAL] Diagnose stuck states
│
├── AI Commands (Direct Anthropic API)
│   ├── parse-prd            # Parse PRD markdown into tasks
│   ├── analyze-complexity   # Analyze task complexity
│   ├── expand               # Break down complex tasks
│   └── research             # AI-powered research
│
└── Storage (SCG)
    └── .scud/tasks/tasks.scg
```
## Installation

### Option 1: npm (Recommended)

Installing globally via npm builds the Rust binary automatically as part of `npm install`.

Requirements: the Rust toolchain must be installed ([rustup.rs](https://rustup.rs)).

### Option 2: Cargo

Install directly via `cargo install`, or build from source:

```bash
cargo build --release
```

### Verify Installation

Run the binary with no arguments, or any core command (e.g. `scud stats`), to confirm it is on your `PATH`.
## Building (Development)

### Debug Build

```bash
cargo build
```

### Release Build

```bash
cargo build --release
```
## Usage

### Core Commands

```bash
# Initialize SCUD
scud init

# List tags
scud tags

# Switch to a tag
scud use-tag <name>

# List tasks
scud list

# Show task details
scud show <id>

# Update task status
scud set-status <id> done

# Find next available task
scud next

# Show statistics
scud stats

# Open interactive task viewer in browser
scud view
```
### [EXPERIMENTAL] Dynamic-Wave Mode

Dynamic-wave mode allows agents to auto-claim tasks and maintain workflow health:

```bash
# Find and auto-claim the next available task
scud next --claim

# Release all tasks claimed by an agent
```

**IMPORTANT**: When using `--claim`, agents MUST run `scud set-status <id> done` when finishing a task. This ensures dependent tasks become unblocked.
### [EXPERIMENTAL] Doctor Command

Diagnose stuck workflow states:

```bash
# Check for issues in all tags
scud doctor

# Check specific tag with custom stale threshold

# Auto-fix recoverable issues (stale locks, orphan tasks)
```

The `doctor` command detects:
- Stale locks (tasks locked >24h by default)
- Tasks blocked by cancelled/missing dependencies
- Orphan in-progress tasks (not locked, stale)
- Missing active tag
- Corrupt storage files
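The stale-lock check above boils down to an elapsed-time comparison against a threshold. The following is an illustrative sketch, not scud's actual doctor code; the function name and signature are assumptions:

```rust
use std::time::{Duration, SystemTime};

/// Returns true when a lock is older than `threshold` and should be
/// treated as stale (the default threshold documented above is 24h).
/// Illustrative only; not scud's real implementation.
fn is_stale(locked_at: SystemTime, now: SystemTime, threshold: Duration) -> bool {
    match now.duration_since(locked_at) {
        Ok(elapsed) => elapsed > threshold,
        // A lock timestamp in the future is clock skew, not staleness.
        Err(_) => false,
    }
}

fn main() {
    let now = SystemTime::now();
    let day = Duration::from_secs(24 * 60 * 60);
    let old_lock = now - (day + Duration::from_secs(1));
    assert!(is_stale(old_lock, now, day)); // locked >24h ago: stale
    assert!(!is_stale(now, now, day));     // just locked: fresh
    println!("stale-lock sketch ok");
}
```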
### AI Commands

Requires: an API key environment variable (see [Provider Configuration](#provider-configuration)).

```bash
# Parse PRD into tasks
scud parse-prd <prd-file>

# Analyze complexity
scud analyze-complexity

# Expand complex tasks
scud expand <id>

# Research a topic
scud research "<topic>"
```
## Performance Comparison
| Operation | Old (task-master) | New (Rust) | Improvement |
|---|---|---|---|
| Startup | ~500ms | ~10ms | 50x faster |
| List tasks | ~100ms | ~5ms | 20x faster |
| Parse PRD | ~3-5s | ~2-3s | ~40% faster |
| Token overhead | ~21k | ~500 | 42x reduction |
## Provider Configuration
SCUD supports multiple LLM providers: xAI (Grok), Anthropic (Claude), OpenAI (GPT), and OpenRouter.
### Quick Start

`scud init` can configure a provider at initialization: point it at xAI (Grok, recommended for fast code generation) or Anthropic (Claude), or run it interactively to be prompted for a provider.
### Configuration File

The configuration is stored in `.scud/config.toml`. A representative layout (the section and key names shown are illustrative; the values are the project defaults):

```toml
[llm]
provider = "xai"
model = "grok-code-fast-1"
max_tokens = 4096
```
For complete provider documentation, see `PROVIDERS.md`.
### Supported Providers

| Provider | Environment Variable | Default Model |
|---|---|---|
| xAI | `XAI_API_KEY` | `grok-code-fast-1` |
| Anthropic | `ANTHROPIC_API_KEY` | `claude-sonnet-4-20250514` |
| OpenAI | `OPENAI_API_KEY` | `gpt-4-turbo` |
| OpenRouter | `OPENROUTER_API_KEY` | `anthropic/claude-sonnet-4` |
## Data Models

### Task
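The struct behind this model is not shown in this README; the following is a minimal sketch of what `src/models/task.rs` plausibly defines, with field and variant names assumed from the commands above (`set-status`, `next`, dependency unblocking):

```rust
// Illustrative sketch only: the real src/models/task.rs may differ.
#[derive(Debug, Clone, PartialEq)]
pub enum Status {
    Pending,
    InProgress,
    Done,
    Cancelled,
}

#[derive(Debug, Clone)]
pub struct Task {
    pub id: u32,
    pub title: String,
    pub description: String,
    pub status: Status,
    pub dependencies: Vec<u32>, // ids of tasks that must be Done first
}

impl Task {
    /// A task is available when it is pending and every dependency is
    /// already done (the rule `next` relies on).
    pub fn is_available(&self, done_ids: &[u32]) -> bool {
        self.status == Status::Pending
            && self.dependencies.iter().all(|d| done_ids.contains(d))
    }
}

fn main() {
    let t = Task {
        id: 2,
        title: "Wire up storage".into(),
        description: String::new(),
        status: Status::Pending,
        dependencies: vec![1],
    };
    assert!(!t.is_available(&[])); // blocked: dependency 1 not done
    assert!(t.is_available(&[1])); // unblocked once 1 is Done
    println!("task model sketch ok");
}
```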
### Phase
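The `Phase` model is not described anywhere in this README, so the shape below is only a guess: a named grouping of task ids, suggested by `src/models/phase.rs` sitting next to `task.rs`. Every field name here is an assumption:

```rust
// Purely illustrative: the actual Phase fields are not documented here.
#[derive(Debug, Clone)]
pub struct Phase {
    pub name: String,
    pub task_ids: Vec<u32>, // tasks belonging to this phase
}

fn main() {
    let p = Phase { name: "wave-1".into(), task_ids: vec![1, 2, 3] };
    assert_eq!(p.task_ids.len(), 3);
    println!("phase sketch: {}", p.name);
}
```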
### Config

The `Config` model mirrors `.scud/config.toml` (key names illustrative, values are the project defaults):

```toml
[llm]
provider = "xai"
model = "grok-code-fast-1"
max_tokens = 4096
```
## LLM Integration

### Direct Anthropic API
- No MCP server overhead
- Simple HTTP requests
- Minimal token usage
- Fast response times
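The "simple HTTP requests" point amounts to POSTing a small JSON body to Anthropic's Messages API. As a rough sketch, that body can be assembled without any JSON library (the field names `model`, `max_tokens`, and `messages` are Anthropic's public API; scud's real `src/llm/client.rs` is not shown here and presumably uses a proper JSON crate):

```rust
// Sketch of an Anthropic Messages API request body, built by hand.
// Illustrative only; a real client must fully JSON-escape the prompt.
fn messages_body(model: &str, max_tokens: u32, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","max_tokens":{},"messages":[{{"role":"user","content":"{}"}}]}}"#,
        model,
        max_tokens,
        prompt.replace('"', "\\\"") // minimal escaping for the demo
    )
}

fn main() {
    let body = messages_body("claude-sonnet-4-20250514", 4096, "Say hi");
    assert!(body.contains("\"max_tokens\":4096"));
    assert!(body.contains("Say hi"));
    println!("{}", body);
}
```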
### Prompt Templates

Located in `src/llm/prompts.rs`:

- `parse_prd()` - Converts markdown to structured tasks
- `analyze_complexity()` - Scores task difficulty
- `expand_task()` - Breaks down complex tasks
- `research_topic()` - AI research assistant
## Integration with SCUD

The Rust CLI integrates seamlessly with the existing SCUD system:

- `bin/scud.js` detects and delegates to the Rust binary
- Falls back to the debug build if a release build is not available
- Auto-builds if the binary is not found
- All agents and slash commands work unchanged
## Development

### Project Structure

```
scud-cli/
├── Cargo.toml
├── src/
│   ├── main.rs                  # CLI entry point
│   ├── commands/
│   │   ├── mod.rs
│   │   ├── init.rs              # Core commands
│   │   ├── tags.rs
│   │   ├── ...
│   │   └── ai/                  # AI commands
│   │       ├── parse_prd.rs
│   │       ├── analyze_complexity.rs
│   │       ├── expand.rs
│   │       └── research.rs
│   ├── models/
│   │   ├── task.rs
│   │   └── phase.rs
│   ├── storage/
│   │   └── mod.rs               # JSON I/O
│   └── llm/
│       ├── client.rs            # Anthropic API
│       └── prompts.rs           # Prompt templates
```
### Adding New Commands

- Add the command to the `Commands` enum in `main.rs`
- Create a handler in `src/commands/`
- Add the module to `src/commands/mod.rs`
- Update the help text
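The steps above can be sketched as follows. This is a stripped-down stand-in for `main.rs`, not the actual code (the real CLI likely uses an argument-parsing crate, and these handler names are illustrative):

```rust
// Minimal sketch of the Commands enum + dispatch pattern described above.
enum Commands {
    Init,
    List,
    Stats,
    Doctor, // step 1: add the new variant to the enum
}

// steps 2-3: each arm routes to a handler that would live in src/commands/
fn run(cmd: Commands) -> &'static str {
    match cmd {
        Commands::Init => "init",
        Commands::List => "list",
        Commands::Stats => "stats",
        Commands::Doctor => "doctor", // route the new variant to its handler
    }
}

fn main() {
    assert_eq!(run(Commands::Doctor), "doctor");
    println!("dispatch sketch ok");
}
```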
### Adding New LLM Prompts

- Add a prompt function to `src/llm/prompts.rs`
- Create a command handler in `src/commands/ai/`
- Use `LLMClient::complete()` or `complete_json()`
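A prompt template in `src/llm/prompts.rs` is, at heart, a function from input text to a prompt string. The sketch below is hypothetical; the real `parse_prd()` signature is not documented in this README:

```rust
// Hypothetical shape of a prompt-template function like parse_prd().
fn parse_prd_prompt(prd_markdown: &str) -> String {
    format!(
        "Convert the following PRD into a JSON array of tasks, each with \
         an id, title, description, and dependencies:\n\n{prd_markdown}"
    )
}

fn main() {
    let prompt = parse_prd_prompt("# Feature: offline mode");
    assert!(prompt.contains("# Feature: offline mode"));
    assert!(prompt.starts_with("Convert"));
    println!("prompt sketch ok");
}
```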
## Testing

```bash
# Build and test
cargo build && cargo test

# Test specific command
cargo run -- list

# Test AI commands (requires API key)
cargo run -- research "<topic>"
```
## Distribution

### As Standalone Binary

```bash
cargo build --release
# Binary: target/release/scud
# Copy to /usr/local/bin or similar
cp target/release/scud /usr/local/bin/
```
### Via npm Package

The npm package builds the Rust binary during installation:

- Runs `cargo build --release` during `npm install`
- The binary is placed in the `bin/` directory
- `bin/scud.js` is a thin wrapper that executes the binary
- Requires the Rust toolchain to be installed
## Future Enhancements
- Cross-compilation for multiple platforms
- Pre-built binaries in npm package (eliminate Rust requirement)
- Task export/import
- Custom prompt templates
- Integration tests with real API calls
## License

MIT

## Contributing

See the main SCUD repository for contribution guidelines.