# Ralphloop
<div align="center">
_A powerful CLI tool for creating and running Ralph loops with adaptive LLM integration_
</div>
## 🚀 Overview
Ralphloop is a Rust-based command-line tool for creating and executing **Ralph loops**: structured, iterative workflows that use Large Language Models (LLMs) to automate complex reasoning tasks. Whether you're doing code review, content creation, research, or problem-solving, Ralphloop provides a flexible framework for building repeatable AI-powered workflows.
### ✨ Key Features
- 🤖 **Multi-Provider Support**: Works with OpenAI, Anthropic, Google Gemini, and local LLMs
- 📋 **Template System**: Pre-built templates for common workflows
- 🔧 **Configuration Management**: Secure, flexible configuration with multiple profiles
- 🎯 **Variable Management**: Dynamic variable substitution and context passing
- ⚡ **Async Execution**: Fast, non-blocking execution with proper error handling
- 🛡️ **Type Safety**: Built with Rust for memory safety and performance
- 📦 **Easy Distribution**: Single binary with no runtime dependencies
## 📦 Installation
### From Crates.io (Recommended)
```bash
cargo install ralphloop
```
### From Source
```bash
git clone https://github.com/yingkitw/ralphloop.git
cd ralphloop
cargo build --release
```
The binary will be available at `target/release/ralphloop`.
## 🚀 Quick Start
### 1. Configure Your LLM Provider
First, set up your preferred LLM provider:
```bash
# OpenAI
ralphloop configure --provider openai --api-key your-openai-api-key --model gpt-3.5-turbo

# Anthropic Claude
ralphloop configure --provider anthropic --api-key your-anthropic-api-key --model claude-3-sonnet-20240229

# Local LLM (Ollama)
ralphloop configure --provider local --model llama2 --local-endpoint http://localhost:11434
```
### 2. Create Your First Ralph Loop
Create a loop from a template:
```bash
# Create a code review loop
ralphloop create my-code-review --template code-review
```
Or create a custom loop manually:
```bash
ralphloop create my-custom-loop
```
### 3. Run Your Loop
```bash
# Run once
ralphloop run my-code-review.json

# Run multiple iterations
ralphloop run my-code-review.json --iterations 5
```
## 📖 Usage Guide
### Configuration
#### Setting Up Providers
Each LLM provider requires specific configuration:
**OpenAI**
```bash
ralphloop configure --provider openai --api-key sk-... --model gpt-4
```
**Anthropic**
```bash
ralphloop configure --provider anthropic --api-key sk-ant-... --model claude-3-opus-20240229
```
**Google Gemini**
```bash
ralphloop configure --provider gemini --api-key your-gemini-api-key --model gemini-pro
```
**Local LLM (Ollama/Llama.cpp)**
```bash
ralphloop configure --provider local --model llama2 --local-endpoint http://localhost:11434
```
#### Multiple Profiles
You can create multiple configuration profiles:
```bash
# Create a development profile
ralphloop configure --profile dev --provider openai --api-key dev-key --model gpt-3.5-turbo

# Use a specific profile
ralphloop --profile dev run my-loop.json
```
### Loop Creation
#### Using Templates
Ralphloop ships with several built-in templates:
```bash
ralphloop create my-content --template content-creation
ralphloop create my-research --template research
ralphloop create my-story --template creative-writing
ralphloop create my-solution --template problem-solving
```
#### Available Templates
| Template | Description | Best For |
|----------|-------------|----------|
| `code-review` | Automated code analysis and improvement suggestions | Development workflows |
| `content-creation` | Structured content generation with refinement | Blog posts, documentation |
| `research` | Systematic information gathering and synthesis | Academic or market research |
| `creative-writing` | Story development with narrative structure | Creative projects |
| `problem-solving` | Structured approach to complex problems | Decision making |
#### Custom Loop Structure
Ralph loops are defined in JSON format:
```json
{
  "name": "my-loop",
  "description": "A sample Ralph loop",
  "steps": [
    {
      "name": "analyze",
      "prompt_template": "Analyze: {{input}}",
      "output_variable": "analysis",
      "depends_on": ["input"]
    },
    {
      "name": "summarize",
      "prompt_template": "Summarize: {{analysis}}",
      "output_variable": "summary",
      "depends_on": ["analysis"]
    }
  ],
  "context": {},
  "variables": {
    "input": "Your content here"
  }
}
```
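On the Rust side, this JSON shape maps naturally onto plain data types. The sketch below is illustrative only, with field names mirroring the JSON keys above; the actual definitions live in `src/ralph_loop.rs` and would typically derive serde's `Serialize`/`Deserialize` for the JSON round-trip:

```rust
use std::collections::HashMap;

// Hypothetical mirrors of the JSON loop format; field names match the
// keys shown above. The real structs would also derive serde traits.
struct Step {
    name: String,
    prompt_template: String,
    output_variable: String,
    depends_on: Vec<String>,
}

struct RalphLoop {
    name: String,
    description: String,
    steps: Vec<Step>,
    context: HashMap<String, String>,
    variables: HashMap<String, String>,
}

fn sample_loop() -> RalphLoop {
    RalphLoop {
        name: "my-loop".to_string(),
        description: "A sample Ralph loop".to_string(),
        steps: vec![Step {
            name: "analyze".to_string(),
            prompt_template: "Analyze: {{input}}".to_string(),
            output_variable: "analysis".to_string(),
            depends_on: vec!["input".to_string()],
        }],
        context: HashMap::new(),
        variables: HashMap::from([("input".to_string(), "Your content here".to_string())]),
    }
}

fn main() {
    let l = sample_loop();
    println!("{} has {} step(s)", l.name, l.steps.len()); // prints "my-loop has 1 step(s)"
}
```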
### Loop Execution
#### Basic Execution
```bash
ralphloop run my-loop.json
```
#### Advanced Options
```bash
# Run multiple iterations
ralphloop run my-loop.json --iterations 5

# Run with a specific profile
ralphloop --profile production run my-loop.json

# Dry run (show prompts without executing)
ralphloop run my-loop.json --dry-run
```
#### Variable Substitution
Ralphloop supports dynamic variable substitution:
- `{{variable}}`: Use variables from the `variables` section
- `{{context.key}}`: Use context from previous step outputs
- `{{iteration}}`: Current iteration number (for multi-iteration runs)
- `{{timestamp}}`: Current timestamp
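The `{{variable}}` form is plain string templating. A minimal sketch of the idea, illustrative rather than the actual engine (which also resolves `context.*`, `iteration`, and `timestamp`):

```rust
use std::collections::HashMap;

// Illustrative sketch only: replace each `{{name}}` placeholder in the
// template with the matching value from `vars`; unknown placeholders
// are left untouched.
fn substitute(template: &str, vars: &HashMap<String, String>) -> String {
    let mut out = template.to_string();
    for (name, value) in vars {
        // `{{{{` and `}}}}` are escaped literal braces in format!
        out = out.replace(&format!("{{{{{name}}}}}"), value);
    }
    out
}

fn main() {
    let vars = HashMap::from([("topic".to_string(), "Rust".to_string())]);
    // prints "Create outline for: Rust"
    println!("{}", substitute("Create outline for: {{topic}}", &vars));
}
```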
## 🏗️ Architecture
### Core Components
```
ralphloop/
├── src/
│   ├── main.rs          # CLI entry point and command handling
│   ├── ralph_loop.rs    # Core loop structure and execution engine
│   ├── llm.rs           # LLM provider abstractions and implementations
│   ├── config.rs        # Configuration management
│   └── templates.rs     # Built-in templates
└── examples/            # Example loops and configurations
```
### Key Concepts
1. **Ralph Loop**: A structured workflow with steps that execute sequentially based on dependencies
2. **Step**: Individual unit of work that calls an LLM with a specific prompt
3. **Variable**: Named values that can be passed between steps
4. **Template**: Predefined loop structure for common use cases
5. **Provider**: LLM service implementation (OpenAI, Anthropic, etc.)
## 🧪 Examples
### Code Review Loop
```json
{
  "name": "code-review",
  "description": "Automated code review workflow",
  "steps": [
    {
      "name": "analyze_code",
      "prompt_template": "Analyze this code for security issues:\n\n{{code}}",
      "output_variable": "security_analysis",
      "depends_on": ["code"]
    },
    {
      "name": "suggest_improvements",
      "prompt_template": "Based on: {{security_analysis}}\n\nSuggest improvements.",
      "output_variable": "improvements",
      "depends_on": ["security_analysis"]
    }
  ],
  "variables": {
    "code": "fn main() { println!(\"Hello\"); }"
  }
}
```
### Content Creation Loop
```json
{
  "name": "blog-post",
  "description": "Blog post creation workflow",
  "steps": [
    {
      "name": "outline",
      "prompt_template": "Create outline for: {{topic}}\nTarget: {{audience}}",
      "output_variable": "outline",
      "depends_on": ["topic", "audience"]
    },
    {
      "name": "draft",
      "prompt_template": "Write draft based on: {{outline}}",
      "output_variable": "draft",
      "depends_on": ["outline"]
    },
    {
      "name": "refine",
      "prompt_template": "Improve this draft: {{draft}}",
      "output_variable": "final_post",
      "depends_on": ["draft"]
    }
  ],
  "variables": {
    "topic": "Benefits of Rust programming",
    "audience": "Software developers"
  }
}
```
## 🔧 Configuration
### Configuration File Location
- **macOS**: `~/Library/Application Support/com.ralph.ralphloop/config.json`
- **Linux**: `~/.local/share/com.ralph.ralphloop/config.json`
- **Windows**: `%APPDATA%\com.ralph\ralphloop\config.json`
### Environment Variables
You can override configuration with environment variables:
```bash
export RALPH_API_KEY=your-api-key
export RALPH_MODEL=gpt-4
export RALPH_PROVIDER=openai
ralphloop run my-loop.json
```
## 🧪 Testing
### Running Tests
```bash
# Run all tests
cargo test
# Run with coverage
cargo tarpaulin --out Html
# Run integration tests
cargo test --test integration
```
### Mock Mode
For testing without API calls:
```bash
ralphloop configure --provider mock
ralphloop run my-loop.json  # Will use mock responses
```
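The mock provider fits the same abstraction as the real backends. A hypothetical sketch of how such a provider trait might look (names are illustrative, not the project's actual API; the real trait in `src/llm.rs` would be async and return a richer error type):

```rust
// Hypothetical provider abstraction; illustrative only.
trait LlmProvider {
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

// A stand-in like the one `--provider mock` enables: it returns a
// canned response instead of calling a network API.
struct MockProvider;

impl LlmProvider for MockProvider {
    fn complete(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("[mock response to: {prompt}]"))
    }
}

fn main() {
    let provider = MockProvider;
    println!("{}", provider.complete("Summarize: hello").unwrap());
}
```

Swapping providers behind a trait like this is what lets loops run unchanged against OpenAI, Anthropic, a local endpoint, or the mock.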
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.
### Development Setup
```bash
git clone https://github.com/yourusername/ralphloop.git
cd ralphloop
cargo build
cargo test
```
### Code Style
```bash
# Format code
cargo fmt
# Run clippy
cargo clippy -- -D warnings
```
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- The Rust community for excellent tooling and libraries
- OpenAI, Anthropic, and other LLM providers for their APIs
- The concept of "Ralph loops", inspired by iterative AI workflows
## 📚 Additional Resources
- [Architecture Documentation](ARCHITECTURE.md)
- [Technical Specification](SPEC.md)
- [Development Roadmap](TODO.md)
- [Examples Repository](https://github.com/yourusername/ralph-examples)
## 📞 Support
- 📖 [Documentation](https://ralphloop.github.io/docs)
- 🐛 [Issue Tracker](https://github.com/yourusername/ralphloop/issues)
- 💬 [Discussions](https://github.com/yourusername/ralphloop/discussions)
- 📧 [Email Support](mailto:support@ralphloop.com)
---
<div align="center">
Made with ❤️ by the Ralphloop community
</div>