
Ralphloop


A powerful CLI tool for creating and running Ralph loops with adaptive LLM integration

🚀 Overview

Ralphloop is a Rust-based command-line tool for creating and executing Ralph loops: structured, iterative workflows that leverage Large Language Models (LLMs) to automate complex reasoning tasks. Whether you're doing code review, content creation, research, or problem-solving, Ralphloop provides a flexible framework for building repeatable AI-powered workflows.

✨ Key Features

  • 🤖 Multi-Provider Support: Works with OpenAI, Anthropic, Google Gemini, and local LLMs
  • 📝 Template System: Pre-built templates for common workflows
  • 🔧 Configuration Management: Secure, flexible configuration with multiple profiles
  • 🎯 Variable Management: Dynamic variable substitution and context passing
  • ⚡ Async Execution: Fast, non-blocking execution with proper error handling
  • 🛡️ Type Safety: Built with Rust for memory safety and performance
  • 📦 Easy Distribution: Single binary distribution with no runtime dependencies

📦 Installation

From Crates.io (Recommended)

cargo install ralphloop

From Source

git clone https://github.com/yingkitw/ralphloop.git
cd ralphloop
cargo build --release

The binary will be available at target/release/ralphloop.

🚀 Quick Start

1. Configure Your LLM Provider

First, set up your preferred LLM provider:

# OpenAI
ralph configure --provider openai --api-key your-openai-api-key --model gpt-3.5-turbo

# Anthropic Claude
ralph configure --provider anthropic --api-key your-anthropic-api-key --model claude-3-sonnet-20240229

# Local LLM (Ollama)
ralph configure --provider local --model llama2 --local-endpoint http://localhost:11434

2. Create Your First Ralph Loop

Create a loop from a template:

# Create a code review loop
ralph create my-code-review --template code-review

Or create a custom loop manually:

ralph create my-custom-loop

3. Run Your Loop

# Run once
ralph run my-code-review.json

# Run multiple iterations
ralph run my-code-review.json --iterations 5

📖 Usage Guide

Configuration

Setting Up Providers

Each LLM provider requires specific configuration:

OpenAI

ralph configure --provider openai --api-key sk-... --model gpt-4

Anthropic

ralph configure --provider anthropic --api-key sk-ant-... --model claude-3-opus-20240229

Google Gemini

ralph configure --provider gemini --api-key your-gemini-api-key --model gemini-pro

Local LLM (Ollama/Llama.cpp)

ralph configure --provider local --model llama2 --local-endpoint http://localhost:11434

Multiple Profiles

You can create multiple configuration profiles:

# Create a development profile
ralph configure --profile dev --provider openai --api-key dev-key --model gpt-3.5-turbo

# Use a specific profile
ralph --profile dev run my-loop.json

Loop Creation

Using Templates

Ralphloop ships with several built-in templates:

ralph create my-content --template content-creation
ralph create my-research --template research
ralph create my-story --template creative-writing
ralph create my-solution --template problem-solving

Available Templates

Template           Description                                            Use Case
code-review        Automated code analysis and improvement suggestions    Development workflow
content-creation   Structured content generation with refinement          Blog posts, documentation
research           Systematic information gathering and synthesis         Academic or market research
creative-writing   Story development with narrative structure             Creative projects
problem-solving    Structured approach to complex problems                Decision making

Custom Loop Structure

Ralph loops are defined in JSON format:

{
  "name": "my-loop",
  "description": "A sample Ralph loop",
  "steps": [
    {
      "name": "analyze",
      "prompt_template": "Analyze: {{input}}",
      "output_variable": "analysis",
      "depends_on": ["input"]
    },
    {
      "name": "summarize",
      "prompt_template": "Summarize: {{analysis}}",
      "output_variable": "summary",
      "depends_on": ["analysis"]
    }
  ],
  "context": {},
  "variables": {
    "input": "Your content here"
  }
}
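A loop file like this maps directly onto plain Rust data types. The definitions below are an illustrative, std-only sketch whose field names simply mirror the JSON keys above; the real types live in src/ralph_loop.rs and would typically derive serde's Deserialize for JSON parsing, so treat every name here as an assumption:

```rust
use std::collections::HashMap;

// Illustrative types mirroring the loop JSON schema above. In the real
// tool these would likely derive serde::Deserialize; all names here are
// assumptions, not ralphloop's actual API.
#[derive(Debug, Clone)]
pub struct Step {
    pub name: String,
    pub prompt_template: String,
    pub output_variable: String,
    pub depends_on: Vec<String>,
}

#[derive(Debug, Clone)]
pub struct RalphLoop {
    pub name: String,
    pub description: String,
    pub steps: Vec<Step>,
    pub context: HashMap<String, String>,
    pub variables: HashMap<String, String>,
}
```

Each step's output_variable is what later steps may list in depends_on, which is how outputs flow through the loop.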

Loop Execution

Basic Execution

ralph run my-loop.json

Advanced Options

# Run multiple iterations
ralph run my-loop.json --iterations 5

# Run with specific profile
ralph --profile production run my-loop.json

# Dry run (show prompts without executing)
ralph run my-loop.json --dry-run

Variable Substitution

Ralphloop supports dynamic variable substitution:

  • {{variable}}: Use variables from the variables section
  • {{context.key}}: Use context from previous step outputs
  • {{iteration}}: Current iteration number (for multi-iteration runs)
  • {{timestamp}}: Current timestamp
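The {{variable}} syntax is plain string templating. A minimal sketch of how such substitution can work (illustrative only, not ralphloop's actual implementation):

```rust
use std::collections::HashMap;

/// Replace every `{{name}}` placeholder in `template` with the matching
/// value from `vars`, leaving unknown placeholders untouched.
/// A simplified sketch of the substitution described above.
pub fn substitute(template: &str, vars: &HashMap<String, String>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        // format! doubles braces, so this builds the literal "{{key}}".
        out = out.replace(&format!("{{{{{}}}}}", key), value);
    }
    out
}
```

Built-ins like {{iteration}} and {{timestamp}} would simply be extra entries injected into the map before each step runs.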

๐Ÿ—๏ธ Architecture

Core Components

ralphloop/
├── src/
│   ├── main.rs          # CLI entry point and command handling
│   ├── ralph_loop.rs    # Core loop structure and execution engine
│   ├── llm.rs           # LLM provider abstractions and implementations
│   ├── config.rs        # Configuration management
│   └── templates.rs     # Built-in templates
└── examples/            # Example loops and configurations

Key Concepts

  1. Ralph Loop: A structured workflow with steps that execute sequentially based on dependencies
  2. Step: Individual unit of work that calls an LLM with a specific prompt
  3. Variable: Named values that can be passed between steps
  4. Template: Predefined loop structure for common use cases
  5. Provider: LLM service implementation (OpenAI, Anthropic, etc.)
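Point 1 above (steps execute sequentially based on dependencies) can be sketched as a small scheduler: a step becomes runnable once every variable it depends on is available, and running it makes its output variable available. This is an illustrative sketch under those assumptions, not ralphloop's actual engine:

```rust
use std::collections::HashSet;

/// Given steps as (output_variable, depends_on) pairs and the initially
/// available variables, return an order in which every step's dependencies
/// are satisfied before it runs. Panics on a cycle or a missing variable.
pub fn execution_order(
    steps: &[(String, Vec<String>)],
    initial: &[String],
) -> Vec<String> {
    let mut available: HashSet<String> = initial.iter().cloned().collect();
    let mut pending: Vec<(String, Vec<String>)> = steps.to_vec();
    let mut order = Vec::new();

    while !pending.is_empty() {
        // Pick the first step whose dependencies are all available.
        let idx = pending
            .iter()
            .position(|(_, deps)| deps.iter().all(|d| available.contains(d)))
            .expect("cyclic or unsatisfiable dependency");
        let (output, _) = pending.remove(idx);
        available.insert(output.clone()); // output becomes usable downstream
        order.push(output);
    }
    order
}
```

For the sample loop above, "analyze" runs before "summarize" because only "input" is available at the start.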

🧪 Examples

Code Review Loop

{
  "name": "code-review",
  "description": "Automated code review workflow",
  "steps": [
    {
      "name": "analyze_code",
      "prompt_template": "Analyze this code for security issues:\n\n{{code}}",
      "output_variable": "security_analysis",
      "depends_on": ["code"]
    },
    {
      "name": "suggest_improvements",
      "prompt_template": "Based on: {{security_analysis}}\n\nSuggest improvements.",
      "output_variable": "improvements",
      "depends_on": ["security_analysis"]
    }
  ],
  "variables": {
    "code": "fn main() { println!(\"Hello\"); }"
  }
}

Content Creation Loop

{
  "name": "blog-post",
  "description": "Blog post creation workflow",
  "steps": [
    {
      "name": "outline",
      "prompt_template": "Create outline for: {{topic}}\nTarget: {{audience}}",
      "output_variable": "outline",
      "depends_on": ["topic", "audience"]
    },
    {
      "name": "draft",
      "prompt_template": "Write draft based on: {{outline}}",
      "output_variable": "draft",
      "depends_on": ["outline"]
    },
    {
      "name": "refine",
      "prompt_template": "Improve this draft: {{draft}}",
      "output_variable": "final_post",
      "depends_on": ["draft"]
    }
  ],
  "variables": {
    "topic": "Benefits of Rust programming",
    "audience": "Software developers"
  }
}

🔧 Configuration

Configuration File Location

  • macOS: ~/Library/Application Support/com.ralph.ralphloop/config.json
  • Linux: ~/.local/share/com.ralph.ralphloop/config.json
  • Windows: %APPDATA%\com.ralph\ralphloop\config.json

Environment Variables

You can override configuration with environment variables:

export RALPH_API_KEY=your-api-key
export RALPH_MODEL=gpt-4
export RALPH_PROVIDER=openai
ralph run my-loop.json
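The precedence implied here is environment variable first, then the config file. A minimal sketch of that lookup, assuming this override order (the function name is illustrative, not ralphloop's API):

```rust
use std::env;

/// Return the environment override if the variable is set, otherwise
/// fall back to the value loaded from the config file. Illustrative
/// sketch of the precedence described above.
pub fn resolve_setting(env_var: &str, config_value: &str) -> String {
    env::var(env_var).unwrap_or_else(|_| config_value.to_string())
}
```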

🧪 Testing

Running Tests

# Run all tests
cargo test

# Run with coverage
cargo tarpaulin --out Html

# Run integration tests
cargo test --test integration

Mock Mode

For testing without API calls:

ralph configure --provider mock
ralph run my-loop.json  # Will use mock responses
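Mock mode follows naturally from the provider abstraction: every backend implements the same completion interface, and the mock backend returns canned text instead of calling a network API. Trait and method names below are illustrative assumptions, not the real definitions in src/llm.rs (which are likely async):

```rust
/// One interface for every backend (OpenAI, Anthropic, local, mock, ...).
/// Sketch only; the real trait is probably async and fallible.
pub trait LlmProvider {
    fn complete(&self, prompt: &str) -> String;
}

/// Mock backend: deterministic canned output, no network calls and no
/// API key, which makes loops testable end-to-end.
pub struct MockProvider;

impl LlmProvider for MockProvider {
    fn complete(&self, prompt: &str) -> String {
        format!("[mock response to a {}-char prompt]", prompt.len())
    }
}
```

Swapping providers then changes nothing else in the loop engine, which only ever sees the trait.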

๐Ÿค Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

git clone https://github.com/yourusername/ralphloop.git
cd ralphloop
cargo build
cargo test

Code Style

# Format code
cargo fmt

# Run clippy
cargo clippy -- -D warnings

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • The Rust community for excellent tooling and libraries
  • OpenAI, Anthropic, and other LLM providers for their APIs
  • The iterative AI workflows that inspired the Ralph loop concept


Made with ❤️ by the Ralphloop community

⭐ Star us on GitHub | 🐦 Follow on Twitter