copex 1.0.6

Copex - Copilot Extended

A fast, native CLI for GitHub Copilot with extended features: plan-and-execute workflows, Ralph Wiggum loops, and tool execution.

Features

  • 🚀 Native Performance - Written in Rust for speed and reliability
  • 💬 Chat Mode - Interactive conversations with AI models
  • 📋 Plan Mode - Multi-step task planning with automatic execution
  • 🔁 Ralph Mode - Iterative AI loops that keep trying until success
  • 🛠️ Tool Execution - Create files, view content, run bash commands
  • 🎨 Rich UI - Syntax highlighting, streaming output, themes
  • 💾 Session Management - Save and restore conversation history
  • 🔌 MCP Support - Connect to external tool servers

Installation

From crates.io:

cargo install copex

From source:

git clone https://github.com/Arthur742Ramos/copex
cd copex
cargo build --release
./target/release/copex --version

Prerequisites

  • A Rust toolchain with cargo (for installing from crates.io or building from source)
  • A GitHub account with an active GitHub Copilot subscription

Quick Start

Chat - Simple Conversations

# Quick question
copex chat "Explain async/await in Rust"

# With specific model
copex chat "Write a binary search" --model gpt-5.2-codex

# With reasoning
copex chat "Complex problem" --model claude-opus-4.5 --reasoning high

Plan - Multi-Step Workflows

# Generate and execute a plan
copex plan "Build a REST API with tests" --execute

# Review plan before executing
copex plan "Refactor auth module" --review

# Save plan for later
copex plan "Large feature" --output plan.json

# Resume from step 3
copex plan --load plan.json --execute --from-step 3
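The saved plan is a JSON file whose exact schema is defined by copex and not documented in this README. A purely hypothetical illustration of the kind of structure such a file might hold (field names here are invented for illustration, not the real format):

```json
{
  "goal": "Large feature",
  "steps": [
    { "id": 1, "description": "Scaffold the module", "status": "done" },
    { "id": 2, "description": "Implement core logic", "status": "done" },
    { "id": 3, "description": "Write tests", "status": "pending" }
  ]
}
```

Inspect the actual file produced by --output before relying on any particular field.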

Ralph - Iterative Until Success

# Keep trying until all tests pass
copex ralph "Fix the failing tests" --max-iterations 30

# With completion promise
copex ralph "Build feature X" --promise "ALL TESTS PASSING"
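Conceptually, Ralph mode wraps the model call in a retry loop that stops as soon as the output contains the completion promise, or gives up at the iteration cap. A minimal shell sketch of that loop; the run_agent function below is a hypothetical stub standing in for a real model invocation:

```shell
#!/bin/sh
# Hypothetical sketch of a Ralph-style loop: keep invoking an agent
# until its output contains the promise string or we run out of tries.
attempt=0
max_iterations=5
promise="ALL TESTS PASSING"

run_agent() {
  # Stub standing in for a real model/tool call; "succeeds" on try 3.
  attempt=$((attempt + 1))
  if [ "$attempt" -ge 3 ]; then
    output="$promise"
  else
    output="still failing"
  fi
}

while [ "$attempt" -lt "$max_iterations" ]; do
  run_agent
  case "$output" in
    *"$promise"*) echo "done after $attempt iterations"; break ;;
  esac
done
echo "$output"
```

The real implementation feeds each iteration's results back into the next prompt; the sketch only shows the stop condition.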

Interactive Mode

copex interactive

Session Management

# List sessions
copex session list

# Save current session
copex session save my-project

# Resume session
copex session load my-project

Configuration

Create ~/.config/copex/config.toml:

default_model = "gpt-5.2-codex"
default_reasoning = "high"
ui_theme = "monokai"

[retry]
max_retries = 5
base_delay_ms = 1000
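With these settings, a failed request is retried up to max_retries times with a delay derived from base_delay_ms. Assuming a simple doubling (exponential) backoff, which this README does not confirm, the wait schedule would look like:

```shell
#!/bin/sh
# Illustrative only: an exponential backoff schedule assuming the delay
# doubles on each retry, starting from base_delay_ms.
base_delay_ms=1000
max_retries=5
delay=$base_delay_ms
schedule=""
i=1
while [ "$i" -le "$max_retries" ]; do
  schedule="$schedule $delay"
  delay=$((delay * 2))
  i=$((i + 1))
done
echo "delays (ms):$schedule"
```

Under that assumption, five retries span roughly 31 seconds of cumulative waiting before copex gives up.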

Models

# List available models
copex models

# Use specific model
copex chat "..." --model claude-opus-4.5
copex chat "..." --model gpt-5.2-codex
copex chat "..." --model gemini-3-pro-preview

Reasoning Levels

  • none - No extended thinking
  • low - Light reasoning
  • medium - Balanced (default)
  • high - Deep reasoning
  • xhigh - Maximum reasoning (supported by some models)

Migration from Python

This is a complete rewrite of copex from Python to Rust. All features of the Python version are preserved, with improved performance.

The Python version is archived in archive/python-v1.1.0/.

License

MIT