claude-code-proxy 0.3.0


Note on Anthropic ToS: On Feb 19, 2026, Anthropic updated their Consumer ToS to ban extracting OAuth tokens for use in third-party tools. This proxy does not extract or forward tokens — it spawns claude --print as a subprocess, which is an officially supported programmatic use pattern (same as scripting, piping, or cron jobs). The CLI manages its own authentication internally. This tool is intended for personal, single-user, localhost automation only.

OpenAI-compatible API proxy for Claude Code CLI. Uses your authenticated Claude Code (Max subscription) for inference — no API keys needed.

Why

Claude Code CLI is powerful but only works in the terminal. This proxy exposes it as an OpenAI-compatible HTTP API, enabling:

  • OpenClaw to use Claude as its LLM backend
  • Cursor, Continue, and other AI coding tools to connect to Claude Code
  • Any OpenAI-compatible client to use your Max subscription

Architecture

Client (OpenClaw/Cursor/etc.)
    │
    ▼  POST /v1/chat/completions or /v1/responses
claude-code-proxy (this binary)
    │
    ▼  claude --print --model opus --output-format stream-json
Claude Code CLI (uses Max subscription)
    │
    ▼
Anthropic API
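The middle hop of the diagram is just argument construction: the proxy turns an HTTP request into the `claude --print` invocation shown above. A minimal sketch of that step — the function name and structure are illustrative, not the proxy's actual code:

```rust
// Illustrative sketch (hypothetical helper, not the real implementation):
// assemble the CLI arguments shown in the architecture diagram from the
// model selected by the incoming request.
fn build_cli_args(model: &str) -> Vec<String> {
    vec![
        "--print".to_string(),
        "--model".to_string(),
        model.to_string(),
        "--output-format".to_string(),
        "stream-json".to_string(),
    ]
}

fn main() {
    let args = build_cli_args("opus");
    // The proxy would hand these to the `claude` binary via a child
    // process (spawned with kill_on_drop, per the Features section).
    println!("claude {}", args.join(" "));
}
```

`--output-format stream-json` is what lets the proxy translate the CLI's incremental output into SSE chunks for streaming clients.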

Endpoints

Method  Path                   Description
GET     /health                Health check
GET     /v1/models             List available models
POST    /v1/chat/completions   Chat Completions API (streaming + non-streaming)
POST    /v1/responses          Responses API (streaming + non-streaming)

All endpoints are also available without the /v1 prefix.

Usage

PROXY_API_KEY=your-secret claude-code-proxy

Then point your client to http://localhost:8080/v1.

Environment Variables

Variable       Required  Default            Description
PROXY_API_KEY  Yes       —                  Bearer token for proxy authentication
PORT           No        8080               Listen port
CLAUDE_MODEL   No        sonnet             Default model (haiku, sonnet, opus)
RUST_LOG       No        claude_proxy=info  Log level
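The defaulting rules in the table can be sketched with a small stdlib-only helper; the helper name is hypothetical, not taken from the proxy's source:

```rust
use std::env;

// Hypothetical helper mirroring the table above: read an environment
// variable, falling back to the documented default when it is unset.
fn env_or(key: &str, default: &str) -> String {
    env::var(key).unwrap_or_else(|_| default.to_string())
}

fn main() {
    let port = env_or("PORT", "8080");
    let model = env_or("CLAUDE_MODEL", "sonnet");
    let log = env_or("RUST_LOG", "claude_proxy=info");
    // PROXY_API_KEY has no default; the real proxy refuses to serve
    // requests without it.
    let key_set = env::var("PROXY_API_KEY").is_ok();
    println!("port={port} model={model} log={log} key_set={key_set}");
}
```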

Examples

# Non-streaming
curl -H "Authorization: Bearer your-secret" \
     -H "Content-Type: application/json" \
     -d '{"model":"opus","messages":[{"role":"user","content":"hello"}]}' \
     http://localhost:8080/v1/chat/completions

# Streaming (SSE)
curl -H "Authorization: Bearer your-secret" \
     -H "Content-Type: application/json" \
     -d '{"model":"opus","messages":[{"role":"user","content":"hello"}],"stream":true}' \
     http://localhost:8080/v1/chat/completions

# Responses API
curl -H "Authorization: Bearer your-secret" \
     -H "Content-Type: application/json" \
     -d '{"model":"opus","input":"hello"}' \
     http://localhost:8080/v1/responses
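With "stream": true, the response arrives as Server-Sent Events: each chunk is a data: line carrying a JSON payload. A minimal sketch of consuming such a stream — the data: [DONE] terminator is the OpenAI convention, and treating it as the end-of-stream marker here is an assumption about this proxy:

```rust
// Sketch: pull the JSON payloads out of an OpenAI-style SSE body.
// Stops at the conventional "data: [DONE]" sentinel (assumed here).
fn sse_payloads(raw: &str) -> Vec<&str> {
    raw.lines()
        .filter_map(|line| line.strip_prefix("data: ")) // drop blank separator lines
        .take_while(|payload| *payload != "[DONE]")
        .collect()
}

fn main() {
    let stream = "data: {\"delta\":\"hel\"}\n\ndata: {\"delta\":\"lo\"}\n\ndata: [DONE]\n";
    let chunks = sse_payloads(stream);
    println!("{} chunks before [DONE]", chunks.len());
}
```

A real client would feed each payload to a JSON parser and concatenate the deltas; this only shows the framing.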

OpenClaw Configuration

{
  "models": {
    "providers": {
      "claude-code-proxy": {
        "api": "openai-completions",
        "baseUrl": "http://127.0.0.1:8080/v1",
        "apiKey": "your-secret",
        "models": [
          { "id": "opus", "name": "Claude Opus" }
        ]
      }
    }
  }
}

Features

  • Both Chat Completions and Responses API formats
  • Full SSE streaming on all endpoints
  • Content-block normalization (handles both [{"type":"text","text":"..."}] arrays and plain strings)
  • Constant-time auth comparison (subtle crate)
  • 1 MB request body limit
  • kill_on_drop child process management (no zombie processes)
  • Proper UTF-8 validation on CLI output
  • Model normalization (claude-sonnet-4.5 → sonnet)
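The model-normalization feature maps full model IDs onto the short aliases the CLI accepts. A sketch of one plausible matching rule — this is a guess at the behavior, not the proxy's code:

```rust
// Hypothetical sketch of model normalization: reduce a full model ID
// like "claude-sonnet-4.5" to the alias the CLI expects ("sonnet").
fn normalize_model(model: &str) -> &str {
    for alias in ["haiku", "sonnet", "opus"] {
        if model == alias || model.contains(alias) {
            return alias;
        }
    }
    model // pass unrecognized names through unchanged
}

fn main() {
    println!("{}", normalize_model("claude-sonnet-4.5")); // the README's example
    println!("{}", normalize_model("opus"));
}
```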

Development

# Enter devenv shell
direnv allow

# Build
dev-build

# Run
dev-run

# Test (44 tests including property-based)
dev-test

NixOS

This proxy is designed to run as a systemd user service via mynixos:

my.ai.claudeProxy = {
  enable = true;
  model = "opus";
};

License

Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA) 4.0 International

See LICENSE for details.