# prodex
One OpenAI profile pool for Codex CLI and Claude Code.
prodex gives you two entry points backed by the same OpenAI account pool:
| Use case | Command |
|---|---|
| Run Codex CLI through Prodex | `prodex` or `prodex run` |
| Run Claude Code through Prodex | `prodex claude` |
It keeps each profile isolated, checks quota before launch, and rotates to another ready account before a request or stream is committed.
Use `prodex` when Codex CLI is your front end. Use `prodex claude` when Claude Code is your front end. The account pool, profile isolation, quota checks, and continuation routing stay in Prodex either way.
## Requirements
- An OpenAI account, plus at least one logged-in Prodex profile
- Codex CLI if you want to use `prodex`
- Claude Code (`claude`) if you want to use `prodex claude`
If you install `@christiandoxa/prodex` from npm, the Codex runtime dependency is installed for you. Claude Code is still a separate CLI and should already be available on your `PATH` when you use `prodex claude`.
## Install
Install from npm:
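A minimal sketch of a global install, using the `@christiandoxa/prodex` package name mentioned below (the exact command form is an assumption):

```shell
# Install the prodex binary globally from npm
npm install -g @christiandoxa/prodex
```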
Or install from crates.io:
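A sketch of the crates.io route, assuming the crate is published under the name `prodex` (this builds from source and requires a Rust toolchain):

```shell
# Install from crates.io with Cargo
cargo install prodex
```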
The npm package version is kept in lockstep with the published crate version.
## Update
Check your installed version:
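For example (assuming the binary supports the conventional `--version` flag):

```shell
prodex --version
```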
The current local version in this repo is 0.2.109.
If you want to switch from a Cargo-installed binary to npm:
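A sketch of the switch, assuming the crate and npm package names used elsewhere in this README:

```shell
# Remove the Cargo-installed binary, then install the npm package
cargo uninstall prodex
npm install -g @christiandoxa/prodex
```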
## Start
Import your current login:
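For example (the `import` subcommand name is an assumption, not confirmed by this README):

```shell
# Import the existing Codex CLI login as a prodex profile
prodex import
```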
Or create a profile through the normal Codex login flow:
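For example (the `login` subcommand name is an assumption, not confirmed by this README):

```shell
# Create a new profile via the standard Codex login flow
prodex login
```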
Check the pool:
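For example (the `status` subcommand name is an assumption, not confirmed by this README):

```shell
# List profiles and their quota state
prodex status
```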
## Use `prodex` for Codex CLI
`prodex` without a subcommand is shorthand for `prodex run`.
Use this path when you want Codex CLI itself to be the front end. Prodex handles profile selection, quota preflight, continuation affinity, and safe pre-commit rotation across your OpenAI-backed profiles.
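Both invocations below launch Codex CLI through the pool; they are equivalent per the shorthand noted above:

```shell
prodex        # shorthand
prodex run    # explicit form
```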
## Use `prodex claude` for Claude Code
Use this path when you want Claude Code to be the front end while Prodex still routes requests through the same OpenAI-backed profile pool.
- `prodex claude` runs Claude Code through a local Anthropic-compatible proxy
- Claude Code state is isolated per profile in `CLAUDE_CONFIG_DIR`
- the initial Claude model follows the shared Codex `config.toml` model when available
- Claude's `opus`, `sonnet`, and `haiku` picker entries are pinned to representative GPT models
- Claude `max` effort maps to OpenAI `xhigh` when the selected GPT model supports it
- Claude Code itself only exposes built-in aliases plus one custom model option on third-party providers
- use `PRODEX_CLAUDE_BIN` if `claude` is not on `PATH`
- use `PRODEX_CLAUDE_MODEL` to force a specific upstream Responses model
- use `PRODEX_CLAUDE_REASONING_EFFORT` to force the upstream reasoning tier
Example:

```shell
PRODEX_CLAUDE_MODEL=gpt-5.2 PRODEX_CLAUDE_REASONING_EFFORT=xhigh prodex claude
```
## Common Commands
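A quick recap of the entry points covered in this README (subcommands beyond these are not listed here):

```shell
prodex           # run Codex CLI through the profile pool (alias for `prodex run`)
prodex run       # same, explicit form
prodex claude    # run Claude Code through the profile pool
```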
## More
For a slightly longer walkthrough, see QUICKSTART.md.