# codexia

Rust gateway that logs in with OpenAI Codex OAuth and exposes OpenAI- and Anthropic-compatible APIs.
## Usage
`login` prints the Codex OAuth URL. Complete the login in a browser, then paste
the full redirected URL from the browser address bar, for example
`http://localhost:1455/auth/callback?code=...&state=...`. This matches
OpenClaw's remote/headless fallback and does not require the gateway host to be
reachable from the public internet.
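A headless session, sketched under the assumption that the subcommand is invoked as `codexia login`:

```shell
codexia login
# -> prints an OAuth URL; open it in any browser, authorize, then paste
#    the full redirect URL from the address bar back into the prompt:
#    http://localhost:1455/auth/callback?code=...&state=...
```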
OpenAI-compatible chat request:
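A minimal sketch, assuming the gateway listens on `127.0.0.1:14550` with local key `local-secret` (the values used in the Claude Code example below) and that the OpenAI side follows the standard `/v1/chat/completions` path:

```shell
curl http://127.0.0.1:14550/v1/chat/completions \
  -H "Authorization: Bearer local-secret" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.5",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```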
Anthropic-compatible Messages request:
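A matching sketch against the documented `/v1/messages` endpoint, using the same assumed address and key; the `anthropic-version` header is what Anthropic clients normally send, though whether Codexia requires it is not stated:

```shell
curl http://127.0.0.1:14550/v1/messages \
  -H "x-api-key: local-secret" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.5",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```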
Claude Code / Anthropic SDK setup:

`ANTHROPIC_BASE_URL` should point at the Codexia server root, not `/v1`,
because Anthropic clients append `/v1/messages` themselves. Streaming emits
Anthropic-style `message_delta` `stop_reason` and cumulative
`usage.output_tokens`.
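In shell form, reusing the address and key from the example below:

```shell
# Server root, not /v1 -- Anthropic clients append /v1/messages themselves.
export ANTHROPIC_BASE_URL=http://127.0.0.1:14550
# The local gateway key set via --api-key, not an upstream OAuth token.
export ANTHROPIC_AUTH_TOKEN=local-secret
```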
Full Claude Code flow:

- Log in and save Codex OAuth credentials:
- Start Codexia with a supported model list and a local API key:
- Point Claude Code at the local gateway:
- Run Claude Code against a model that Codexia exposes:
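The steps above can be sketched end to end; `--api-key` is documented in the pitfalls below, but the `--models` flag name here is an assumption:

```shell
# 1. Log in and save Codex OAuth credentials.
codexia login

# 2. Start Codexia with a local API key and a supported model list.
codexia serve --api-key local-secret --models gpt-5.5

# 3. Point Claude Code at the local gateway (server root, no /v1).
export ANTHROPIC_BASE_URL=http://127.0.0.1:14550
export ANTHROPIC_AUTH_TOKEN=local-secret

# 4. Run Claude Code against a model Codexia exposes.
claude --model gpt-5.5
```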
For non-interactive validation, this works:

```shell
ANTHROPIC_BASE_URL=http://127.0.0.1:14550 \
ANTHROPIC_AUTH_TOKEN=local-secret \
claude -p "hello" --model gpt-5.5  # final line reconstructed; -p runs Claude Code non-interactively
```
Common pitfalls:

- Do not set `ANTHROPIC_BASE_URL` to `http://127.0.0.1:14550/v1`; Claude Code appends `/v1/messages` itself.
- Use a model that `/v1/models` actually returns, such as `gpt-5.5`. If Claude Code defaults to `claude-sonnet-*`, the request will fail because Codexia proxies Codex models, not Anthropic-hosted model IDs.
- `ANTHROPIC_AUTH_TOKEN` is only the local gateway key configured with `--api-key`; it is not your upstream OpenAI/Codex OAuth token.
- If you prefer a background service, install the daemon first and then point `ANTHROPIC_BASE_URL` at the daemon address instead of running `codexia serve` manually.
Optional local API key protection:

`CODEXIA_API_KEY=local-secret`
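If the environment variable is honored as a fallback for `--api-key` (an assumption; only the variable name appears above), startup can look like:

```shell
CODEXIA_API_KEY=local-secret codexia serve
```

Clients then authenticate with `Authorization: Bearer local-secret` or `x-api-key: local-secret`.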
Interactive runtime configuration:
The config file is stored at `~/.codexia/config.json` by default and is used as
the fallback source for `codexia serve` and `codexia daemon install`.
Manually refresh the stored Codex OAuth token while the server is running:
Check token expiry, account metadata, and available rate-limit windows:
Fetch the same status data over HTTP:
Example response:
Install Codexia as a per-user background daemon:
On macOS, Codexia installs a LaunchAgent at
`~/Library/LaunchAgents/com.codexia.daemon.plist`. On Linux, it installs a
systemd user unit at `~/.config/systemd/user/codexia.service`.
The daemon runs `codexia serve` with the options passed at install time:
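An install sketch; `codexia daemon install` and `--api-key` are documented above, but any further flags would be assumptions:

```shell
codexia daemon install --api-key local-secret

# On Linux, inspect the installed user unit:
systemctl --user status codexia.service
```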
Models returned by `/v1/models` default to OpenClaw's openai-codex registry:

- `gpt-5.1`
- `gpt-5.1-codex-max`
- `gpt-5.1-codex-mini`
- `gpt-5.2`
- `gpt-5.2-codex`
- `gpt-5.3-codex`
- `gpt-5.3-codex-spark`
- `gpt-5.4`
- `gpt-5.4-mini`
- `gpt-5.5`
- `gpt-5.5-mini`
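To check what a running gateway actually advertises (reusing the example address and key; the response shape is assumed to follow OpenAI's model-list format):

```shell
curl http://127.0.0.1:14550/v1/models \
  -H "Authorization: Bearer local-secret"
```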
Credentials are stored at `~/.codexia/auth.json` by default. Override with
`--auth-file`, `CODEXIA_AUTH_FILE`, or `CODEXIA_HOME`.
Anthropic compatibility currently covers:

- `POST /v1/messages`
- `POST /v1/messages/count_tokens`
- `x-api-key` or `authorization: Bearer ...` local auth
- Anthropic-style SSE events for streaming text and tool use
The implementation intentionally follows Ollama's compatibility strategy: Anthropic headers are accepted, locally configured auth is enforced, and unsupported advanced Anthropic-only features are ignored rather than rejected where possible.
The OAuth flow follows OpenClaw/pi-ai's Codex flow: PKCE, manual paste of the
`http://localhost:1455/auth/callback?...` redirect URL, token exchange at
`https://auth.openai.com/oauth/token`, and Codex requests to
`https://chatgpt.com/backend-api/codex/responses`.
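The token-exchange step can be sketched as a plain POST; the parameter names follow the standard OAuth 2.0 + PKCE shape (RFC 7636), and the `client_id` value here is a placeholder, not Codexia's actual registration:

```shell
curl https://auth.openai.com/oauth/token \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d grant_type=authorization_code \
  -d code=CODE_FROM_PASTED_REDIRECT_URL \
  -d redirect_uri=http://localhost:1455/auth/callback \
  -d client_id=CLIENT_ID \
  -d code_verifier=PKCE_CODE_VERIFIER
```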
## Disclaimer
Codexia is an unofficial tool and is not affiliated with, endorsed by, or supported by OpenAI. Use it at your own risk and make sure your usage complies with the terms that apply to your account and the upstream services.
## License
Copyright (c) 2026 Codexia contributors.
Licensed under the GNU Lesser General Public License v3.0 only. See LICENSE.