Crate agcodex_core

Root of the agcodex-core library.

Modules§

code_tools
Unified code tools scaffolding for AGCodex.
codex
config
config_profile
config_types
Types used to define the fields of crate::config::Config.
context_engine
AGCodex Context Engine scaffolding.
conversation
Conversation management utilities.
embeddings
Independent embeddings system - completely separate from chat/LLM models.
embeddings_capability
Capability checks for embeddings providers. Per policy, OpenAI embeddings require an OpenAI API key; a ChatGPT login (web tokens) is not sufficient and does not grant embeddings access. A sketch of this policy follows the module list.
error
exec
exec_env
git_info
landlock
model_family
models
modes
Operating modes for AGCodex (Plan, Build, Review). Minimal scaffolding to start the refactor without impacting existing flows.
parse_command
plan_tool
protocol
Defines the protocol for a Codex session between a client and an agent.
protocol_config_types
seatbelt
shell
spawn
subagents
Subagent system for AGCodex.
tools
Internal agent tools for AGCodex.
turn_diff_tracker
user_agent
util
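
The embeddings_capability policy above lends itself to a small illustration. The sketch below is purely hypothetical: AuthKind and has_embeddings_access are illustrative names, not the module's actual API; the only facts taken from the description are that an OpenAI API key grants embeddings access while a ChatGPT login (web tokens) does not.

```rust
// Hypothetical sketch of the embeddings_capability policy described above.
// AuthKind and has_embeddings_access are illustrative names, not the module's
// real API; only the policy itself comes from the module description.

enum AuthKind {
    /// A standalone OpenAI API key (e.g. from OPENAI_API_KEY).
    OpenAiApiKey(String),
    /// Web tokens obtained through a ChatGPT login.
    ChatGptLogin,
    /// No credentials configured at all.
    None,
}

/// Per the stated policy, only an OpenAI API key grants embeddings access.
fn has_embeddings_access(auth: &AuthKind) -> bool {
    matches!(auth, AuthKind::OpenAiApiKey(key) if !key.is_empty())
}

fn main() {
    assert!(has_embeddings_access(&AuthKind::OpenAiApiKey("sk-example".into())));
    assert!(!has_embeddings_access(&AuthKind::ChatGptLogin));
    assert!(!has_embeddings_access(&AuthKind::None));
    println!("embeddings capability policy holds");
}
```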

Structs§

CodexConversation
ConversationManager
ConversationManager is responsible for creating conversations and maintaining them in memory; a usage sketch follows this list.
ModelProviderInfo
Serializable representation of a provider definition.
NewConversation
Represents a newly created Codex conversation, including the first event (which is EventMsg::SessionConfigured).
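
As noted above, here is a minimal sketch of how ConversationManager, NewConversation, and CodexConversation relate. The types below are simplified stand-ins defined inline, not the crate's real definitions (which are richer and asynchronous); the method name new_conversation and the field names are assumptions. The facts taken from the descriptions are that the manager creates conversations and holds them in memory, and that a new conversation carries a first EventMsg::SessionConfigured event.

```rust
// Simplified stand-ins that mirror the relationships described in this index;
// the crate's real types and method signatures may differ.

#[derive(Debug)]
enum EventMsg {
    /// Per the NewConversation description, the first event of every
    /// newly created conversation.
    SessionConfigured,
}

/// Stand-in for a live conversation handle.
#[derive(Debug, Default)]
struct CodexConversation;

/// A newly created conversation together with its first event.
#[derive(Debug)]
struct NewConversation {
    conversation: CodexConversation,
    session_configured: EventMsg,
}

/// Creates conversations and keeps them in memory, per its description above.
#[derive(Default)]
struct ConversationManager {
    conversations: Vec<CodexConversation>,
}

impl ConversationManager {
    /// Hypothetical method name; the real crate's constructor is likely
    /// async and configuration-driven.
    fn new_conversation(&mut self) -> NewConversation {
        self.conversations.push(CodexConversation);
        NewConversation {
            conversation: CodexConversation,
            session_configured: EventMsg::SessionConfigured,
        }
    }
}

fn main() {
    let mut manager = ConversationManager::default();
    let NewConversation { conversation, session_configured } = manager.new_conversation();
    println!("first event: {:?}", session_configured);
    println!("conversation handle: {:?}", conversation);
    println!("conversations held in memory: {}", manager.conversations.len());
}
```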

Enums§

WireApi
Wire protocol that the provider speaks. Most third-party services only implement the classic OpenAI Chat Completions JSON schema, whereas OpenAI itself (and a handful of others) additionally expose the more modern Responses API. The two protocols use different request/response shapes and cannot be auto-detected at runtime, so each provider entry must declare which one it expects.
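
To make the WireApi distinction concrete, here is a self-contained sketch. The enum below is a stand-in (the crate's real variant names may differ), and the endpoint paths are the public OpenAI paths for the two protocols, shown only to illustrate why a provider entry has to declare which schema it speaks.

```rust
/// Stand-in for the crate's WireApi enum; the real variant names may differ.
#[derive(Debug, Clone, Copy)]
enum WireApi {
    /// Classic OpenAI Chat Completions JSON schema (most third-party services).
    Chat,
    /// The more modern Responses API (OpenAI and a handful of others).
    Responses,
}

/// Because the two protocols cannot be auto-detected at runtime, each provider
/// entry declares which one it speaks, and callers branch on it explicitly.
fn endpoint_path(api: WireApi) -> &'static str {
    match api {
        WireApi::Chat => "/v1/chat/completions",
        WireApi::Responses => "/v1/responses",
    }
}

fn main() {
    for api in [WireApi::Chat, WireApi::Responses] {
        println!("{api:?} -> {}", endpoint_path(api));
    }
}
```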

Constants§

BUILT_IN_OSS_MODEL_PROVIDER_ID
CODEX_APPLY_PATCH_ARG1

Functions§

built_in_model_providers
Built-in default provider list (usage sketch below).
create_oss_provider_with_base_url
get_platform_sandbox
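
A minimal usage sketch for built_in_model_providers, assuming agcodex_core is added as a dependency. The return type is assumed to be a map from provider id to ModelProviderInfo; check the function's own page for the exact signature.

```rust
// A minimal sketch, assuming agcodex_core is a dependency and that
// built_in_model_providers() returns a map keyed by provider id (an
// assumption; consult the function's documentation for the real signature).
use agcodex_core::built_in_model_providers;

fn main() {
    for (id, _provider) in built_in_model_providers() {
        // Each value is a ModelProviderInfo, the serializable provider
        // definition listed under Structs above.
        println!("built-in provider: {id}");
    }
}
```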