# AI Contexters
Operator front door for agent session logs.
`aicx` is store-first:

- Canonical corpus (`~/.aicx/`) — extract, deduplicate, chunk, and store agent session logs as steerable markdown with frontmatter metadata. This is ground truth. Built by the extractors (`claude`, `codex`, `all`) and `store`.
- Retrieval surfaces — filesystem search, steering metadata, MCP tools, and a reusable native embedding library. AICX stays portable; heavy retrieval belongs to Roost/rust-memex.
aicx owns the canonical corpus and the portable local embedding foundation.
Roost/rust-memex owns the advanced retrieval/operator plane.
The operator-facing oracle contract is in `docs/ORACLE_CORPUS.md`: raw source logs and the canonical corpus are truth; indexes are derived, rebuildable views that must disclose fallback and Loctree scope safety.
Supported sources:

- Claude Code: `~/.claude/projects/*/*.jsonl`
- Codex: `~/.codex/history.jsonl`
- Gemini CLI: `~/.gemini/tmp/<hash>/chats/session-*.json`
- Gemini Antigravity direct extract: `~/.gemini/antigravity/conversations/<uuid>.pbor` and `~/.gemini/antigravity/brain/<uuid>/`
## Install
Public install from GitHub Releases:
```sh
AICX_INSTALL_MODE=release AICX_RELEASE_TAG=v0.6.5
```
The installer downloads the matching release archive, verifies its adjacent
`.sha256` sidecar, and installs both shipped commands, `aicx` and `aicx-mcp`.
From a local checkout:
`install.sh` installs `aicx` + `aicx-mcp` from the current checkout and configures Claude Code, Codex, and Gemini when their MCP settings directories already exist.
From a release bundle:
Bundle install copies prebuilt aicx + aicx-mcp into ~/.local/bin, removes stale user-local / cargo-installed copies, then refreshes MCP configuration.
No Rust toolchain and no local memex compilation are required on the target machine.
From an existing checkout, you can force the same release path:
```sh
AICX_INSTALL_MODE=release
AICX_INSTALL_MODE=release AICX_RELEASE_TAG=v0.6.6
```
Current release assets are slim unsigned .tar.gz bundles for macOS arm64,
Linux x64 GNU, and Linux arm64 GNU. The .sha256 sidecar is mandatory.
The npm wrapper track exists under distribution/npm/, but it is not the
supported v0.6.5 install path until its platform packages are aligned with the
current GitHub Release asset shape.
From an accessible GitHub repo when you want unreleased source:
Already installed the binaries?
Manual fallback:
install.sh prefers a colocated release bundle first, then a local checkout, and otherwise falls back to the published install path.
Maintainer-local signed bundle path on macOS:
That release path cleans `target/<triple>` after the bundle is safely written, so
the self-hosted box does not keep accumulating old release artifacts. Use `CLEAN=0`
when you explicitly want to keep the local build outputs.
Native embedder profiles:

- `base` — F2LLM-v2 0.6B Q4_K_M GGUF, 1024 dims, about 397 MB
- `dev` — F2LLM-v2 1.7B Q4_K_M GGUF, 2048 dims, about 1.1 GB
- `premium` — F2LLM-v2 1.7B Q6_K GGUF, 2048 dims, about 1.4 GB

Picker during install:

```sh
bash install.sh --pick-embedder
```
Config truth:

- AICX native embedder preferences live in `~/.aicx/embedder.toml` or `AICX_EMBEDDER_CONFIG`.
- The picker writes `backend = "gguf"`, `profile`, `repo`, and the exact `filename`; model hydration is explicit.
- Roost/rust-memex retrieval config remains separate, usually `~/.rmcp-servers/rust-memex/config.toml` or `RUST_MEMEX_CONFIG`.
- Current public release bundles stay slim; they do not auto-bundle model weights.
## Quickstart
`aicx wizard` opens the interactive entrypoint for browsing the corpus,
running doctor, watching store progress, and viewing the intents timeline.
Press `?` inside the wizard for the keymap.
## Library API

`aicx` is also a Rust library. The supported facade is `aicx::Aicx`;
callers do not need to import command-line internals from `main.rs`.
```rust
use aicx::Aicx;

let client = Aicx::from_env()?;
let status = client.index_status()?;
let chunks = client.list_chunks()?;
```
For embedding AICX into another service, construct a handle with an explicit store root and write timeline entries directly:
```rust
use aicx::Aicx;

// `store_root` and `entries` are placeholders here; see the crate docs for exact signatures.
let client = Aicx::with_store_root(store_root);
let summary = client.store_entries(entries)?;
```
## Layer 1 — build the canonical corpus
Extract the last 4 hours into `~/.aicx/`. Extractors are quiet on stdout by default (`--emit none`).
`-p`/`--project` on extractors and `store` narrows source session discovery before
repo segmentation. One run can still resolve into multiple canonical repo buckets
or `non-repository-contexts`; `--emit json` makes that explicit through
`requested_source_filters` and `resolved_store_buckets`.
See what landed:
Surface contract:

- `aicx refs` is the active CLI inventory command for canonical chunks.
- `aicx read` is the direct re-entry command for paths returned by `refs`, `search`, dashboard chunk APIs, or MCP retrieval tools.
- There is currently no `aicx rank` CLI subcommand; ranking stays on the MCP surface as `aicx_rank`.
- `aicx init` is retired; framework bootstrap now lives in `/vc-init`.
## Native local embeddings
AICX ships the reusable `aicx-embeddings` library behind the
`native-embedder` feature. The installed bundle does not carry model weights;
the picker can write config and optionally hydrate exactly one GGUF file:
The config file is plain TOML:

```toml
[]
backend = "gguf"
profile = "base"
repo = "mradermacher/F2LLM-v2-0.6B-GGUF"
filename = "F2LLM-v2-0.6B.Q4_K_M.gguf"
= false
= 512
```
Pipe one JSON payload (handy for automation):
## What Gets Written Where
Layer 1 — canonical store (extractors, `store`):

- `~/.aicx/store/<organization>/<repository>/<YYYY_MMDD>/<kind>/<agent>/<YYYY_MMDD>_<agent>_<session-id>_<chunk>.md`
- `~/.aicx/non-repository-contexts/<YYYY_MMDD>/<kind>/<agent>/<YYYY_MMDD>_<agent>_<session-id>_<chunk>.md`
- `~/.aicx/index.json`
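Because the chunk filename layout is mechanical, the relative path of a chunk can be derived from its parts. A minimal Rust sketch with illustrative values (this helper is not part of the aicx API):

```rust
// Compose a canonical chunk path per the documented layout:
// store/<organization>/<repository>/<YYYY_MMDD>/<kind>/<agent>/<YYYY_MMDD>_<agent>_<session-id>_<chunk>.md
fn chunk_path(org: &str, repo: &str, day: &str, kind: &str,
              agent: &str, session: &str, chunk: &str) -> String {
    format!("store/{org}/{repo}/{day}/{kind}/{agent}/{day}_{agent}_{session}_{chunk}.md")
}

fn main() {
    println!(
        "{}",
        chunk_path("VetCoders", "ai-contexters", "2026_0406",
                   "reports", "codex", "important", "001")
    );
}
```

With these example values the result is `store/VetCoders/ai-contexters/2026_0406/reports/codex/2026_0406_codex_important_001.md`, the same shape the `.aicxignore` examples target.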
Native embedder config:

- `~/.aicx/embedder.toml` — local GGUF backend/profile/repo/filename/path preference
- HuggingFace cache snapshots under `~/.cache/huggingface/hub/`
Framework-owned repo-local context artifacts (not written by the aicx CLI itself):
- `.ai-context/share/artifacts/SUMMARY.md`
- `.ai-context/share/artifacts/TIMELINE.md`
- `.ai-context/share/artifacts/TRIAGE.md`
Store ignore contract:
- Optional `~/.aicx/.aicxignore` excludes matching canonical chunk paths from steer indexing and downstream retrieval materialization.
- Patterns are matched relative to `~/.aicx/` using glob syntax, for example:

```text
store/VetCoders/ai-contexters/**/reports/**
!store/VetCoders/ai-contexters/**/reports/2026_0406_codex_important_001.md
```
## Common Workflows
Daily “what changed?” with incremental refresh plus compact summary:
Full-window backfill (ignore the stored watermark explicitly):
User-only mode (smaller output; excludes assistant + reasoning):
Steering retrieval (filter chunks by frontmatter metadata):
Direct chunk re-entry:
`aicx search --json`, `aicx steer --json`, `aicx intents --emit json`, and the
matching MCP tools include `oracle_status`. Fuzzy search is visibly marked as a
filesystem fallback, not semantic oracle readiness.
Native embedder hydration — picking the local model without bloating the bundle:
Heavy retrieval lives outside this CLI surface:
- Use Roost/rust-memex for advanced retrieval pipelines, provider routing, and operator-scale indexing.
- Keep Roost/rust-memex settings in its own config plane (`RUST_MEMEX_CONFIG`, usually `~/.rmcp-servers/rust-memex/config.toml`).
- Do not put the heavy retrieval provider config into `~/.aicx/embedder.toml`; that file governs only AICX local embeddings.
Example Roost/rust-memex provider config:
```toml
[]
= 2560
[[]]
= "mlx-local"
= "http://127.0.0.1:1234"
= "qwen3-embedding-4b"
= 1
```
Single-session Gemini Antigravity extract (conversation artifacts first, explicit step-output fallback):
Review Vibecrafted workflow and marbles artifacts as a standalone dossier:
The generated HTML embeds the selected slice directly and can also import/export compatible JSON bundles client-side, so you can merge multiple workflow slices without standing up a server.
Local browsing now shares one surface:
```sh
# Static HTML artifact (default output: ~/.aicx/aicx-dashboard.html)
# Live local server
# Remote / Tailscale server with explicit CORS policy
```
The `tailscale` CORS preset accepts both tailnet IP origins and MagicDNS browser origins (`*.ts.net`).
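As a sketch of what such an origin check amounts to (illustrative only, not the actual aicx implementation; the one assumption beyond the text above is that tailnet IPv4 addresses come from Tailscale's CGNAT range, 100.64.0.0/10):

```rust
// Illustrative origin check for a "tailscale"-style CORS preset:
// accept MagicDNS browser origins (*.ts.net) and tailnet IPv4 origins.
fn is_tailscale_origin(origin: &str) -> bool {
    // Strip the scheme, then keep only the host (drop port / path).
    let host = origin
        .strip_prefix("https://")
        .or_else(|| origin.strip_prefix("http://"))
        .unwrap_or(origin)
        .split(|c: char| c == ':' || c == '/')
        .next()
        .unwrap_or("");
    if host.ends_with(".ts.net") {
        return true;
    }
    // Tailscale assigns tailnet IPv4 addresses from 100.64.0.0/10.
    let octets: Vec<u8> = host.split('.').filter_map(|p| p.parse().ok()).collect();
    matches!(octets.as_slice(), [a, b, _, _] if *a == 100 && *b >= 64 && *b < 128)
}

fn main() {
    assert!(is_tailscale_origin("https://mybox.tailnet-example.ts.net"));
    assert!(is_tailscale_origin("http://100.101.102.103:8080"));
    assert!(!is_tailscale_origin("https://example.com"));
    println!("ok");
}
```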
## Intent Taxonomy
The intent engine classifies stored chunks into 9 semantic types with typed link relations:
| Type | What it captures | Initial state |
|---|---|---|
| `intent` | User-expressed goal or proposal | proposed |
| `why` | Motivation behind a decision | active |
| `argue` | Multi-voice disagreement or trade-off | active |
| `decision` | Crystallized choice from discussion | active |
| `assumption` | Hypothesis treated as true until verified | proposed |
| `outcome` | Broad result of an action | done |
| `result` | Concrete measurable data point | done |
| `question` | Open knowledge gap | proposed |
| `insight` | Reframe backed by research evidence | active |
Link types: `derived_from`, `supersedes`, `verifies`, `contradicts`, `supports`, `results_in`, `answers`, `links_to`.
State transitions: Proposed → Active → Done/Superseded/Contradicted. Session-level post-processing detects unresolved intents (no outcome after 7 days), supersedes chains (newer entry on same topic), contradicted assumptions (result + failure signal), and insight sourcing (DerivedFrom links to research chunks).
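Read as a state machine, the lifecycle above can be sketched like this (illustrative types and event names, not the actual aicx internals):

```rust
// Illustrative sketch of the documented intent lifecycle:
// Proposed -> Active -> Done / Superseded / Contradicted.
#[derive(Clone, Copy, Debug, PartialEq)]
enum IntentState {
    Proposed,
    Active,
    Done,
    Superseded,
    Contradicted,
}

#[derive(Clone, Copy, Debug)]
enum Event {
    Activate,   // a proposed intent picks up work
    Complete,   // an outcome/result closes it
    Supersede,  // a newer entry on the same topic replaces it
    Contradict, // a result plus a failure signal refutes it
}

fn step(state: IntentState, event: Event) -> IntentState {
    use IntentState::*;
    match (state, event) {
        (Proposed, Event::Activate) => Active,
        (Active, Event::Complete) => Done,
        (Active, Event::Supersede) => Superseded,
        (Active, Event::Contradict) => Contradicted,
        // Terminal states (and invalid events) leave the state unchanged.
        (s, _) => s,
    }
}

fn main() {
    let s = step(IntentState::Proposed, Event::Activate);
    assert_eq!(s, IntentState::Active);
    assert_eq!(step(s, Event::Complete), IntentState::Done);
    println!("ok");
}
```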
## Docs
- `docs/ARCHITECTURE.md` (module map + data flows)
- `docs/COMMANDS.md` (exact CLI reference + examples)
- `docs/STORE_LAYOUT.md` (store + framework-owned `.ai-context/` layouts)
- `docs/REDACTION.md` (secret redaction, regex engine notes)
- `docs/SOURCE_PROTECTION.md` (opt-in local source protection and sharing policy)
- `docs/DISTILLATION.md` (chunking/distillation model + tuning ideas)
- `docs/RELEASES.md` (release/distribution workflow + maintainer checklist)
## Notes
- Secrets are redacted by default on corpus-building commands (`claude`, `codex`, `all`, `extract`, `store`). Disable only if you know what you’re doing: `--no-redact-secrets`.
- Framework integration expects `aicx` or `aicx-mcp` in `PATH`.
- Native embedding models are never downloaded silently by package install or MCP startup.
Vibecrafted with AI Agents by VetCoders (c)2026 VetCoders