kiromi-ai-cli 0.2.2

Operator and developer CLI for the kiromi-ai-memory store: append, search, snapshot, regenerate, migrate-scheme, gc, audit-tail. Ships the kiromi-ai binary; kiromi-ai-cli is not a library.

cargo install kiromi-ai-cli                       # default: no bundled embedder
cargo install kiromi-ai-cli --features embed-onnx # with fastembed-rs / multilingual-e5-small

Highlights

# initialise a store
kiromi-ai --no-embedder init \
    --storage local:./store --metadata sqlite:./store/metadata.db \
    --tenant local --scheme 'user={user}/topic={topic}'

# append + search
kiromi-ai append --partition 'user=alex/topic=design' \
    --body-file ./note.md --embedding-file ./vec.json
kiromi-ai search 'design rationale' --mode hybrid --top-k 8

# snapshot / restore
kiromi-ai snapshot --tag pre-import
kiromi-ai restore <snapshot-id> --json

# regeneration / migration / GC
kiromi-ai regenerate-embeddings --new-dim 768 --dry-run
kiromi-ai migrate-scheme --to 'user={user}/topic={topic}/year={year}' \
    --mapper json:./mapper.json --dry-run
kiromi-ai gc --json

# context builder for an LLM prompt
kiromi-ai context --focus memory:01J... --budget 16000
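The migrate-scheme example above points at a mapper file (json:./mapper.json) to supply values for keys the old scheme lacks, such as {year}. The sketch below is purely illustrative; the field names are assumptions, not the crate's actual mapper schema:

    {
      "defaults": { "year": "2024" }
    }

Run with --dry-run first, as above, to see the planned partition moves before committing them.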

Default install — no embedder

The default install does not bundle the ONNX runtime. Every append and search either takes --embedding / --embedding-file (a caller-provided vector) or runs against a store opened with the mock embedder family. Operators who want on-device embeddings opt in with --features embed-onnx.
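With the default install, a caller-provided vector might be prepared as in the sketch below. The vec.json format is an assumption here (a flat JSON array of floats whose length matches the store's embedding dimension); check your store's configured dimension before appending.

```shell
# Write a caller-provided embedding. The flat-array format and the
# 4-element dimension are assumptions for illustration only.
printf '[0.12, -0.03, 0.54, 0.91]\n' > vec.json

# Then append it alongside the body, as in the Highlights section:
# kiromi-ai append --partition 'user=alex/topic=design' \
#     --body-file ./note.md --embedding-file ./vec.json
```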

Status

0.2.2. JSON output flags (--json) on the data-touching subcommands are stable and are consumed by the docs and the Swift recipe.
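Because the --json output is stable, it can be scripted against directly. The sample payload below is a hypothetical shape (the real schema may differ); it only illustrates piping subcommand output into a JSON-aware tool such as python3 or jq.

```shell
# Hypothetical search output -- field names are assumptions, not the
# crate's documented schema.
sample='{"results":[{"id":"01JEXAMPLE","score":0.87}]}'

# Extract result IDs the way a wrapper script might:
echo "$sample" | python3 -c 'import json,sys
for r in json.load(sys.stdin)["results"]:
    print(r["id"])'
```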

License

Dual-licensed under Apache-2.0 OR MIT.

Documentation