# ainl-compression

Standalone prompt compression primitives for AINL hosts and external Rust agents.
## Why this crate exists

- Reusable outside ArmaraOS / OpenFang (`cargo add ainl-compression`)
- Minimal dependency surface
- Clear AINL ownership and attribution
## Current scope

- Input prompt compression (`PromptCompressor`)
- Eco modes: `Off`, `Balanced`, `Aggressive`
- Natural-language mode parsing (`EfficientMode::parse_natural_language`)
- Structured telemetry (`CompressionMetrics`)
Output/dense response compression is intentionally out-of-scope for now.
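To make the eco modes concrete, here is a self-contained sketch of what natural-language mode parsing could look like. The keyword rules and the `Option` return type are assumptions for illustration; the crate's actual `EfficientMode::parse_natural_language` may behave differently.

```rust
/// Hypothetical illustration of the crate's eco modes; the real
/// `EfficientMode` may carry different variants or data.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum EfficientMode {
    Off,
    Balanced,
    Aggressive,
}

impl EfficientMode {
    /// Map free-form user phrasing onto a mode (illustrative keyword
    /// matching only, not the crate's actual parsing logic).
    pub fn parse_natural_language(input: &str) -> Option<Self> {
        let s = input.to_lowercase();
        if s.contains("off") || s.contains("disable") {
            Some(Self::Off)
        } else if s.contains("aggressive") || s.contains("max") {
            Some(Self::Aggressive)
        } else if s.contains("balanced") || s.contains("eco") {
            Some(Self::Balanced)
        } else {
            None
        }
    }
}
```

Returning `Option` lets the host fall back to its own default when the user's phrasing does not match any mode.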
## Basic usage

```rust
use ainl_compression::PromptCompressor;

// Constructor arguments and the return type of `compress` are not shown
// here; adjust to the crate's actual API.
let compressor = PromptCompressor::new();
let compressed = compressor.compress("your long prompt here");
println!("{compressed}");
```
## Telemetry callback

```rust
use ainl_compression::PromptCompressor;

// The callback's exact signature is not shown here; a single metrics
// argument is assumed.
let compressor = PromptCompressor::with_telemetry_callback(|metrics| {
    println!("{metrics:?}");
});
let _ = compressor.compress("your long prompt here");
```
## Optional feature: `graph-telemetry`

Enable `graph-telemetry` when your host wants to serialize telemetry structures for graph/event pipelines:
```toml
[dependencies]
ainl-compression = { version = "0.1.0-alpha", features = ["graph-telemetry"] }
```
This adds serde derives for shared telemetry structs without coupling this crate to any specific graph/memory runtime implementation.
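Feature-gated derives like this are commonly written with `cfg_attr`, so the serde dependency is only pulled in when the feature is enabled. The struct below is a sketch of the pattern only; the field names are assumptions, not the crate's actual `CompressionMetrics` schema.

```rust
// Illustrative pattern, not the crate's actual source: serde derives are
// applied only when the (assumed) `graph-telemetry` feature is enabled.
#[cfg_attr(feature = "graph-telemetry", derive(serde::Serialize, serde::Deserialize))]
#[derive(Debug, Clone, Default)]
pub struct CompressionMetrics {
    // Hypothetical fields for illustration.
    pub input_tokens: usize,
    pub output_tokens: usize,
}

fn main() {
    let m = CompressionMetrics { input_tokens: 120, output_tokens: 48 };
    // With the feature off, the struct still works as a plain value type.
    println!("{m:?}");
}
```

With the feature disabled, downstream hosts pay no serde compile cost; with it enabled, the same structs become directly serializable for graph/event pipelines.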
## ArmaraOS integration model

- This crate stays runtime-agnostic.
- ArmaraOS/OpenFang can:
  - persist aggregate metrics to `openfang-memory`
  - attach turn-level telemetry into episodic trace metadata in graph memory
That keeps the crate externally reusable while still advancing unified graph execution tracing inside ArmaraOS.