Crate noos


§Noos — Reliability infrastructure for Rust LLM agents

Regulator sits between your agent’s retry loop and your LLM. Every turn, you emit a handful of events — user message, LLM response, tokens spent, quality signal when you have one. Regulator returns a Decision: Continue, ScopeDriftWarn, CircuitBreak, ProceduralWarning, or LowConfidenceSpans. Your loop branches on the variant and keeps moving.

Nothing in Noos wraps your LLM client. There is no framework lock-in and no runtime dependency on a specific model. The event surface is a single enum your code owns.
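To make the shape of that event surface concrete, here is a self-contained stand-in modeled on the three variants used in the quick start below. It is an illustration only: the real noos::LLMEvent may carry more variants and different field types.

```rust
// Hypothetical stand-in for noos::LLMEvent, modeled on the variants
// shown in the quick start; the real enum may differ.
#[derive(Debug)]
enum LLMEvent {
    TurnStart { user_message: String },
    TurnComplete { full_response: String },
    Cost {
        tokens_in: u32,
        tokens_out: u32,
        wallclock_ms: u64,
        provider: Option<String>,
    },
}

// Because events are plain owned values, any agent loop can emit them
// without depending on a specific LLM client or framework.
fn describe(event: &LLMEvent) -> &'static str {
    match event {
        LLMEvent::TurnStart { .. } => "turn-start",
        LLMEvent::TurnComplete { .. } => "turn-complete",
        LLMEvent::Cost { .. } => "cost",
    }
}
```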

§Quick start

use noos::{Decision, LLMEvent, Regulator};

let mut regulator = Regulator::for_user("alice").with_cost_cap(2_000);

regulator.on_event(LLMEvent::TurnStart {
    user_message: "Refactor fetch_user to be async".into(),
});

// ... call your LLM of choice ...

// `response_text`, `tokens_in`, `tokens_out`, and `wallclock_ms`
// come from your LLM client's response.
regulator.on_event(LLMEvent::TurnComplete {
    full_response: response_text,
});
regulator.on_event(LLMEvent::Cost {
    tokens_in, tokens_out, wallclock_ms, provider: None,
});

match regulator.decide() {
    Decision::Continue => { /* send response to user */ }
    Decision::CircuitBreak { .. } => { /* halt the retry loop */ }
    _ => { /* handle warning variants */ }
}

See docs/regulator-guide.md for the full event contract, decision-handling recipes, and gotchas, and docs/app-contract.md for the semantic contract between Noos and your application.
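Wired into an agent loop, the branch-on-variant pattern typically looks like the sketch below. Everything here is hypothetical scaffolding (the simplified Decision stand-in, the run_loop helper, the max_turns budget), not the crate's API; it only illustrates one policy: warnings log and continue, CircuitBreak halts.

```rust
// Hypothetical stand-in for noos::Decision; the real variants carry
// richer payloads (see the crate docs).
enum Decision {
    Continue,
    ScopeDriftWarn,
    CircuitBreak { reason: String },
    ProceduralWarning,
    LowConfidenceSpans,
}

// One illustrative policy, driven by a pre-recorded decision sequence
// standing in for successive `regulator.decide()` calls.
// Returns the number of turns actually executed.
fn run_loop(decisions: Vec<Decision>, max_turns: usize) -> usize {
    let mut turns = 0;
    for decision in decisions.into_iter().take(max_turns) {
        turns += 1;
        match decision {
            Decision::Continue => { /* deliver response to the user */ }
            Decision::CircuitBreak { reason } => {
                eprintln!("circuit break: {reason}");
                break; // halt the retry loop
            }
            // Warning variants: surface to logs, keep the loop moving.
            Decision::ScopeDriftWarn
            | Decision::ProceduralWarning
            | Decision::LowConfidenceSpans => {
                eprintln!("warning issued; continuing");
            }
        }
    }
    turns
}
```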

§Advanced: direct cognitive-session access

Underneath Regulator runs session::CognitiveSession, a pipeline that produces continuous signals (conservation, confidence, strategy recommendation, gain mode) plus delta-modulation output for local Mamba/SSM inference (the latter requires the candle feature flag).

Most integrations do not need this layer. Use CognitiveSession directly only if you need raw continuous signals for a custom policy, or if you are running local Mamba inference (a 1.86 % perplexity reduction on emotional text; 3 runs bit-identical).
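As a rough sketch of what "a custom policy on raw continuous signals" means, the snippet below thresholds two of the signals named above. The struct, field ranges, and thresholds are all invented for illustration; they are not CognitiveSession's actual output types.

```rust
// Invented stand-in for two of the continuous signals; field names and
// the assumed 0.0..=1.0 ranges are illustration only, not the real API.
struct Signals {
    conservation: f64,
    confidence: f64,
}

// The point of dropping below Regulator is choosing your own policy:
// here, hypothetical thresholds map raw signals to an action label.
fn custom_policy(s: &Signals) -> &'static str {
    if s.confidence < 0.3 {
        "escalate"
    } else if s.conservation < 0.5 {
        "tighten-scope"
    } else {
        "proceed"
    }
}
```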

Re-exports§

pub use regulator::CircuitBreakReason;
pub use regulator::ConfidenceSpan;
pub use regulator::CorrectionPattern;
pub use regulator::Decision;
pub use regulator::LLMEvent;
pub use regulator::Regulator;
pub use regulator::RegulatorState;
pub use regulator::correction::CorrectionStore;
pub use regulator::cost::CostAccumulator;
pub use regulator::scope::ScopeTracker;
pub use regulator::token_stats::TokenStatsAccumulator;
pub use errors::NoosError;
pub use errors::NoosResult;
pub use types::belief::AffectState;
pub use types::belief::AffectValence;
pub use types::belief::SharedBeliefState;
pub use types::gate::GateContext;
pub use types::gate::GateResult;
pub use types::gate::GateType;
pub use types::intervention::CognitiveSignals;
pub use types::intervention::CognitiveState;
pub use types::intervention::DeltaModulation;
pub use types::intervention::ForwardResult;
pub use types::intervention::HiddenStateStats;
pub use types::intervention::InterventionDepth;
pub use types::intervention::LayerTarget;
pub use types::intervention::SamplingOverride;
pub use types::world::GainMode;
pub use types::world::LearnedState;
pub use types::world::WorldModel;
pub use cognition::convergence::converge;
pub use cognition::convergence::ConvergenceContext;
pub use cognition::convergence::ConvergenceResult;
pub use cognition::delta_modulation::compute_delta_modulation;
pub use cognition::intervention::build_cognitive_state;
pub use cognition::intervention::compute_sampling_override;
pub use cognition::signals::compute_signals;
pub use memory::retrieval::hybrid_recall;
pub use memory::retrieval::ActivatedAtom;
pub use memory::retrieval::ActivationSource;
pub use memory::retrieval::RecallOptions;
pub use memory::store::AtomUpdate;
pub use memory::store::InMemoryStore;
pub use memory::store::MemoryStore;
pub use types::memory::AtomSource;
pub use types::memory::AtomType;
pub use types::memory::MemoryAtom;
pub use types::memory::Synapse;
pub use types::memory::SynapseType;

Modules§

ai
AI provider abstraction (Phase 4).
cognition
errors
Noos error types — structured errors with context (P5: fail-open ≠ swallow).
inference
kernel
Kernel — plugin system + pipeline infrastructure (Phase 4).
math
memory
regulator
Regulator — reliability layer for LLM agent loops (Path 2).
session
Cognitive Session — the public API for applications.
types