laminae 0.4.1

The missing layer between raw LLMs and production AI — personality, safety, sandboxing, containment.

Meta-crate that re-exports all Laminae layers. Add this one dependency to get the full stack.

Installation

[dependencies]
laminae = "0.4"
tokio = { version = "1", features = ["full"] }

Or pick individual layers:

laminae-psyche = "0.4"    # Multi-agent cognitive pipeline
laminae-persona = "0.4"   # Voice extraction & enforcement
laminae-cortex = "0.4"    # Self-improving learning loop
laminae-shadow = "0.4"    # Adversarial red-teaming
laminae-glassbox = "0.4"  # I/O containment
laminae-ironclad = "0.4"  # Process sandbox
laminae-ollama = "0.4"    # Ollama client

The Layers

Layer     Module               What It Does
Psyche    laminae::psyche      Id + Superego shape the Ego's response with invisible context
Persona   laminae::persona     Voice extraction from samples, style enforcement, AI phrase detection
Cortex    laminae::cortex      Tracks user edits, detects patterns, learns reusable instructions
Shadow    laminae::shadow      Automated security auditing of AI output
Ironclad  laminae::ironclad    Command whitelist, network sandbox, resource watchdog
Glassbox  laminae::glassbox    Input/output validation, rate limiting, path protection

Plus laminae::ollama for local LLM inference.
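To make the Ironclad row concrete: a command whitelist reduces to checking the program name against an allow-list before anything is spawned. The sketch below is illustrative only, not the laminae-ironclad API — `is_allowed` and the whitelist contents are invented for the example.

```rust
// Illustrative only: NOT the laminae-ironclad API. A command whitelist
// boils down to comparing the program name against an allow-list
// before spawning anything.
fn is_allowed(command_line: &str, whitelist: &[&str]) -> bool {
    // The first whitespace-separated token is the program to run.
    let program = command_line.split_whitespace().next().unwrap_or("");
    whitelist.contains(&program)
}

fn main() {
    let whitelist = ["ls", "cat", "echo"];
    assert!(is_allowed("ls -la", &whitelist));
    assert!(!is_allowed("rm -rf /", &whitelist));
    println!("whitelist checks passed");
}
```

A real sandbox would also canonicalize paths and inspect arguments; this shows only the core allow-list decision.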

Quick Example

use laminae::psyche::{PsycheEngine, EgoBackend};
use laminae::ollama::OllamaClient;

// A minimal Ego backend that echoes the prompt; in practice this is
// where you would call your LLM of choice.
struct MyEgo;

impl EgoBackend for MyEgo {
    fn complete(&self, _system: &str, user_msg: &str, _ctx: &str)
        -> impl std::future::Future<Output = anyhow::Result<String>> + Send
    {
        let msg = user_msg.to_string();
        async move { Ok(format!("Response to: {msg}")) }
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let engine = PsycheEngine::new(OllamaClient::new(), MyEgo);
    let response = engine.reply("What is creativity?").await?;
    println!("{response}");
    Ok(())
}
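The EgoBackend impl above returns `impl Future` from a trait method (return-position impl Trait in traits, stable since Rust 1.75) rather than using `async fn` or the async-trait crate, which lets the `+ Send` bound be stated explicitly. The pattern stands on its own; here is a self-contained sketch with a toy single-poll executor — the `Backend` and `Echo` names are invented for illustration, and in real code tokio's runtime plays the role of `block_on`.

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Same shape as EgoBackend::complete: the trait method returns an
// `impl Future`, so each implementor hands back its own future type.
trait Backend {
    fn complete(&self, msg: &str) -> impl Future<Output = String> + Send;
}

// Toy implementor (invented for this sketch): it clones the input so
// the returned future owns its data and stays `Send`.
struct Echo;

impl Backend for Echo {
    fn complete(&self, msg: &str) -> impl Future<Output = String> + Send {
        let msg = msg.to_string();
        async move { format!("echo: {msg}") }
    }
}

// Minimal executor for futures that complete without waiting on I/O;
// a real runtime (tokio) does this job with proper wakeups.
fn block_on<F: Future>(fut: F) -> F::Output {
    fn noop_waker() -> Waker {
        fn clone(_: *const ()) -> RawWaker {
            RawWaker::new(std::ptr::null(), &VTABLE)
        }
        fn noop(_: *const ()) {}
        static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
        unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) }
    }
    let mut fut = pin!(fut);
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(v) = fut.as_mut().poll(&mut cx) {
            return v;
        }
    }
}

fn main() {
    let out = block_on(Echo.complete("hello"));
    assert_eq!(out, "echo: hello");
    println!("{out}");
}
```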

See the examples for Claude API, OpenAI API, Shadow auditing, and full-stack integration.

License

Apache-2.0 - Copyright 2026 Orel Ohayon.