// rusty_genius/lib.rs
//! # Rusty-Genius: The Nervous System for AI
//!
//! **A high-performance, modular, local-first AI orchestration library written in Rust.**
//!
//! Rusty-Genius is built for **on-device orchestration**, prioritizing privacy, low latency,
//! and offline reliability. It decouples protocol, orchestration, engine, and tooling to provide a
//! flexible foundation for modern AI applications.
//!
//! ## Architecture
//!
//! The project follows a biological metaphor, where each component serves a specific function in the "nervous system":
//!
//! - **Genius** (this crate): The Public Facade. Re-exports the internal crates and provides the primary user API.
//! - **Brainstem** ([`brainstem`]): The Orchestrator. Manages the central event loop, engine lifecycle (TTL), and state transitions.
//! - **Cortex** ([`cortex`]): The Muscle. Provides direct bindings to `llama.cpp` for inference, handling KV caching and token streaming.
//! - **Facecrab** ([`facecrab`]): The Supplier. An autonomous asset authority that handles model resolution (HuggingFace), registry management, and downloads.
//! - **Core** ([`core`]): The Shared Vocabulary. Contains protocol enums, manifests, and error definitions.
//!
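//! All of these components are reachable through this facade crate via the `pub use` re-exports
//! defined at the bottom of this file. A minimal sketch of the resulting import paths (the items
//! shown are the aliases from those re-exports plus a protocol type used in the Quick Start below):
//!
//! ```no_run
//! // Aliases provided by the facade's `pub use` re-exports.
//! use rusty_genius::brainstem; // the Orchestrator and its event loop
//! use rusty_genius::cortex;    // llama.cpp inference bindings
//! use rusty_genius::facecrab;  // asset resolution, registry, and downloads
//! use rusty_genius::core::protocol::BrainstemInput; // shared protocol types
//! ```
//!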
//! ## Quick Start
//!
//! The most robust way to use Rusty-Genius is via the [`Orchestrator`]. It manages the background event loop,
//! model lifecycle (loading/unloading), and hardware stubs.
//!
//! ```no_run
//! use rusty_genius::Orchestrator;
//! use rusty_genius::core::protocol::{AssetEvent, BrainstemInput, BrainstemOutput, InferenceEvent};
//! use futures::{StreamExt, sink::SinkExt, channel::mpsc};
//!
//! #[async_std::main]
//! async fn main() -> Result<(), Box<dyn std::error::Error>> {
//!     // 1. Initialize the orchestrator (with default 5m TTL)
//!     let mut genius = Orchestrator::new().await?;
//!     let (mut input, rx) = mpsc::channel(100);
//!     let (tx, mut output) = mpsc::channel(100);
//!
//!     // Spawn the Brainstem event loop in a background task
//!     async_std::task::spawn(async move {
//!         if let Err(e) = genius.run(rx, tx).await {
//!             eprintln!("Orchestrator error: {}", e);
//!         }
//!     });
//!
//!     // 2. Load a model (downloads from HuggingFace if not cached)
//!     // The AssetAuthority (Facecrab) handles resolution and downloading automatically.
//!     input.send(BrainstemInput::LoadModel(
//!         "tiny-model".into()
//!     )).await?;
//!
//!     // 3. Submit a prompt
//!     input.send(BrainstemInput::Infer {
//!         prompt: "Once upon a time...".into(),
//!         config: Default::default(),
//!     }).await?;
//!
//!     // 4. Stream results
//!     // The Cortex engine streams tokens back through the channel
//!     while let Some(msg) = output.next().await {
//!         match msg {
//!             BrainstemOutput::Asset(a) => match a {
//!                 AssetEvent::Complete(path) => println!("Model ready at: {}", path),
//!                 AssetEvent::Error(e) => eprintln!("Download error: {}", e),
//!                 _ => {}
//!             },
//!             BrainstemOutput::Event(e) => match e {
//!                 InferenceEvent::Content(c) => print!("{}", c),
//!                 InferenceEvent::Complete => break,
//!                 _ => {}
//!             },
//!             BrainstemOutput::Error(err) => {
//!                 eprintln!("Error: {}", err);
//!                 break;
//!             }
//!         }
//!     }
//!
//!     Ok(())
//! }
//! ```
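//!
//! Beyond this crate, the example relies on two ecosystem crates: `futures` for the channels and
//! the `StreamExt`/`SinkExt` traits, and `async-std` for the runtime (its `attributes` feature is
//! required for `#[async_std::main]`). A sketch of the corresponding manifest entries; the
//! version numbers are illustrative:
//!
//! ```toml
//! [dependencies]
//! futures = "0.3"
//! async-std = { version = "1", features = ["attributes"] }
//! ```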
//!
//! ## Hardware Acceleration
//!
//! To enable hardware acceleration, turn on the appropriate feature in `Cargo.toml` (a manifest sketch follows the list):
//!
//! - **Metal**: `features = ["metal"]` (macOS Apple Silicon/Intel)
//! - **CUDA**: `features = ["cuda"]` (NVIDIA GPUs)
//! - **Vulkan**: `features = ["vulkan"]` (Generic/Intel GPUs)
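//!
//! A minimal `Cargo.toml` sketch, assuming the package is published under the crate name used
//! throughout these docs (the version is a placeholder); the feature names are the ones listed above:
//!
//! ```toml
//! [dependencies]
//! # Pick the backend that matches your hardware (Metal shown here).
//! rusty_genius = { version = "*", features = ["metal"] }
//! ```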

/// The Supplier: Asset management, model registry, and downloads (Facecrab).
pub use facecrab;

/// The Shared Vocabulary: Protocol enums, manifests, and error definitions.
pub use rusty_genius_core as core;

/// The Muscle: Inference engine bindings, KV cache, and logic processing (Cortex).
pub use rusty_genius_cortex as cortex;

/// The Orchestrator: Central event loop, lifecycle management, and strategy (Brainstem).
pub use rusty_genius_stem as brainstem;

// Convenience exports

/// Main entry point for the orchestration event loop.
pub use crate::brainstem::Orchestrator;

/// Top-level error type for the library.
pub use crate::core::error::GeniusError;