Orichalcum: An Agent Orchestration Framework in Rust
License: MIT | Crates.io: v0.4.0 | Docs: docs.rs
A brutally-safe, composable agent orchestration framework for building complex, multi-step workflows.
Status
⚠️ This library is in early development (0.x). The API is unstable and may change.
What is Orichalcum?
You've looked at LLM agent frameworks and thought, "This is neat. But is it memory-safe?"
You crave the sweet agony of the borrow checker. You yearn for the moral superiority that comes with writing everything in Rust. You, my friend, are a true masochist. And this is the LLM framework for you.
Orichalcum is a spiritual successor to Python's PocketFlow, inheriting its philosophy of extreme composability. It lets you define complex workflows (or "Flows") by chaining together simple, reusable components ("Nodes"). Each Node is a self-contained unit of work that can read from and write to a shared state, deciding which Node to execute next.
Core Concepts
- Node: The fundamental unit of work. A `Node` encapsulates a piece of logic with three steps: `prep` (prepare inputs), `exec` (execute the core logic), and `post` (process results and update state).
- Flow: A special `Node` that orchestrates a graph of other `Node`s. It manages the execution sequence based on the outputs of each `Node`.
- Shared State: A `HashMap` that is passed through the entire `Flow`. Nodes can read from this state to get context and write to it to pass results to subsequent nodes.
- Semantic Layer (v0.4.0): Define structural contracts for your nodes using `Signature`. This allows for compile-time or runtime validation of your workflows.
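The prep/exec/post lifecycle described above can be sketched in plain, self-contained Rust. Note that the trait, method, and type names below are illustrative stand-ins for the concept, not the crate's actual API:

```rust
use std::collections::HashMap;

// Shared state passed between nodes (illustrative: real state types may differ).
type Shared = HashMap<String, String>;

// Hypothetical Node trait mirroring the three-step lifecycle.
trait Node {
    // Gather inputs from the shared state.
    fn prep(&self, shared: &Shared) -> String;
    // Run the core logic on the prepared input.
    fn exec(&self, input: String) -> String;
    // Write results back; return an action label naming the next node.
    fn post(&self, shared: &mut Shared, result: String) -> String;
}

struct Greeter;

impl Node for Greeter {
    fn prep(&self, shared: &Shared) -> String {
        shared.get("name").cloned().unwrap_or_default()
    }
    fn exec(&self, input: String) -> String {
        format!("hello, {input}")
    }
    fn post(&self, shared: &mut Shared, result: String) -> String {
        shared.insert("greeting".into(), result);
        "done".into() // action label deciding where the flow goes next
    }
}

// Drive one node through its full lifecycle.
fn run_node(node: &dyn Node, shared: &mut Shared) -> String {
    let input = node.prep(shared);
    let result = node.exec(input);
    node.post(shared, result)
}

fn main() {
    let mut shared = Shared::new();
    shared.insert("name".into(), "world".into());
    let action = run_node(&Greeter, &mut shared);
    println!("{action}: {}", shared["greeting"]);
}
```

Splitting a node into three steps keeps I/O gathering, core logic, and state mutation separately testable, which is what makes nodes reusable across flows.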
Installation
Add Orichalcum to your project's Cargo.toml:
```toml
[dependencies]
orichalcum = "0.4.0"

# For LLM features (Ollama, Gemini, DeepSeek)
# orichalcum = { version = "0.4.0", features = ["llm"] }

# For Telemetry features (tracing, optimization registry)
# orichalcum = { version = "0.4.0", features = ["telemetry"] }
```
Quick Start: Semantic LLM Nodes (v0.4.0)
The most powerful way to use Orichalcum is via Semantic Nodes. These nodes have defined input/output contracts and are "sealed" for production stability.
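The original snippet here shows a semantic node with a `Signature` contract. Since the crate's exact API is not shown, the following is a self-contained sketch of the idea: a structural contract is validated before the node runs, and the model call is stubbed so the example needs no network access. All names are illustrative, not the real `orichalcum` API:

```rust
use std::collections::HashMap;

// Illustrative stand-in for a `Signature`: a structural contract naming
// the keys a node consumes and produces.
struct Signature {
    inputs: Vec<&'static str>,
    outputs: Vec<&'static str>,
}

impl Signature {
    // Check that every declared input key is present in the shared state.
    fn validate_inputs(&self, shared: &HashMap<String, String>) -> Result<(), String> {
        for key in &self.inputs {
            if !shared.contains_key(*key) {
                return Err(format!("missing input: {key}"));
            }
        }
        Ok(())
    }
}

// A "semantic" node: a signature plus an exec step. A real LLM node would
// call a model here; the stub keeps this sketch self-contained.
fn run_semantic_node(
    sig: &Signature,
    shared: &mut HashMap<String, String>,
) -> Result<(), String> {
    sig.validate_inputs(shared)?;
    let question = shared[sig.inputs[0]].clone();
    // Stubbed model call (no network access in this sketch).
    let answer = format!("echo: {question}");
    for key in &sig.outputs {
        shared.insert((*key).to_string(), answer.clone());
    }
    Ok(())
}

fn main() {
    let sig = Signature { inputs: vec!["question"], outputs: vec!["answer"] };
    let mut shared = HashMap::new();
    shared.insert("question".to_string(), "what is orichalcum?".to_string());
    run_semantic_node(&sig, &mut shared).unwrap();
    println!("{}", shared["answer"]);
}
```

Validating the contract before execution is what makes a node "sealed": a flow that wires an output key to a later node's input key can be checked before any model is invoked.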
Traditional Example: A Simple Sync Flow
Orichalcum still supports pure Rust logic nodes for local processing.
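A Flow, as described under Core Concepts, routes between nodes based on the action each one returns. Here is a minimal self-contained sketch of that loop in plain Rust; the function-based node type and the `"end"` sentinel are illustrative choices, not the crate's actual API:

```rust
use std::collections::HashMap;

type Shared = HashMap<String, i32>;

// A node here is just a function: it updates the shared state and returns
// the name of the next node, or "end" to stop (illustrative convention).
type NodeFn = fn(&mut Shared) -> String;

fn double(shared: &mut Shared) -> String {
    let v = shared["value"];
    shared.insert("value".into(), v * 2);
    // Route back to itself until the value reaches the threshold.
    if shared["value"] >= 16 { "end".into() } else { "double".into() }
}

// A tiny Flow: a registry of named nodes plus a start node, executed
// until some node routes to "end".
fn run_flow(nodes: &HashMap<&str, NodeFn>, start: &str, shared: &mut Shared) {
    let mut current = start.to_string();
    while current != "end" {
        let node = nodes[current.as_str()];
        current = node(shared);
    }
}

fn main() {
    let mut nodes: HashMap<&str, NodeFn> = HashMap::new();
    nodes.insert("double", double);
    let mut shared = Shared::new();
    shared.insert("value".into(), 1);
    run_flow(&nodes, "double", &mut shared);
    println!("{}", shared["value"]); // 1 -> 2 -> 4 -> 8 -> 16
}
```

Because routing decisions come from the nodes themselves, loops and branches fall out of the same mechanism as straight-line sequences.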
Features
- Semantic Layer: Define I/O contracts with `Signature` for brutally-safe data flow.
- Telemetry (v0.4.0): Built-in tracing for I/O, model names, and execution timestamps.
- Unified LLM Builders: Fluent API for `Gemini`, `DeepSeek`, and `Ollama`.
- Async & Parallel: First-class support for `tokio` and parallel batch processing.
- Nix Support: Includes `flake.nix` for a reproducible development environment.
Contributing
Contributions are welcome! Please feel free to open an issue or submit a pull request.
License
This project is licensed under the MIT License.