§neuron-types
Foundation crate for the neuron ecosystem. Defines the core types, traits, and
error enums that every other neuron crate depends on. Contains zero logic –
only data structures, trait definitions, and serde implementations. This is the
equivalent of serde’s core: traits live here, implementations live in
satellite crates.
§Key Types
- `Message` – a conversation message with a `Role` and `Vec<ContentBlock>`
- `Role` – `User`, `Assistant`, or `System`
- `ContentBlock` – text, thinking, tool use/result, image, or document
- `CompletionRequest` – full LLM request: model, messages, system prompt, tools, temperature, thinking config
- `CompletionResponse` – LLM response: message, token usage, stop reason
- `ToolDefinition` – tool name, description, and JSON Schema for input
- `ToolOutput` – tool execution result with content items and optional structured JSON
- `ToolContext` – runtime context (cwd, session ID, environment, cancellation token)
- `TokenUsage` – input/output/cache/reasoning token counts
- `StopReason` – why the model stopped: `EndTurn`, `ToolUse`, `MaxTokens`, `StopSequence`, `ContentFilter`
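As a hedged illustration of how `StopReason` is typically consumed in an agent loop, the sketch below uses the variant names from the list above but defines a local stand-in enum so it runs without the crate; the `next_step` helper and its messages are illustrative, not part of the crate's API:

```rust
// Standalone stand-in mirroring the documented StopReason variants
// (the real type lives in neuron_types).
#[derive(Debug, PartialEq)]
enum StopReason {
    EndTurn,
    ToolUse,
    MaxTokens,
    StopSequence,
    ContentFilter,
}

// Decide what an agent loop does next based on why the model stopped.
fn next_step(reason: &StopReason) -> &'static str {
    match reason {
        StopReason::EndTurn => "done",
        StopReason::ToolUse => "run the requested tool, then continue",
        StopReason::MaxTokens => "truncated: raise max_tokens or compact context",
        StopReason::StopSequence => "stopped on a configured stop sequence",
        StopReason::ContentFilter => "response was filtered",
    }
}

fn main() {
    assert_eq!(next_step(&StopReason::EndTurn), "done");
    println!("{}", next_step(&StopReason::ToolUse));
}
```

An exhaustive `match` like this is the usual reason such enums are non-`#[non_exhaustive]` foundation types: the compiler flags any caller that forgets a stop condition.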
§Key Traits
- `Provider` – LLM provider with `complete()` and `complete_stream()` (RPITIT, not object-safe)
- `Tool` – strongly typed tool with `NAME`, `Args`, `Output`, `Error` associated types
- `ToolDyn` – type-erased tool for heterogeneous registries (blanket-implemented for all `Tool` impls)
- `ContextStrategy` – context compaction: `should_compact()`, `compact()`, `token_estimate()`
- `ObservabilityHook` – logging/metrics/telemetry hooks with `Continue`/`Skip`/`Terminate` actions
- `DurableContext` – wraps side effects for durable execution engines (Temporal, Restate)
- `PermissionPolicy` – tool call permission checks returning `Allow`/`Deny`/`Ask`
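The `Tool`/`ToolDyn` split described above follows a common type-erasure pattern: a strongly typed trait with associated types is not object-safe, so a blanket impl adapts every implementor to a uniform, boxable trait. The sketch below is not the crate's actual API (the `run`/`run_dyn` method names and the plain-`String` payloads are illustrative assumptions, and the `Error` associated type is omitted for brevity); it only shows the shape of the pattern:

```rust
// Hypothetical strongly typed tool trait, in the spirit of the list above.
trait Tool {
    const NAME: &'static str;
    type Args;
    type Output;
    fn run(&self, args: Self::Args) -> Self::Output;
}

// Type-erased counterpart: payloads are plain Strings so heterogeneous
// tools can live together behind `Box<dyn ToolDyn>`.
trait ToolDyn {
    fn name(&self) -> &'static str;
    fn run_dyn(&self, args: &str) -> String;
}

// Blanket impl: any Tool whose Args parse from a string and whose Output
// displays as a string automatically gets the erased interface.
impl<T> ToolDyn for T
where
    T: Tool,
    T::Args: std::str::FromStr,
    <T::Args as std::str::FromStr>::Err: std::fmt::Debug,
    T::Output: std::fmt::Display,
{
    fn name(&self) -> &'static str {
        T::NAME
    }
    fn run_dyn(&self, args: &str) -> String {
        let args = args.parse::<T::Args>().expect("bad args");
        self.run(args).to_string()
    }
}

// A trivial tool to exercise the registry.
struct Doubler;

impl Tool for Doubler {
    const NAME: &'static str = "doubler";
    type Args = i64;
    type Output = i64;
    fn run(&self, args: i64) -> i64 {
        args * 2
    }
}

fn main() {
    // Heterogeneous registry: only ToolDyn is object-safe, so it is
    // what goes in the Vec.
    let tools: Vec<Box<dyn ToolDyn>> = vec![Box::new(Doubler)];
    assert_eq!(tools[0].run_dyn("21"), "42");
    println!("{} ok", tools[0].name());
}
```

In the real crate the erased payloads would presumably be JSON (matching `ToolDefinition`'s JSON Schema input), but the mechanics of the blanket impl are the same.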
§Usage
```rust
use neuron_types::{Message, Role, ContentBlock, CompletionRequest};

// Construct a message with text content
let message = Message {
    role: Role::User,
    content: vec![ContentBlock::Text("What is 2 + 2?".into())],
};

// Build a completion request
let request = CompletionRequest {
    model: "claude-sonnet-4-20250514".into(),
    messages: vec![message],
    system: Some("You are a calculator.".into()),
    tools: vec![],
    max_tokens: Some(1024),
    temperature: Some(0.0),
    top_p: None,
    stop_sequences: vec![],
    tool_choice: None,
    response_format: None,
    thinking: None,
    reasoning_effort: None,
    extra: None,
};
```

Implementing the `Provider` trait (Rust 2024 native async, no `#[async_trait]`):
```rust
use neuron_types::*;

struct MyProvider { /* ... */ }

impl Provider for MyProvider {
    fn complete(&self, request: CompletionRequest)
        -> impl Future<Output = Result<CompletionResponse, ProviderError>> + Send
    {
        async { todo!() }
    }

    fn complete_stream(&self, request: CompletionRequest)
        -> impl Future<Output = Result<StreamHandle, ProviderError>> + Send
    {
        async { todo!() }
    }
}
```

§Part of neuron
This crate is part of neuron, a composable building-blocks library for AI agents in Rust.
§License
Licensed under either of Apache License, Version 2.0 or MIT License at your option.