# neuron-types
Foundation crate for the neuron ecosystem. Defines the core types, traits, and
error enums that every other neuron crate depends on. Contains zero logic --
only data structures, trait definitions, and serde implementations. This is the
equivalent of serde's core: traits live here, implementations live in
satellite crates.
## Installation
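Assuming the crate is published to crates.io under this name, add it to your `Cargo.toml` (substitute the current version):

```toml
[dependencies]
neuron-types = "0.1"
```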
## Key Types
- `Message` -- a conversation message with a `Role` and `Vec<ContentBlock>`. Convenience constructors: `Message::user()`, `Message::assistant()`, `Message::system()`.
- `Role` -- `User`, `Assistant`, or `System`
- `ContentBlock` -- text, thinking, tool use/result, image, document, or server-side compaction summary
- `CompletionRequest` -- full LLM request: model, messages, system prompt, tools, temperature, thinking config, context management
- `CompletionResponse` -- LLM response: message, token usage, stop reason
- `ToolDefinition` -- tool name, description, and JSON Schema for input
- `ToolOutput` -- tool execution result with content items and optional structured JSON
- `ToolContext` -- runtime context (cwd, session ID, environment, cancellation token). Implements `Default` for zero-config construction.
- `TokenUsage` -- input/output/cache/reasoning token counts
- `UsageIteration` -- per-iteration token breakdown (during server-side compaction)
- `ContextManagement`, `ContextEdit` -- server-side context management configuration
- `StopReason` -- why the model stopped: `EndTurn`, `ToolUse`, `MaxTokens`, `StopSequence`, `ContentFilter`, `Compaction`
- `EmbeddingRequest` -- embedding model request: model, input texts, optional dimensions
- `EmbeddingResponse` -- embedding model response: vectors, usage
- `EmbeddingUsage` -- token counts for an embedding request
- `UsageLimits` -- token usage budget constraints (request tokens, response tokens, total tokens) enforced by the agentic loop
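To illustrate how a budget type like `UsageLimits` might be consumed by an agentic loop, here is a self-contained sketch. The types below are local stand-ins defined for this example only; the real crate's field names and methods may differ:

```rust
// Local stand-ins for illustration; the real neuron-types
// definitions may differ in fields and method names.
struct TokenUsage {
    input_tokens: u64,
    output_tokens: u64,
}

struct UsageLimits {
    max_total_tokens: u64,
}

impl UsageLimits {
    /// Returns true if cumulative usage is still within the total-token budget.
    fn within_budget(&self, usage: &TokenUsage) -> bool {
        usage.input_tokens + usage.output_tokens <= self.max_total_tokens
    }
}

fn main() {
    let limits = UsageLimits { max_total_tokens: 1000 };

    // 600 + 300 = 900 tokens: still under the 1000-token budget.
    let usage = TokenUsage { input_tokens: 600, output_tokens: 300 };
    assert!(limits.within_budget(&usage));

    // 800 + 400 = 1200 tokens: over budget, so the loop would stop here.
    let over = TokenUsage { input_tokens: 800, output_tokens: 400 };
    assert!(!limits.within_budget(&over));
}
```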
## Key Traits
- `Provider` -- LLM provider with `complete()` and `complete_stream()` (RPITIT, not object-safe)
- `EmbeddingProvider` -- embedding model provider with `embed()` (RPITIT, separate from `Provider`)
- `Tool` -- strongly typed tool with `NAME`, `Args`, `Output`, `Error` associated types
- `ToolDyn` -- type-erased tool for heterogeneous registries (blanket-implemented for all `Tool` impls)
- `ContextStrategy` -- context compaction: `should_compact()`, `compact()`, `token_estimate()`
- `ObservabilityHook` -- logging/metrics/telemetry hooks returning a `HookAction`:
  - `Continue` -- proceed normally
  - `Skip` -- reject the tool call and return the reason as a tool result so the model can adapt
  - `Terminate` -- halt the loop immediately
- `DurableContext` -- wraps side effects for durable execution engines (Temporal, Restate)
- `PermissionPolicy` -- tool call permission checks returning:
  - `Allow` -- permit the call
  - `Deny(reason)` -- block the call
  - `Ask(question)` -- prompt the user for confirmation
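To make the three `HookAction` outcomes concrete, here is a self-contained sketch of how an agentic loop might dispatch on a hook's decision. The `HookAction` enum below is a local stand-in defined for this example; the real trait and enum shapes may differ:

```rust
// Local stand-in for neuron-types' HookAction; illustrative only.
enum HookAction {
    Continue,
    Skip(String),
    Terminate,
}

/// How an agentic loop might dispatch on a hook's decision.
/// Returns Some(tool result text) if the loop should keep going,
/// or None if it should halt.
fn apply_hook_action(action: HookAction) -> Option<String> {
    match action {
        // Proceed normally: run the tool for real.
        HookAction::Continue => Some("tool executed".to_string()),
        // Reject the call but surface the reason as a tool result
        // so the model can adapt.
        HookAction::Skip(reason) => Some(format!("tool call skipped: {reason}")),
        // Halt the loop immediately.
        HookAction::Terminate => None,
    }
}

fn main() {
    assert_eq!(
        apply_hook_action(HookAction::Continue).as_deref(),
        Some("tool executed")
    );
    assert!(apply_hook_action(HookAction::Terminate).is_none());
}
```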
## Usage
```rust
use neuron_types::{CompletionRequest, Message};

// Construct a message with a convenience constructor
let message = Message::user("Hello, world!");

// Build a completion request (only specify what you need)
let request = CompletionRequest {
    model: "your-model-name".into(),
    messages: vec![message],
    ..Default::default()
};
```
Implementing the `Provider` trait (Rust 2024 native async, no `#[async_trait]`). This example is marked `ignore` because it's an abstract skeleton; a real implementation would make HTTP calls to an LLM API. The method signature shown is illustrative, so consult the `Provider` trait docs for the exact shape:

```rust,ignore
use neuron_types::*;

struct MyProvider;

impl Provider for MyProvider {
    async fn complete(
        &self,
        request: CompletionRequest,
    ) -> Result<CompletionResponse, ProviderError> {
        // A real implementation would send `request` to an LLM API
        // and map the reply into a CompletionResponse.
        todo!()
    }

    // `complete_stream()` is implemented similarly, yielding a stream
    // of response events instead of a single response.
}
```
## Part of neuron
This crate is part of neuron, a composable building-blocks library for AI agents in Rust.
## License
Licensed under either of Apache License, Version 2.0 or MIT License at your option.