# openinference-rs

Rust implementation of the OpenInference semantic conventions for LLM observability.
## Overview
OpenInference is a set of conventions for instrumenting LLM applications, compatible with OpenTelemetry and designed to work with observability platforms like Arize Phoenix.
This repository provides Rust crates for:

- **Semantic conventions** - attribute constants matching the OpenInference specification
- **OTel GenAI compatibility** - aliases for the OpenTelemetry GenAI semantic conventions
- **Instrumentation helpers** - span builders for easy integration with the `tracing` crate
## Crates

| Crate | Description |
|---|---|
| `openinference-semantic-conventions` | Attribute constants and types |
| `openinference-instrumentation` | Span builders and helpers |
## Quick Start

Add to your `Cargo.toml`:

```toml
[dependencies]
openinference-semantic-conventions = "0.1"
openinference-instrumentation = "0.1"
```
### Basic Usage

```rust
use openinference_instrumentation::LlmSpanBuilder;

// Create an LLM span with OpenInference attributes.
// The builder arguments shown here are illustrative values.
let span = LlmSpanBuilder::new("gpt-4")
    .provider("openai")
    .temperature(0.7)
    .max_tokens(1024)
    .build();

let _guard = span.enter();
// ... perform LLM call ...
```
### Using Attribute Constants Directly

```rust
use openinference_semantic_conventions::{gen_ai, trace};
use opentelemetry::KeyValue;

// OpenInference attributes (constant paths and values are illustrative)
let kind = KeyValue::new(trace::OPENINFERENCE_SPAN_KIND, "LLM");
let model = KeyValue::new(trace::LLM_MODEL_NAME, "gpt-4");
let tokens = KeyValue::new(trace::LLM_TOKEN_COUNT_TOTAL, 100);

// OTel GenAI attributes (for dual compatibility)
let gen_ai_model = KeyValue::new(gen_ai::GEN_AI_REQUEST_MODEL, "gpt-4");
let input_tokens = KeyValue::new(gen_ai::GEN_AI_USAGE_INPUT_TOKENS, 42);
```
## Span Kinds

OpenInference defines the following span kinds:

| Kind | Description |
|---|---|
| `LLM` | Call to a Large Language Model |
| `EMBEDDING` | Call to generate embeddings |
| `CHAIN` | Workflow/pipeline step or glue code |
| `TOOL` | External tool/function execution |
| `AGENT` | Reasoning block using LLMs and tools |
| `RETRIEVER` | Vector store or database query |
| `RERANKER` | Document reranking |
| `GUARDRAIL` | Input/output validation |
| `EVALUATOR` | Model output evaluation |
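The span kinds above can be modeled as a plain Rust enum. This is a minimal sketch to show the shape of the data, not necessarily the crate's actual type; the string values follow the OpenInference spec.

```rust
// Sketch: a span-kind enum mirroring the table above.
// The type and method names here are illustrative assumptions.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum SpanKind {
    Llm,
    Embedding,
    Chain,
    Tool,
    Agent,
    Retriever,
    Reranker,
    Guardrail,
    Evaluator,
}

impl SpanKind {
    /// Spec string recorded under the `openinference.span.kind` attribute.
    fn as_str(self) -> &'static str {
        match self {
            SpanKind::Llm => "LLM",
            SpanKind::Embedding => "EMBEDDING",
            SpanKind::Chain => "CHAIN",
            SpanKind::Tool => "TOOL",
            SpanKind::Agent => "AGENT",
            SpanKind::Retriever => "RETRIEVER",
            SpanKind::Reranker => "RERANKER",
            SpanKind::Guardrail => "GUARDRAIL",
            SpanKind::Evaluator => "EVALUATOR",
        }
    }
}

fn main() {
    // Each variant maps to the exact uppercase spec value.
    println!("{}", SpanKind::Retriever.as_str()); // prints "RETRIEVER"
}
```

Keeping the enum `Copy` and exhaustively matched means adding a new span kind is a compile-time checked change.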
## Dual Attribute Support

This library supports both the OpenInference (`llm.*`) and OTel GenAI (`gen_ai.*`) attribute conventions for maximum compatibility:
```rust
// Import path and argument keys below are reconstructed examples.
use openinference_semantic_conventions::gen_ai::{
    map_gen_ai_to_openinference, map_openinference_to_gen_ai,
};

// Map between conventions
let otel_key = map_openinference_to_gen_ai("llm.model_name");
let oi_key = map_gen_ai_to_openinference("gen_ai.request.model");
```
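Such a mapping is essentially a lookup table between the two key namespaces. The sketch below shows the idea with a few pairs taken from the two specs; the crate's real table may cover more keys, and the function name here is only an assumption.

```rust
// Sketch: mapping OpenInference attribute keys to OTel GenAI equivalents.
// The key pairs come from the respective specs; the function is illustrative.
fn map_openinference_to_gen_ai(key: &str) -> Option<&'static str> {
    match key {
        "llm.model_name" => Some("gen_ai.request.model"),
        "llm.token_count.prompt" => Some("gen_ai.usage.input_tokens"),
        "llm.token_count.completion" => Some("gen_ai.usage.output_tokens"),
        // Keys without a GenAI counterpart map to None.
        _ => None,
    }
}

fn main() {
    println!("{:?}", map_openinference_to_gen_ai("llm.model_name"));
    // prints Some("gen_ai.request.model")
}
```

Returning `Option` rather than panicking lets callers fall back to emitting the original key when no counterpart exists.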
## Integration with Observability Backends

Spans created with this library are compatible with:
- Arize Phoenix (native OpenInference support)
- Grafana Tempo (via OTel)
- Jaeger (via OTel)
- Datadog (OTel GenAI support)
- Honeycomb (via OTel)
- Any OpenTelemetry-compatible backend
## Configuration

### Environment Variables

```sh
# Privacy settings (opt-in content recording)
OPENINFERENCE_HIDE_INPUTS=false
OPENINFERENCE_HIDE_OUTPUTS=false

# OTel exporter configuration
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
OTEL_SERVICE_NAME=my-llm-app
```
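Boolean environment flags like the two privacy settings above are usually parsed leniently. A minimal sketch, assuming the variable names from the spec; the `parse_flag` helper itself is hypothetical, not the crate's API:

```rust
// Sketch: lenient parsing of boolean env flags such as
// OPENINFERENCE_HIDE_INPUTS. The helper name is illustrative.
fn parse_flag(value: Option<&str>) -> bool {
    matches!(
        value.map(str::to_ascii_lowercase).as_deref(),
        Some("1") | Some("true") | Some("yes")
    )
}

/// Whether input content should be redacted from spans.
fn hide_inputs() -> bool {
    parse_flag(std::env::var("OPENINFERENCE_HIDE_INPUTS").ok().as_deref())
}

fn main() {
    println!("hide inputs: {}", hide_inputs());
}
```

Unset or unrecognized values fall back to `false`, matching the opt-in default shown in the snippet above.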
## License
MIT OR Apache-2.0
## Contributing
Contributions are welcome! Please feel free to submit issues and pull requests.
## Related Projects
- OpenInference - Original specification and Python/JS implementations
- Arize Phoenix - AI observability platform
- OpenTelemetry Rust - Rust OTel SDK