openinference-semantic-conventions 0.1.1

OpenInference semantic conventions for LLM observability in Rust

openinference-rs

Rust implementation of OpenInference semantic conventions for LLM observability.


Overview

OpenInference is a set of conventions for instrumenting LLM applications, compatible with OpenTelemetry and designed to work with observability platforms like Arize Phoenix.

This repository provides the following crates:

Crates

Crate                               Description
openinference-semantic-conventions  Attribute constants and types
openinference-instrumentation       Span builders and helpers

Quick Start

Add to your Cargo.toml:

[dependencies]
openinference-semantic-conventions = "0.1"
openinference-instrumentation = "0.1"

Basic Usage

use openinference_semantic_conventions::{attributes, SpanKind};
use openinference_instrumentation::LlmSpanBuilder;

// Create an LLM span with OpenInference attributes
let span = LlmSpanBuilder::new("gpt-4")
    .provider("openai")
    .temperature(0.7)
    .max_tokens(1000)
    .build();

let _guard = span.enter();
// ... perform LLM call ...

Using Attribute Constants Directly

use openinference_semantic_conventions::{attributes, gen_ai, SpanKind};
use opentelemetry::KeyValue;

// OpenInference attributes
let kind = KeyValue::new(attributes::OPENINFERENCE_SPAN_KIND, SpanKind::Llm.as_str());
let model = KeyValue::new(attributes::llm::MODEL_NAME, "gpt-4");
let tokens = KeyValue::new(attributes::llm::token_count::TOTAL, 150i64);

// OTel GenAI attributes (for dual compatibility)
let gen_ai_model = KeyValue::new(gen_ai::request::MODEL, "gpt-4");
let input_tokens = KeyValue::new(gen_ai::usage::INPUT_TOKENS, 100i64);

Span Kinds

OpenInference defines the following span kinds:

Kind Description
LLM Call to a Large Language Model
EMBEDDING Call to generate embeddings
CHAIN Workflow/pipeline step or glue code
TOOL External tool/function execution
AGENT Reasoning block using LLMs and tools
RETRIEVER Vector store or database query
RERANKER Document reranking
GUARDRAIL Input/output validation
EVALUATOR Model output evaluation
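The span kind is recorded on a span as an uppercase string value in the `openinference.span.kind` attribute. As a minimal self-contained sketch (the `SpanKind` enum below is a local stand-in mirroring this crate's type, not the crate itself; the uppercase identifiers follow the OpenInference convention):

```rust
// Illustrative stand-in for the crate's SpanKind type, showing the
// uppercase string values used for the `openinference.span.kind` attribute.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum SpanKind {
    Llm,
    Embedding,
    Chain,
    Tool,
    Agent,
    Retriever,
    Reranker,
    Guardrail,
    Evaluator,
}

impl SpanKind {
    fn as_str(self) -> &'static str {
        match self {
            SpanKind::Llm => "LLM",
            SpanKind::Embedding => "EMBEDDING",
            SpanKind::Chain => "CHAIN",
            SpanKind::Tool => "TOOL",
            SpanKind::Agent => "AGENT",
            SpanKind::Retriever => "RETRIEVER",
            SpanKind::Reranker => "RERANKER",
            SpanKind::Guardrail => "GUARDRAIL",
            SpanKind::Evaluator => "EVALUATOR",
        }
    }
}

fn main() {
    // The attribute value recorded on a span is the uppercase identifier.
    println!("openinference.span.kind = {}", SpanKind::Llm.as_str());
}
```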

Dual Attribute Support

This library supports both OpenInference (llm.*) and OTel GenAI (gen_ai.*) attribute conventions for maximum compatibility:

use openinference_semantic_conventions::gen_ai;

// Map between conventions
let otel_key = gen_ai::map_openinference_to_gen_ai("llm.model_name");
let oi_key = gen_ai::map_gen_ai_to_openinference("gen_ai.request.model");
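Conceptually, such a mapping is a lookup between the two conventions' attribute keys. A minimal sketch of what it might look like (the function below is a hypothetical stand-in, and the key pairs are assumptions drawn from the two conventions' published attribute names, not this crate's internal table):

```rust
// Hypothetical stand-in for a convention-mapping helper: a small lookup
// between OpenInference (`llm.*`) and OTel GenAI (`gen_ai.*`) keys.
fn map_openinference_to_gen_ai(key: &str) -> Option<&'static str> {
    match key {
        "llm.model_name" => Some("gen_ai.request.model"),
        "llm.token_count.prompt" => Some("gen_ai.usage.input_tokens"),
        "llm.token_count.completion" => Some("gen_ai.usage.output_tokens"),
        // Unmapped keys are left to the caller (pass through or drop).
        _ => None,
    }
}

fn main() {
    println!("{:?}", map_openinference_to_gen_ai("llm.model_name"));
}
```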

Integration with Observability Backends

Spans created with this library are compatible with OpenTelemetry-based observability backends such as Arize Phoenix.

Configuration

Environment Variables

# Privacy settings (set to true to redact recorded inputs/outputs)
OPENINFERENCE_HIDE_INPUTS=false
OPENINFERENCE_HIDE_OUTPUTS=false

# OTel exporter configuration
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
OTEL_SERVICE_NAME=my-llm-app
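One plausible way a process might interpret the two privacy flags at startup is sketched below (an assumption for illustration; the crate's actual parsing may differ). Unset or unrecognized values default to false, i.e. content is recorded:

```rust
use std::env;

// Sketch: read OPENINFERENCE_HIDE_INPUTS / OPENINFERENCE_HIDE_OUTPUTS as
// booleans, treating "1", "true", or "yes" (case-insensitive) as true and
// anything else -- including an unset variable -- as false.
fn env_flag(name: &str) -> bool {
    env::var(name)
        .map(|v| matches!(v.to_ascii_lowercase().as_str(), "1" | "true" | "yes"))
        .unwrap_or(false)
}

fn main() {
    let hide_inputs = env_flag("OPENINFERENCE_HIDE_INPUTS");
    let hide_outputs = env_flag("OPENINFERENCE_HIDE_OUTPUTS");
    println!("hide_inputs={hide_inputs} hide_outputs={hide_outputs}");
}
```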

License

MIT OR Apache-2.0

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.

Related Projects