Langfuse interceptor for OpenTelemetry-based LLM observability.
This interceptor automatically instruments OpenAI API calls with OpenTelemetry spans.
You must configure the OpenTelemetry tracer with a Langfuse exporter separately, as shown below.
§Usage
// Imports (the `opentelemetry` paths assume an `opentelemetry_sdk` version
// whose batch processor takes an async runtime; the Langfuse and client
// types come from the Langfuse exporter crate and this crate respectively,
// and their exact paths depend on your crate versions):
use opentelemetry::global;
use opentelemetry::trace::TracerProvider as _; // brings `tracer()` into scope
use opentelemetry_sdk::runtime::Tokio;
use opentelemetry_sdk::trace::{BatchSpanProcessor, SdkTracerProvider};

// 1. Build the Langfuse exporter from environment configuration.
let exporter = ExporterBuilder::from_env()?.build()?;

// 2. Create a tracer provider that batches spans through the exporter.
let provider = SdkTracerProvider::builder()
    .with_span_processor(BatchSpanProcessor::builder(exporter, Tokio).build())
    .build();
global::set_tracer_provider(provider.clone());

// 3. Create the interceptor with a tracer and attach it to the client.
let tracer = provider.tracer("openai-ergonomic");
let interceptor = LangfuseInterceptor::new(tracer, LangfuseConfig::new());
let client = Client::from_env()?
    .with_interceptor(Box::new(interceptor))
    .build();

// Traces are automatically sent to Langfuse.
let response = client.send_chat(client.chat_simple("Hello!")).await?;
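Because the batch processor buffers spans, it is worth flushing and shutting down the provider before the process exits; otherwise the last spans may never reach Langfuse. A minimal sketch, assuming an `opentelemetry_sdk` version where `SdkTracerProvider` exposes `force_flush` and `shutdown` (return types vary across SDK versions):

// Export any spans still sitting in the batch processor's buffer,
// then shut the provider down cleanly before exiting.
provider.force_flush()?;
provider.shutdown()?;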
Structs§
- LangfuseConfig - Configuration for the Langfuse interceptor.
- LangfuseInterceptor - Langfuse interceptor for OpenTelemetry-based observability.
- LangfuseState - State managed by the Langfuse interceptor across the request lifecycle.
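For reference, `ExporterBuilder::from_env()` reads its credentials from the process environment. The exact variable names it expects are an assumption here (check the exporter crate's documentation), but Langfuse's documented convention is sketched below:

// Assumed environment variables, following Langfuse's documented conventions:
//   LANGFUSE_PUBLIC_KEY=pk-lf-...            // project public key
//   LANGFUSE_SECRET_KEY=sk-lf-...            // project secret key
//   LANGFUSE_HOST=https://cloud.langfuse.com // or your self-hosted URL
// With these set in the environment, building the exporter is:
let exporter = ExporterBuilder::from_env()?.build()?;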