#[non_exhaustive]
pub struct ChatExtras {
    pub entropy: Option<f64>,
}
Per-call extras returned alongside the chat response by LlmProvider::chat_with_extras.
Each value is paired 1:1 with a single response, so there is no shared state and no possibility of races.
All optional fields default to None so providers that do not expose the
underlying API (e.g. Claude, Gemini) can simply return the default.
Marked #[non_exhaustive] so future fields (e.g. cached_tokens) can be added
without breaking match sites.
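As a sketch of the "return the default" pattern described above, using a hypothetical local mirror of ChatExtras (the real type lives in zeph_llm and is #[non_exhaustive], so it cannot be constructed like this from an external crate):

```rust
// Hypothetical local mirror of zeph_llm's ChatExtras, for illustration only.
#[derive(Clone, Debug, Default, PartialEq)]
pub struct ChatExtras {
    pub entropy: Option<f64>,
}

// A provider whose underlying API exposes no logprobs can simply return
// the default, leaving every optional field as None.
pub fn extras_for_provider_without_logprobs() -> ChatExtras {
    ChatExtras::default()
}

fn main() {
    let extras = extras_for_provider_without_logprobs();
    assert_eq!(extras.entropy, None);
}
```

Because callers must already use `..` when matching a non-exhaustive struct, adding a field like cached_tokens later only requires extending Default, not touching call sites.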
Fields (Non-exhaustive)

This struct is marked as non-exhaustive: it cannot be constructed outside its crate with Struct { .. } syntax, cannot be matched against without a wildcard .., and struct update syntax will not work.

entropy: Option<f64>
Mean negative log-probability of the generated tokens, when the provider
was configured to request logprobs and the API supplied them.
Lower = more confident. Typical range: [0.0, ~6.0] for natural-language tokens.
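The entropy value is simply the mean of the per-token negative log-probabilities. A minimal sketch of how a provider might compute it from API-supplied logprobs (the helper name is hypothetical, not part of zeph_llm):

```rust
/// Mean negative log-probability over generated tokens.
/// Returns None when no logprobs were supplied, matching the
/// Option<f64> shape of the entropy field.
fn mean_neg_logprob(logprobs: &[f64]) -> Option<f64> {
    if logprobs.is_empty() {
        return None;
    }
    let sum: f64 = logprobs.iter().map(|lp| -lp).sum();
    Some(sum / logprobs.len() as f64)
}

fn main() {
    // Three fairly confident tokens: logprobs close to 0.
    let entropy = mean_neg_logprob(&[-0.1, -0.3, -0.2]).unwrap();
    assert!((entropy - 0.2).abs() < 1e-9);
}
```

Confident tokens have logprobs near 0, so the mean negative log-probability stays small; a very uncertain token (say, logprob -6.0) pulls the value toward the upper end of the typical range.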
Implementations

impl ChatExtras

pub fn with_entropy(entropy: f64) -> Self
Return a ChatExtras with the given entropy value.
Used by MockProvider (test-only, enabled via the testing feature) and by the OpenAI and Ollama providers.
Examples
use zeph_llm::provider::ChatExtras;
let extras = ChatExtras::with_entropy(0.9);
assert_eq!(extras.entropy, Some(0.9));

Trait Implementations
impl Clone for ChatExtras

fn clone(&self) -> ChatExtras

fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
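clone_from performs copy-assignment in place; for a struct this small it is equivalent to *self = source.clone(). A short sketch using a hypothetical local mirror of the struct:

```rust
// Hypothetical local mirror of ChatExtras, for illustration only.
#[derive(Clone, Debug, PartialEq)]
pub struct ChatExtras {
    pub entropy: Option<f64>,
}

fn main() {
    let src = ChatExtras { entropy: Some(0.5) };
    let mut dst = ChatExtras { entropy: None };
    // Overwrites dst with src's contents; derived Clone uses the
    // default clone_from, i.e. *self = source.clone().
    dst.clone_from(&src);
    assert_eq!(dst.entropy, Some(0.5));
}
```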