pub struct ChatTemplateResult {
    pub prompt: String,
    pub grammar: Option<String>,
    pub grammar_lazy: bool,
    pub grammar_triggers: Vec<GrammarTrigger>,
    pub preserved_tokens: Vec<String>,
    pub additional_stops: Vec<String>,
    pub chat_format: i32,
    pub parser: Option<String>,
    pub thinking_forced_open: bool,
    pub parse_tool_calls: bool,
}
Result of applying a chat template with tool grammar support.
Fields

prompt: String
Rendered chat prompt.

grammar: Option<String>
Optional grammar generated from tool definitions.

grammar_lazy: bool
Whether to use lazy grammar sampling.

grammar_triggers: Vec<GrammarTrigger>
Lazy grammar triggers derived from the template.

preserved_tokens: Vec<String>
Tokens that should be preserved for sampling.

additional_stops: Vec<String>
Additional stop sequences added by the template.

chat_format: i32
Chat format used for parsing responses.

parser: Option<String>
Optional serialized PEG parser for tool-call parsing.

thinking_forced_open: bool
Whether the parser expects a forced-open thinking block.

parse_tool_calls: bool
Whether tool calls should be parsed from the response.
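To illustrate how grammar_lazy and grammar_triggers fit together: a lazy grammar stays inactive until one of the trigger strings appears in the generated text, at which point grammar-constrained sampling switches on. The sketch below is an assumption-laden illustration of that idea only; the Trigger type and substring-matching rule are stand-ins, not this crate's GrammarTrigger or its actual matching logic.

```rust
// Hypothetical sketch: lazy grammar activation. The grammar is kept
// inactive until one of the trigger strings appears in the generated
// text. `Trigger` is a stand-in type, not this crate's GrammarTrigger.
struct Trigger {
    word: String,
}

/// Returns the byte offset at which the grammar should activate,
/// or None if no trigger has fired yet.
fn grammar_activation_offset(generated: &str, triggers: &[Trigger]) -> Option<usize> {
    triggers
        .iter()
        .filter_map(|t| generated.find(&t.word))
        .min()
}

fn main() {
    let triggers = vec![Trigger { word: "<tool_call>".to_string() }];
    // Free-form text: grammar stays off.
    assert_eq!(grammar_activation_offset("Let me think...", &triggers), None);
    // Trigger appears: grammar turns on at its byte offset.
    assert_eq!(
        grammar_activation_offset("Sure. <tool_call>{\"name\":", &triggers),
        Some(6)
    );
    println!("ok");
}
```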
Implementations

impl ChatTemplateResult
pub fn parse_response_oaicompat(
    &self,
    text: &str,
    is_partial: bool,
) -> Result<String, ChatParseError>
Parse a generated response into an OpenAI-compatible message JSON string.
Errors
Returns an error if the FFI call fails or the result is null.
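The exact JSON emitted by parse_response_oaicompat is crate-specific, but an OpenAI-compatible assistant message conventionally carries role, content, and tool_calls fields. The hand-built example below only shows that conventional layout; it is not the crate's output.

```rust
// Illustration only: the general shape of an OpenAI-compatible
// assistant message as a JSON string. The actual fields produced by
// parse_response_oaicompat are crate-specific; this hand-built string
// just shows the conventional role/content/tool_calls layout.
fn example_message_json(content: &str) -> String {
    format!(
        "{{\"role\":\"assistant\",\"content\":\"{}\",\"tool_calls\":[]}}",
        content
    )
}

fn main() {
    let msg = example_message_json("Hi there");
    assert!(msg.contains("\"role\":\"assistant\""));
    assert!(msg.contains("\"tool_calls\""));
    println!("{msg}");
}
```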
pub fn streaming_state_oaicompat(
    &self,
) -> Result<ChatParseStateOaicompat, ChatParseError>
Initialize a streaming parser for OpenAI-compatible chat deltas.
Errors
Returns an error if the parser state cannot be initialized.
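Streaming parse state is typically consumed incrementally: on each update the caller supplies the text generated so far, and the state surfaces only the portion it has not yet emitted as a delta. The sketch below shows that general pattern with stand-in names; StreamState and update are illustrative assumptions, not this crate's ChatParseStateOaicompat API.

```rust
// Hypothetical sketch of the incremental-delta pattern used by
// streaming chat parsers: feed the full accumulated text on every
// update and get back only the not-yet-emitted suffix. The type and
// method names are stand-ins for illustration.
struct StreamState {
    emitted: usize, // bytes already surfaced to the client
}

impl StreamState {
    fn new() -> Self {
        StreamState { emitted: 0 }
    }

    /// Feed the full text generated so far; returns the new delta.
    fn update(&mut self, accumulated: &str) -> String {
        let delta = accumulated[self.emitted..].to_string();
        self.emitted = accumulated.len();
        delta
    }
}

fn main() {
    let mut state = StreamState::new();
    assert_eq!(state.update("Hel"), "Hel");
    assert_eq!(state.update("Hello, wor"), "lo, wor");
    assert_eq!(state.update("Hello, world"), "ld");
    println!("ok");
}
```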
impl ChatTemplateResult

pub fn build_grammar_sampler(
    &self,
    model: &LlamaModel,
) -> Result<(Option<LlamaSampler>, HashSet<LlamaToken>), GrammarSamplerError>
Builds a grammar sampler from this template result’s grammar and trigger configuration.
Returns None if no grammar is present. The returned HashSet contains preserved
token IDs that should be decoded with special token handling.
Errors
Returns an error if trigger processing or grammar sampler initialization fails.
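The preserved-token set returned alongside the sampler is typically consulted at detokenization time: IDs in the set are decoded with special-token rendering enabled so trigger markers (such as a tool-call tag) survive in the output. A minimal sketch of that decision, with stand-in types (LlamaToken is modeled as a plain u32 and the vocabulary is faked inline; neither reflects this crate's real detokenizer):

```rust
use std::collections::HashSet;

// Hypothetical sketch: choosing how to decode each token based on a
// preserved-token set. IDs in the set get special-token rendering so
// grammar trigger markers (e.g. "<tool_call>") are not dropped.
type TokenId = u32; // stand-in for a real token handle

fn decode_piece(id: TokenId, preserved: &HashSet<TokenId>) -> String {
    // Fake two-way vocabulary: a real binding would call the model's
    // detokenizer with or without special-token handling instead.
    let (plain, special) = match id {
        1 => ("", "<tool_call>"),
        2 => ("hello", "hello"),
        _ => ("?", "?"),
    };
    if preserved.contains(&id) {
        special.to_string()
    } else {
        plain.to_string()
    }
}

fn main() {
    let preserved: HashSet<TokenId> = [1].into_iter().collect();
    // Preserved special token keeps its marker text.
    assert_eq!(decode_piece(1, &preserved), "<tool_call>");
    // Ordinary token decodes normally.
    assert_eq!(decode_piece(2, &preserved), "hello");
    println!("ok");
}
```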
Trait Implementations

impl Clone for ChatTemplateResult

fn clone(&self) -> ChatTemplateResult

fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.