Module chat_template

Chat template support for LLM examples

This module provides Jinja-based chat template rendering compatible with HuggingFace’s tokenizer.apply_chat_template() functionality.

Example

use candle_examples::chat_template::{ChatTemplate, ChatTemplateOptions, Message, Conversation};

// Load template from a model's tokenizer_config.json
let template = ChatTemplate::from_tokenizer_config("path/to/tokenizer_config.json")?;

// Or use a preset for known models
let template = ChatTemplate::chatml(); // SmolLM, Qwen, etc.

// Single-turn
let messages = vec![
    Message::system("You are helpful."),
    Message::user("Hello!"),
];
let prompt = template.apply_for_generation(&messages)?;

// Multi-turn conversation
let mut conv = Conversation::new(template, "You are helpful.");
let prompt = conv.user_turn("Hello!")?;
// ... generate response ...
conv.assistant_response("Hi there!");
let prompt = conv.user_turn("How are you?")?;
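To make the output concrete, here is a standalone sketch (not the crate's implementation) of the ChatML wire format that a preset like `ChatTemplate::chatml()` targets: each message is wrapped in `<|im_start|>role ... <|im_end|>` markers, and an open assistant header is appended when prompting for generation (the equivalent of HuggingFace's `add_generation_prompt=True`).

```rust
// Illustrative only: a hand-rolled ChatML renderer, independent of the
// ChatTemplate type above, to show the prompt shape it produces.
struct Msg {
    role: &'static str,
    content: &'static str,
}

fn render_chatml(messages: &[Msg], add_generation_prompt: bool) -> String {
    let mut out = String::new();
    for m in messages {
        // Each turn: <|im_start|>role\ncontent<|im_end|>\n
        out.push_str(&format!("<|im_start|>{}\n{}<|im_end|>\n", m.role, m.content));
    }
    if add_generation_prompt {
        // Leave the assistant turn open so the model completes it.
        out.push_str("<|im_start|>assistant\n");
    }
    out
}

fn main() {
    let messages = [
        Msg { role: "system", content: "You are helpful." },
        Msg { role: "user", content: "Hello!" },
    ];
    print!("{}", render_chatml(&messages, true));
}
```

Running this prints the system and user turns followed by a dangling `<|im_start|>assistant` header, which is the prompt string the model then continues.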

Structs

ChatTemplate
Chat template renderer using MiniJinja
ChatTemplateOptions
Options for applying a chat template
Conversation
Multi-turn conversation manager
Message
A chat message with role and content
NamedTemplate
TokenConfig
Token configuration loaded from tokenizer_config.json

Enums

ChatTemplateConfig
Chat template can be a single string or multiple named templates
ChatTemplateError
Errors that can occur with chat templates
StringOrToken
Handle both string and object token formats in tokenizer_config.json
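For context on why `StringOrToken` is needed: HuggingFace `tokenizer_config.json` files store special tokens in two shapes, either a bare string or an object carrying the token text in a `content` field alongside flags. A deserializer therefore has to accept both. The field names below reflect the common HuggingFace layout; treat them as illustrative rather than exhaustive.

```json
{
  "bos_token": "<s>",
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "rstrip": false,
    "single_word": false,
    "normalized": false
  }
}
```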