Module tokens

Token counting utilities using BPE tokenization.

This module provides approximate token counting for resource content using the cl100k BPE encoding. cl100k is the encoding used by GPT-4; Claude models use a different tokenizer, so counts for Claude content are approximations rather than exact values.

§Usage

use agpm_cli::tokens;

let content = "Hello, world!";
let count = tokens::count_tokens(content);
println!("Approximate token count: {}", count);

§Performance

The tokenizer is lazily initialized on first use and cached for subsequent calls. Token counting is O(n) and optimized for high throughput.
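The lazy-initialize-and-cache pattern described above can be sketched with `std::sync::OnceLock`. The `Tokenizer` type here is a hypothetical stand-in for whatever BPE implementation `agpm_cli` actually wraps; only the caching structure is illustrated.

```rust
use std::sync::OnceLock;

// Hypothetical stand-in for the real cl100k tokenizer type
// (assumption: the actual type comes from a BPE crate).
struct Tokenizer;

impl Tokenizer {
    fn new() -> Self {
        // Expensive step in a real implementation: loading the
        // cl100k vocabulary and merge rules.
        Tokenizer
    }

    fn count(&self, text: &str) -> usize {
        // Placeholder for real BPE segmentation; a whitespace split
        // stands in here so the sketch runs end to end.
        text.split_whitespace().count()
    }
}

// Shared, lazily-initialized instance: built on first use,
// reused by every subsequent call.
static TOKENIZER: OnceLock<Tokenizer> = OnceLock::new();

pub fn count_tokens(content: &str) -> usize {
    TOKENIZER.get_or_init(Tokenizer::new).count(content)
}

fn main() {
    println!("{}", count_tokens("Hello, world!"));
}
```

The first `count_tokens` call pays the initialization cost; later calls hit the cached instance, which is what keeps per-call counting O(n) in the input length.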

Functions§

count_tokens
Count approximate tokens in content using cl100k encoding.
format_token_count
Format a token count for human-readable display.
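A minimal sketch of what a human-readable formatter like `format_token_count` might do, assuming a common thousands/millions abbreviation scheme; the exact format `agpm_cli` produces may differ.

```rust
// Hypothetical formatting scheme (assumption): abbreviate large
// counts, e.g. 1_500 -> "1.5k", 2_000_000 -> "2.0M".
fn format_token_count(count: usize) -> String {
    if count >= 1_000_000 {
        format!("{:.1}M", count as f64 / 1_000_000.0)
    } else if count >= 1_000 {
        format!("{:.1}k", count as f64 / 1_000.0)
    } else {
        count.to_string()
    }
}

fn main() {
    println!("{}", format_token_count(42));    // small counts pass through
    println!("{}", format_token_count(1_500)); // abbreviated with one decimal
}
```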