
Struct Tokenizer

pub struct Tokenizer { /* private fields */ }

High-level tokenizer. Holds a compiled dictionary and segmentation options.

§Example

use kham_core::Tokenizer;

let tok = Tokenizer::new();
let tokens = tok.segment("กินข้าวกับปลา");
assert!(!tokens.is_empty());

Implementations§

impl Tokenizer

pub fn new() -> Self

Create a tokenizer with the built-in dictionary and TNC frequency table.

pub fn normalize(&self, text: &str) -> String

Normalize Thai text into canonical form.

This is a convenience wrapper around normalizer::normalize. Because segment is zero-copy, normalization must happen before segmentation. The caller owns the returned alloc::string::String and can then borrow it for segment:

use kham_core::Tokenizer;

let tok = Tokenizer::new();
// Input with Sara Am decomposed into Nikhahit (U+0E4D) + Sara Aa (U+0E32)
let raw = "\u{0E01}\u{0E34}\u{0E19}\u{0E19}\u{0E49}\u{0E4D}\u{0E32}"; // กิน + น + ้ + ◌ํ + ◌า
let normalized = tok.normalize(raw); // Sara Am composed: "กินน้ำ"
let tokens = tok.segment(&normalized); // tokens borrow `normalized`
assert!(!tokens.is_empty());

pub fn builder() -> TokenizerBuilder

Return a TokenizerBuilder for custom configuration.

§Example
use kham_core::Tokenizer;

// Use built-in dict (no extra words needed here)
let tok = Tokenizer::builder().build();
let tokens = tok.segment("สวัสดีชาวโลก");
assert!(!tokens.is_empty());

pub fn segment<'t>(&self, text: &'t str) -> Vec<Token<'t>>

Segment text into tokens.

Returns a Vec<Token<'_>> where every token’s text is a zero-copy sub-slice of text.

Non-Thai spans (Latin, Number, Whitespace, Emoji, Punctuation) pass through unchanged. Thai spans are segmented with the newmm DAG algorithm constrained to TCC boundaries.

§Example
use kham_core::{Tokenizer, TokenKind};

let tok = Tokenizer::new();
// Mixed Thai + number + Thai
let tokens = tok.segment("ธนาคาร100แห่ง");
assert_eq!(tokens[1].text, "100");
assert_eq!(tokens[1].kind, TokenKind::Number);

Trait Implementations§

impl Default for Tokenizer

fn default() -> Self

Returns the “default value” for a type.
