segmenter

v0.1.0

About

Segment Chinese sentences into component words using a dictionary-driven, largest-first matching approach.
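
In largest-first (maximum) matching, the segmenter starts at the beginning of the sentence, takes the longest dictionary word that matches the upcoming characters, emits it as a token, and repeats from the position just after it. The sketch below illustrates the general technique only; it is not the crate's internal implementation, and the dictionary, word-length limit, and function names are made up for the example.

use std::collections::HashSet;

// Greedy largest-first matching: at each position, take the longest
// dictionary word that matches, then continue after it.
fn greedy_segment(text: &str, dict: &HashSet<&str>, max_word_chars: usize) -> Vec<String> {
    let chars: Vec<char> = text.chars().collect();
    let mut tokens = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        let upper = (chars.len() - i).min(max_word_chars);
        // Try the longest candidate first and shrink until a dictionary hit.
        let hit = (1..=upper).rev().find_map(|len| {
            let candidate: String = chars[i..i + len].iter().collect();
            if dict.contains(candidate.as_str()) {
                Some((candidate, len))
            } else {
                None
            }
        });
        match hit {
            Some((word, len)) => {
                tokens.push(word);
                i += len;
            }
            // Characters with no dictionary entry (e.g. punctuation) are skipped.
            None => i += 1,
        }
    }
    tokens
}

fn main() {
    // Toy dictionary for illustration; the crate ships its own dictionary data.
    let dict = HashSet::from(["今天", "晚上", "想", "吃", "羊肉", "吗"]);
    let tokens = greedy_segment("今天晚上想吃羊肉吗?", &dict, 4);
    println!("{:?}", tokens); // --> ["今天", "晚上", "想", "吃", "羊肉", "吗"]
}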

Usage

extern crate chinese_segmenter;

use chinese_segmenter::ChineseSegmenter;

fn main() {
    let segmenter = ChineseSegmenter::new();

    let sentence: String = String::from("今天晚上想吃羊肉吗?");
    let result: Vec<String> = segmenter.tokenize(sentence);
    println!("{:?}", result); // --> ["今天", "晚上", "想", "吃", "羊肉", "吗"]
}

Contributors

License

MIT