chinese_segmenter 1.0.1

Tokenize Chinese sentences using a dictionary-driven, largest-first (greedy longest-match) approach.
.DEFAULT_GOAL := build

build:
	cargo build

test:
	cargo test

format:
	cargo fmt