chinese_segmenter 1.0.0

Tokenize Chinese sentences using a dictionary-driven, largest-first (greedy longest-match) approach.
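
Below is a minimal sketch of how dictionary-driven largest-first matching works in principle: at each position, the longest dictionary word that matches the upcoming characters is taken as a token. The `segment` function, the toy dictionary, and the `max_len` parameter are illustrative assumptions for this sketch, not the crate's actual API.

```rust
use std::collections::HashSet;

/// Greedy largest-first (maximum matching) segmentation:
/// at each position, emit the longest dictionary word that matches
/// the upcoming characters; fall back to a single character when
/// nothing in the dictionary matches.
fn segment(sentence: &str, dict: &HashSet<String>, max_len: usize) -> Vec<String> {
    let chars: Vec<char> = sentence.chars().collect();
    let mut tokens = Vec::new();
    let mut i = 0;

    while i < chars.len() {
        let upper = (chars.len() - i).min(max_len);
        // Try the longest candidate first, shrinking until a match is found.
        let mut matched = 1;
        for len in (1..=upper).rev() {
            let candidate: String = chars[i..i + len].iter().collect();
            if dict.contains(&candidate) {
                matched = len;
                break;
            }
        }
        tokens.push(chars[i..i + matched].iter().collect());
        i += matched;
    }
    tokens
}

fn main() {
    // Toy dictionary for illustration only; a real dictionary is much larger.
    let dict: HashSet<String> = ["中国", "人民", "中国人", "喜欢", "喝茶"]
        .iter()
        .map(|s| s.to_string())
        .collect();

    let tokens = segment("中国人喜欢喝茶", &dict, 4);
    println!("{:?}", tokens); // ["中国人", "喜欢", "喝茶"]
}
```

Note how the greedy rule prefers the longer entry 中国人 over 中国 at the start of the sentence; that preference for the longest available match is what "largest first" refers to.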