chinese_segmenter 0.1.1

Tokenize Chinese sentences using a dictionary-driven, longest-match-first (forward maximum matching) approach.
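The longest-match-first strategy can be sketched as follows. This is an illustrative implementation of the general technique, not the crate's actual API; the dictionary words and the `tokenize` signature here are assumptions for the example:

```rust
use std::collections::HashSet;

// Forward maximum matching: at each position, take the longest dictionary
// word that matches; fall back to a single character when nothing matches.
fn tokenize(sentence: &str, dict: &HashSet<&str>, max_word_len: usize) -> Vec<String> {
    let chars: Vec<char> = sentence.chars().collect();
    let mut tokens = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        let end = (i + max_word_len).min(chars.len());
        let mut matched = None;
        // Try the longest candidate first, shrinking until a match is found.
        for j in (i + 1..=end).rev() {
            let candidate: String = chars[i..j].iter().collect();
            if dict.contains(candidate.as_str()) {
                matched = Some((candidate, j));
                break;
            }
        }
        match matched {
            Some((word, j)) => {
                tokens.push(word);
                i = j;
            }
            None => {
                tokens.push(chars[i].to_string());
                i += 1;
            }
        }
    }
    tokens
}

fn main() {
    // Toy dictionary for demonstration only.
    let dict: HashSet<&str> = ["我们", "喜欢", "吃", "苹果"].into_iter().collect();
    let tokens = tokenize("我们喜欢吃苹果", &dict, 4);
    println!("{:?}", tokens); // ["我们", "喜欢", "吃", "苹果"]
}
```

Because matching is greedy, the result depends on dictionary coverage: an ambiguous span is always segmented as the longest word available at the current position.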