chinese_segmenter 1.0.1

Tokenize Chinese sentences using a dictionary-driven, largest-first matching approach.
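The description points at a classic segmentation technique: greedy, dictionary-driven maximum matching, where the tokenizer repeatedly takes the longest dictionary word that matches the front of the remaining text and falls back to a single character when nothing matches. The sketch below illustrates that general idea in Rust; it is not the crate's own implementation, and the `segment` function, the toy dictionary, and the `max_len` parameter are illustrative assumptions.

```rust
use std::collections::HashSet;

/// Greedy largest-first (maximum-match) segmentation over a word set.
/// `max_len` is the length, in chars, of the longest dictionary entry.
fn segment<'a>(text: &'a str, dict: &HashSet<&str>, max_len: usize) -> Vec<&'a str> {
    // Byte offsets of each char, so we can slice on char boundaries.
    let chars: Vec<(usize, char)> = text.char_indices().collect();
    let end_of = |i: usize| if i < chars.len() { chars[i].0 } else { text.len() };

    let mut tokens = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        let mut advance = 1; // default: emit a single character
        // Try the longest candidate first, shrinking until a dictionary hit.
        for len in (1..=max_len.min(chars.len() - i)).rev() {
            let candidate = &text[chars[i].0..end_of(i + len)];
            if dict.contains(candidate) {
                advance = len;
                break;
            }
        }
        tokens.push(&text[chars[i].0..end_of(i + advance)]);
        i += advance;
    }
    tokens
}

fn main() {
    // Toy dictionary for illustration only.
    let dict: HashSet<&str> = ["今天", "晚上", "羊肉"].into_iter().collect();
    let tokens = segment("今天晚上吃羊肉", &dict, 2);
    println!("{:?}", tokens); // ["今天", "晚上", "吃", "羊肉"]
}
```

Largest-first matching is fast and simple, but because it is greedy it can mis-segment sentences where a shorter initial word would have led to a better overall split; that trade-off is inherent to the technique, independent of this crate.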
Feature flags


There is very little structured metadata to build this page from. Check the main library docs, the readme, or Cargo.toml in case the author documented the features there.

This release does not have any feature flags.