tantivy-tokenizer-tiny-segmenter 0.1.0

A Japanese tokenizer for Tantivy, based on TinySegmenter.

See examples/basic.rs for basic usage.
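The sketch below shows the general shape of wiring a custom tokenizer into Tantivy: register it on the index under a name, then point a text field's indexing options at that name. The schema and registration calls are standard Tantivy API; the tokenizer type name `TinySegmenterTokenizer` and its `Default` constructor are assumptions about this crate's interface, so check examples/basic.rs for the actual names.

```rust
use tantivy::schema::{IndexRecordOption, Schema, TextFieldIndexing, TextOptions};
use tantivy::{doc, Index};

fn main() -> tantivy::Result<()> {
    // Index the "body" field with a tokenizer registered as "lang_ja".
    let indexing = TextFieldIndexing::default()
        .set_tokenizer("lang_ja")
        .set_index_option(IndexRecordOption::WithFreqsAndPositions);
    let options = TextOptions::default()
        .set_indexing_options(indexing)
        .set_stored();

    let mut schema_builder = Schema::builder();
    let body = schema_builder.add_text_field("body", options);
    let schema = schema_builder.build();

    let index = Index::create_in_ram(schema);

    // Assumed constructor: the crate may instead expose a free function
    // returning a TextAnalyzer; see examples/basic.rs for the real API.
    index.tokenizers().register(
        "lang_ja",
        tantivy_tokenizer_tiny_segmenter::TinySegmenterTokenizer::default(),
    );

    // Documents added to this field are now segmented by TinySegmenter.
    let mut writer = index.writer(50_000_000)?;
    writer.add_document(doc!(body => "すもももももももものうち"))?;
    writer.commit()?;
    Ok(())
}
```

Registering under a name like "lang_ja" keeps the schema declarative: the field only references the tokenizer name, and any analyzer registered under that name on the index is used at indexing and query time.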