saku 0.1.5

A simple yet efficient rule-based Japanese Sentence Tokenizer.

Saku: Japanese Sentence Tokenizer

Saku is a library, written in Rust, for splitting Japanese text into sentences using hand-crafted rules.
"割く (saku)" means "to split something" in Japanese.
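To illustrate the general idea of rule-based sentence splitting (this is only a minimal sketch, not saku's actual API or rule set), one simple rule is to break on common Japanese sentence terminators such as 。, ！, and ？, keeping the terminator attached to its sentence:

```rust
// Illustrative sketch of rule-based splitting; saku's real rules are more involved.
fn split_sentences(text: &str) -> Vec<String> {
    // Common Japanese sentence-ending punctuation.
    let terminators = ['。', '！', '？'];
    let mut sentences = Vec::new();
    let mut current = String::new();
    for ch in text.chars() {
        current.push(ch);
        // End the current sentence when a terminator is seen.
        if terminators.contains(&ch) {
            sentences.push(current.trim().to_string());
            current.clear();
        }
    }
    // Keep any trailing text that lacks a terminator.
    if !current.trim().is_empty() {
        sentences.push(current.trim().to_string());
    }
    sentences
}

fn main() {
    let text = "今日は晴れです。散歩に行きましょう！";
    for s in split_sentences(text) {
        println!("{}", s);
    }
}
```

A real tokenizer must also handle cases this sketch ignores, such as terminators inside quotation brackets (「…」) and abbreviations, which is where hand-crafted rules come in.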

This library is named after the Japanese VTuber Saku Sasaki / 笹木咲.

This is the repository for the original Rust implementation.