saku 0.1.6

A simple yet efficient rule-based Japanese Sentence Tokenizer.
# Saku: Japanese Sentence Tokenizer

**Saku** is a Rust library for splitting Japanese text into sentences using hand-crafted rules. \
**"割く (saku)"** means "splitting something" in Japanese.
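
A minimal sketch of what rule-based Japanese sentence splitting can look like; this is an illustrative example only, not saku's actual API. It splits on the common Japanese sentence terminators (。, ！, ？), keeping each terminator attached to its sentence:

```rust
// Illustrative sketch: splits text on Japanese sentence-ending
// punctuation, keeping the terminator with the preceding sentence.
// A real rule-based tokenizer like saku handles many more cases
// (quotes, brackets, abbreviations, etc.).
fn split_sentences(text: &str) -> Vec<String> {
    let mut sentences = Vec::new();
    let mut current = String::new();
    for ch in text.chars() {
        current.push(ch);
        // Flush the buffer when a sentence terminator is seen.
        if matches!(ch, '。' | '！' | '？') {
            sentences.push(current.trim().to_string());
            current.clear();
        }
    }
    // Keep any trailing text that lacks a terminator.
    if !current.trim().is_empty() {
        sentences.push(current.trim().to_string());
    }
    sentences
}

fn main() {
    let text = "これはペンです。あれは何ですか？";
    for sentence in split_sentences(text) {
        println!("{}", sentence);
    }
}
```

Running this prints each sentence on its own line. Hand-made rules extend this core idea with exceptions, e.g. not splitting inside quoted speech such as 「…。」.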


This library is named after a Japanese VTuber [Saku Sasaki / 笹木咲](https://www.youtube.com/channel/UCoztvTULBYd3WmStqYeoHcA).


This is the repository for the original Rust implementation.