# Saku: Japanese Sentence Tokenizer

**Saku** is a library, written in Rust, for splitting Japanese text into sentences based on hand-made rules.

**"割く (saku)"** means "to split something" in Japanese. The library is named after the Japanese VTuber [Saku Sasaki / 笹木咲](https://www.youtube.com/channel/UCoztvTULBYd3WmStqYeoHcA).

This is the repository for the original Rust implementation.
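As a rough illustration of what rule-based sentence splitting involves, the sketch below splits text on Japanese sentence-ending punctuation while keeping the terminator attached to its sentence. This is not Saku's actual API; the function name `split_sentences` and the terminator set are assumptions made for illustration only.

```rust
/// Hypothetical sketch of rule-based Japanese sentence splitting
/// (not Saku's API): split on sentence-ending punctuation,
/// keeping the terminator with the sentence it ends.
fn split_sentences(text: &str) -> Vec<String> {
    // Assumed minimal set of sentence terminators for illustration.
    let terminators = ['。', '！', '？'];
    let mut sentences = Vec::new();
    let mut current = String::new();
    for ch in text.chars() {
        current.push(ch);
        if terminators.contains(&ch) {
            sentences.push(current.trim().to_string());
            current.clear();
        }
    }
    // Keep any trailing text that lacks a terminator.
    if !current.trim().is_empty() {
        sentences.push(current.trim().to_string());
    }
    sentences
}

fn main() {
    let text = "今日はいい天気ですね。散歩に行きましょう！";
    for sentence in split_sentences(text) {
        println!("{}", sentence);
    }
}
```

A real rule set has to handle more than punctuation (quotes, brackets, line breaks, and so on), which is where hand-made rules like Saku's come in.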