chinese_segmenter

Crates.io: chinese_segmenter
lib.rs: chinese_segmenter
version: 1.0.1
source: src
created_at: 2020-05-05 17:58:48.291745
updated_at: 2022-08-02 06:08:19.395019
description: Tokenize Chinese sentences using a dictionary-driven largest first matching approach.
homepage:
repository: https://github.com/sotch-pr35mac/chinese_segmenter
max_upload_size:
id: 237818
size: 8,133
Preston Wang-Stosur-Bassett (sotch-pr35mac)

documentation

README

segmenter

v1.0.0

About

Segment Chinese sentences into component words using a dictionary-driven largest first matching approach.
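To illustrate the idea behind largest-first matching, here is a minimal, self-contained sketch in Rust. It is not the crate's implementation: the dictionary, the `segment_greedy` function, and the `max_len` parameter are all illustrative assumptions. The scanner tries the longest candidate substring at each position first, shrinking until a dictionary hit, and falls back to a single character when nothing matches.

```rust
use std::collections::HashSet;

// Hypothetical sketch of largest-first (greedy longest-match) segmentation.
// Names and dictionary contents are illustrative, not the crate's API.
fn segment_greedy<'a>(text: &'a str, dict: &HashSet<&str>, max_len: usize) -> Vec<&'a str> {
    let chars: Vec<(usize, char)> = text.char_indices().collect();
    let mut result = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        let mut matched = 1; // fall back to a single character
        // Try the longest candidate first, shrinking until a dictionary hit.
        for len in (1..=max_len.min(chars.len() - i)).rev() {
            let start = chars[i].0;
            let end = if i + len < chars.len() { chars[i + len].0 } else { text.len() };
            if dict.contains(&text[start..end]) {
                matched = len;
                break;
            }
        }
        let start = chars[i].0;
        let end = if i + matched < chars.len() { chars[i + matched].0 } else { text.len() };
        result.push(&text[start..end]);
        i += matched;
    }
    result
}

fn main() {
    // Toy dictionary; a real segmenter would load a full lexicon such as CC-CEDICT.
    let dict: HashSet<&str> = ["今天", "晚上", "羊肉"].into_iter().collect();
    let tokens = segment_greedy("今天晚上吃羊肉", &dict, 4);
    println!("{:?}", tokens); // ["今天", "晚上", "吃", "羊肉"]
}
```

Because matching is greedy from the left, the result depends only on the dictionary and the maximum word length tried, which keeps the algorithm linear in practice but means it cannot backtrack out of an early overlong match.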

Usage

extern crate chinese_segmenter;

use chinese_segmenter::{initialize, tokenize};

fn main() {
    let sentence = "今天晚上想吃羊肉吗?";
    initialize(); // Optional initialization to load dictionary data
    let result: Vec<&str> = tokenize(sentence);
    println!("{:?}", result); // --> ["今天", "晚上", "想", "吃", "羊肉", "吗"]
}

Contributors

License

MIT

Commit count: 13
