| Crates.io | token |
| lib.rs | token |
| version | 1.0.0-rc1 |
| created_at | 2015-02-21 23:25:36.390756+00 |
| updated_at | 2015-12-11 23:59:11.698689+00 |
| description | A simple string tokenizer (and sentence splitter). Note: if you would like to use the name for something more appropriate, please just send me a mail at jaln at itu dot dk |
| homepage | |
| repository | https://github.com/Machtan/token-rs |
| max_upload_size | |
| id | 1446 |
| size | 12,553 |
This is a small package containing a simple string tokenizer for the Rust programming language. The package also contains a simple sentence-splitting iterator.
(The sentence splitter might be moved once I find out where I want it.)
Documentation
https://machtan.github.io/token-rs/token
Add the following to your Cargo.toml file:
[dependencies.token]
git = "https://github.com/machtan/token-rs"
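The crate is also published on crates.io as version 1.0.0-rc1 (see the metadata above), so depending on the release instead of the git repository should work as well. This is a sketch of the equivalent section, assuming you want that exact pre-release version:
[dependencies]
token = "1.0.0-rc1"
The usage example below works the same either way.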
extern crate token;

fn main() {
    // Tokens are split on any of these whitespace separators.
    let separators = vec![' ', '\n', '\t', '\r'];
    let source: &str = " Hello world \n How do you do\t-Finally I hope";
    // The tokenizer is built from a byte reader and the separator list.
    let tokenizer = token::Tokenizer::new(source.as_bytes(), separators);
    println!("Tokenizing...");
    for token in tokenizer {
        // Each item is a Result, so unwrap it before printing.
        println!("- Got token: {}", token.unwrap());
    }
    println!("Done!");
}
License
MIT (do what you want with it)