Crates.io | tinytoken |
lib.rs | tinytoken |
version | 0.1.2 |
source | src |
created_at | 2024-11-09 14:56:36.429111 |
updated_at | 2024-11-09 15:22:46.995286 |
description | Library for tokenizing text into words, numbers, symbols, and more, with customizable parsing options. |
homepage | https://github.com/luxluth/tinytoken#readme |
repository | https://github.com/luxluth/tinytoken |
id | 1442134 |
size | 27,094 |
This library provides a tokenizer that parses text into categorized tokens such as words, numbers, strings, characters, symbols, and operators. Its configurable options give fine-grained control over tokenization rules and formats.
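To use the library, add it as a dependency in your Cargo.toml (version taken from the metadata above):

[dependencies]
tinytoken = "0.1.2"

The example below then builds a tokenizer with a few custom options and prints every token it finds in a short input string: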
use tinytoken::{Tokenizer, TokenizerBuilder, Choice};

fn main() {
    // Configure the tokenizer: treat single characters as strings, accept '_'
    // as a digit separator, and register '$' as a symbol and '+'/'-' as operators.
    let tokenizer = TokenizerBuilder::new()
        .parse_char_as_string(true)
        .allow_digit_separator(Choice::Yes('_'))
        .add_symbol('$')
        .add_operators(&['+', '-'])
        .build("let x = 123_456 + 0xFF");

    // Tokenize the input and print each token, or report the error.
    match tokenizer.tokenize() {
        Ok(tokens) => {
            for token in tokens {
                println!("{:?}", token);
            }
        }
        Err(err) => {
            eprintln!("Tokenization error: {err}");
        }
    }
}
Feel free to send a PR to improve and/or extend the library's capabilities.