| | |
|---|---|
| Crates.io | regex-lexer |
| lib.rs | regex-lexer |
| version | 0.2.0 |
| source | src |
| created_at | 2020-05-20 20:41:04.719396 |
| updated_at | 2022-08-14 18:14:52.133801 |
| description | A regex-based lexer (tokenizer) |
| homepage | https://github.com/krsnik02/regex-lexer/ |
| repository | https://github.com/krsnik02/regex-lexer/ |
| max_upload_size | |
| id | 243931 |
| size | 27,379 |
A regex-based lexer (tokenizer) in Rust.
```rust
enum Tok {
    Num,
    // ...
}

let lexer = regex_lexer::LexerBuilder::new()
    .token(r"[0-9]+", Tok::Num)
    .ignore(r"\s+") // skip whitespace
    // ...
    .build();

let tokens = lexer.tokens(/* source */);
```
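To illustrate what a lexer like this does under the hood, here is a minimal, dependency-free sketch of the idea: at each position, match the input against the registered patterns, emit a token for a `token` pattern, and silently consume an `ignore` pattern. This is not regex-lexer's actual implementation; the matchers are hand-written functions standing in for compiled regexes, and `match_num`, `match_ws`, and `tokens` are names invented for this example.

```rust
// Hypothetical sketch of regex-based lexing (not regex-lexer's code).
// The two patterns here are disjoint, so we can just try them in order;
// a real regex lexer resolves overlaps by taking the longest match.

#[derive(Debug, PartialEq)]
enum Tok {
    Num,
}

/// Length of the leading digit run in `s`, if any (stands in for `[0-9]+`).
fn match_num(s: &str) -> Option<usize> {
    let len = s.bytes().take_while(|b| b.is_ascii_digit()).count();
    (len > 0).then_some(len)
}

/// Length of the leading whitespace run in `s`, if any (stands in for `\s+`).
fn match_ws(s: &str) -> Option<usize> {
    let len = s
        .chars()
        .take_while(|c| c.is_whitespace())
        .map(|c| c.len_utf8())
        .sum();
    (len > 0).then_some(len)
}

/// Scan `src` left to right, emitting (kind, text) pairs for token patterns
/// and dropping ignored spans.
fn tokens(mut src: &str) -> Vec<(Tok, String)> {
    let mut out = Vec::new();
    while !src.is_empty() {
        if let Some(len) = match_ws(src) {
            src = &src[len..]; // ignored pattern: consume, emit nothing
        } else if let Some(len) = match_num(src) {
            out.push((Tok::Num, src[..len].to_string()));
            src = &src[len..];
        } else {
            break; // nothing matched: stop (a real lexer would report an error)
        }
    }
    out
}

fn main() {
    let toks = tokens("12 345 6");
    println!("{:?}", toks);
}
```

Running this on `"12 345 6"` yields three `Tok::Num` tokens with the whitespace skipped, mirroring what the builder-configured lexer above produces.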
Licensed under either of

* Apache License, Version 2.0
* MIT license

at your option.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.