![Tusk Logo](https://user-images.githubusercontent.com/41837763/119643248-71553e00-be13-11eb-8749-0b0f0846e22e.png)

# Lexer

The lexical analysis component of Tusk.

## About

This crate provides the `Lexer` and `Token` implementations used in Tusk. It takes a `&str` of input and streams `Token` instances on demand.

## Usage

To use the crate, first add it to your `Cargo.toml`:

```toml
[dependencies]
tusk_lexer = "0.2.*"
```

To create a new `Lexer`, import the struct and call the `Lexer::new()` method:

```rust
use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");
}
```

To get the next token from the input, use the `Lexer::next()` method:

```rust
use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");

    let maybe_some_token = lexer.next();
}
```

This method returns a `Token`, a struct with three fields:

```rust
struct Token<'a> {
    pub kind: TokenType,
    pub slice: &'a str,
    pub range: TextRange,
}
```

## Contributing

For more information, please read the [CONTRIBUTING](CONTRIBUTING.md) document.

## License

This repository is distributed under the MIT license. For more information, please read the [LICENSE](LICENSE.md) document.
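
## Example

As a fuller sketch, the snippet below lexes a short input and prints each field of the first token. It assumes that `Lexer::next()` returns a `Result` wrapping the `Token` (swap `Ok` for `Some` if it returns an `Option` in your version) and that `TokenType` and `TextRange` implement `Debug`; check the crate documentation for the exact signatures.

```rust
use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");

    // Assumption: `next()` yields a `Result`-like value; use `Some(token)`
    // instead of `Ok(token)` if the method returns an `Option`.
    if let Ok(token) = lexer.next() {
        // Each token carries its kind, the matched slice of source text,
        // and the range that slice covers in the original input.
        println!("kind  = {:?}", token.kind);
        println!("slice = {:?}", token.slice);
        println!("range = {:?}", token.range);
    }
}
```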