| Crates.io | better_peekable |
| lib.rs | better_peekable |
| version | 1.0.0 |
| created_at | 2022-04-18 17:21:47.458957+00 |
| updated_at | 2025-10-28 12:49:31.330304+00 |
| description | Peekable iterator like std::iter::Peekable but allows for peeking n items ahead instead of just one. |
| homepage | |
| repository | https://github.com/weezy20/better_peekable |
| max_upload_size | |
| id | 569936 |
| size | 34,598 |
A no_std compatible iterator adapter that lets you peek multiple items ahead - perfect for building lexers, parsers, and other lookahead-heavy code.
Unlike std::iter::Peekable, which only lets you peek one item ahead, BetterPeekable provides peek_n(n) to look arbitrarily far into the iterator without consuming items. It also provides peek(), which is equivalent to peek_n(0).
cargo add better_peekable
For no_std environments:
cargo add better_peekable --no-default-features --features alloc
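Inside a no_std crate the adapter is used the same way. Below is a minimal, hypothetical sketch of a library function in a no_std crate (the function name and logic are illustrative, not part of the crate), assuming the alloc feature shown above:

#![no_std]
extern crate alloc;

use alloc::vec::Vec;
use better_peekable::BetterPeekable;

// Collect the leading run of ASCII digits without consuming anything past them.
pub fn leading_digits(src: &str) -> Vec<char> {
    let mut chars = src.chars().better_peekable();
    let mut digits = Vec::new();
    while let Some(&c) = chars.peek() {
        if c.is_ascii_digit() {
            digits.push(c);
            chars.next();
        } else {
            break;
        }
    }
    digits
}

With std, basic usage looks like this: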
use better_peekable::BetterPeekable;

fn main() {
    let mut tokens = "if x == 42".chars().better_peekable();

    // Look ahead to parse keywords
    if tokens.peek() == Some(&'i') && tokens.peek_n(1) == Some(&'f') {
        tokens.next(); // consume 'i'
        tokens.next(); // consume 'f'
        println!("Found 'if' keyword");
    }
}
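Peeking further ahead works exactly the same way, and no peek ever advances the underlying iterator. A small sketch with arbitrary values, using only peek_n, peek, and next:

use better_peekable::BetterPeekable;

fn main() {
    let mut iter = [1, 2, 3, 4, 5].into_iter().better_peekable();

    // Look several items ahead without advancing the iterator.
    assert_eq!(iter.peek_n(3), Some(&4));
    assert_eq!(iter.peek_n(0), Some(&1)); // same as peek()

    // The peeks consumed nothing.
    assert_eq!(iter.next(), Some(1));
    assert_eq!(iter.next(), Some(2));
}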
Here's how you might use it to build a simple lexer that needs lookahead:
use better_peekable::BetterPeekable;

#[derive(Debug, PartialEq)]
enum Token {
    Number(i32),
    Arrow,   // ->
    Minus,   // -
    Greater, // >
}

fn tokenize(input: &str) -> Vec<Token> {
    let mut chars = input.chars().better_peekable();
    let mut tokens = Vec::new();

    while let Some(&ch) = chars.peek() {
        match ch {
            '0'..='9' => {
                let mut num = 0;
                while let Some(&digit) = chars.peek() {
                    if digit.is_ascii_digit() {
                        num = num * 10 + (chars.next().unwrap() as i32 - '0' as i32);
                    } else {
                        break;
                    }
                }
                tokens.push(Token::Number(num));
            }
            '-' => {
                chars.next(); // consume '-'
                // Look ahead for arrow operator
                if chars.peek() == Some(&'>') {
                    chars.next(); // consume '>'
                    tokens.push(Token::Arrow);
                } else {
                    tokens.push(Token::Minus);
                }
            }
            '>' => {
                chars.next();
                tokens.push(Token::Greater);
            }
            ' ' => { chars.next(); } // skip whitespace
            _ => { chars.next(); }   // skip unknown chars
        }
    }

    tokens
}

fn main() {
    let tokens = tokenize("42 -> 7 - 3");
    println!("{:?}", tokens);
    // Output: [Number(42), Arrow, Number(7), Minus, Number(3)]
}
For a complete lexer/parser example with advanced lookahead patterns, see examples/lexer.rs.
Run it with:
cargo run --example lexer
- peek_n(n) lets you look n items ahead
- peek calls don't affect the iterator state (see the sketch below)
- Supports DoubleEndedIterator, ExactSizeIterator, etc.
- no_std compatible: Works in embedded and WASM environments with alloc
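Because peeks never consume anything, lookahead mixes freely with ordinary iteration. A short sketch with arbitrary input, using only the calls shown above plus standard Iterator methods:

use better_peekable::BetterPeekable;

fn main() {
    let mut words = ["fn", "main", "(", ")"].into_iter().better_peekable();

    // Peek arbitrarily far ahead...
    assert_eq!(words.peek_n(2), Some(&"("));

    // ...then iterate as usual: every item is still there.
    let all: Vec<&str> = words.collect();
    assert_eq!(all, ["fn", "main", "(", ")"]);
}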