| Crates.io | bufjson |
| lib.rs | bufjson |
| version | 0.3.0 |
| created_at | 2025-08-18 14:36:25.029466+00 |
| updated_at | 2025-09-22 04:39:16.637549+00 |
| description | No frills, low-alloc, low-copy JSON lexer/parser for fast stream-oriented parsing |
| homepage | |
| repository | https://github.com/vcschapp/bufjson |
| max_upload_size | |
| id | 1800562 |
| size | 245,662 |
# bufjson

A low-level, low-allocation, low-copy JSON tokenizer and parser geared toward efficient stream processing at scale.
Add `bufjson` to your `Cargo.toml` or run `cargo add bufjson`.
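If you prefer editing `Cargo.toml` by hand, the dependency entry looks like this (version taken from the metadata above):

```toml
[dependencies]
bufjson = "0.3"
```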
Here's a simple example that checks a JSON text for syntactic validity and prints it with the insignificant whitespace stripped out.
```rust
use bufjson::{lexical::{Token, fixed::FixedAnalyzer}, syntax::Parser};

fn strip_whitespace(json_text: &str) {
    let mut parser = Parser::new(FixedAnalyzer::new(json_text.as_bytes()));

    loop {
        match parser.next_non_white() {
            Token::Eof => break,
            Token::Err => panic!("{}", parser.err()),
            _ => print!("{}", parser.content().literal()),
        };
    }
}

fn main() {
    // Prints `{"foo":"bar","baz":[null,123]}`
    strip_whitespace(r#"{ "foo": "bar", "baz": [null, 123] }"#);
}
```
The bufjson crate provides a stream-oriented JSON tokenizer through the `lexical::Analyzer` trait, with these implementations:

- `FixedAnalyzer` tokenizes fixed-size buffers;
- `ReadAnalyzer` tokenizes sync input streams implementing `io::Read` (COMING SOON-ISH); and
- `AsyncAnalyzer` tokenizes async streams that yield byte buffers (COMING SOON-ISH).

The remainder of the library builds on the lexical analyzer trait.
- The `syntax` module provides concrete stream-oriented parser types that can wrap any lexical analyzer.
- The `path` module, which will support stream-oriented evaluation of a limited subset of JSONPath, is planned. (COMING SOON-ISH)

Refer to the API reference docs for more detail.
Choose bufjson when you need to:

Other libraries are more suitable for:

- … (`serde_json` or `simd-json`).
- … (`serde_json`).
- … (`serde_json` or `json-writer`).

Benchmarks coming soon-ish.