| Crates.io | synkit |
| lib.rs | synkit |
| version | 0.0.2 |
| created_at | 2026-01-02 11:42:00.713736+00 |
| updated_at | 2026-01-02 17:59:11.250532+00 |
| description | A fast, syn-like incremental parser framework for Rust |
| homepage | https://joshua-auchincloss.github.io/synkit |
| repository | https://github.com/joshua-auchincloss/synkit |
| max_upload_size | |
| id | 2018414 |
| size | 77,515 |
Generate syn-like parsing infrastructure from token definitions. Built on logos.
Define tokens once and get a lexer, typed token structs, whitespace-skipping streams, Parse/Peek/ToTokens traits, span tracking, and round-trip formatting.
| Use Case | synkit | Alternative |
|---|---|---|
| Custom DSL with formatting | Yes | - |
| Config file parser | Yes | serde + format crate |
| Code transformation | Yes | - |
| Rust source parsing | No | syn |
| Simple pattern matching | No | logos alone |
```toml
[dependencies]
synkit = "0.1"
logos = "0.16"
thiserror = "2"
```
Features: tokio, futures, serde, std (default).
```rust
synkit::parser_kit! {
    error: MyError,
    skip_tokens: [Space],
    tokens: {
        #[token(" ")]
        Space,
        #[token("=")]
        Eq,
        #[regex(r"[a-z]+", |lex| lex.slice().to_string())]
        Ident(String),
        #[regex(r"[0-9]+", |lex| lex.slice().parse().ok())]
        Number(i64),
    },
    delimiters: {},
    span_derives: [Debug, Clone, PartialEq],
    token_derives: [Debug, Clone, PartialEq],
}
```
```rust
// Generated: Token enum, EqToken/IdentToken/NumberToken structs,
// TokenStream, Tok![] macro, Parse/Peek/ToTokens/Diagnostic traits
let mut stream = TokenStream::lex("x = 42")?;
let name: Spanned<IdentToken> = stream.parse()?;
let eq: Spanned<EqToken> = stream.parse()?;
let value: Spanned<NumberToken> = stream.parse()?;
```
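To make the generated Parse/Peek pattern concrete, here is a hand-rolled, self-contained sketch of the same `name = value` parse. The `Cursor` type and its methods are illustrative stand-ins, not synkit's actual generated API:

```rust
#[derive(Debug, Clone, PartialEq)]
enum Token {
    Ident(String),
    Eq,
    Number(i64),
}

// Minimal cursor standing in for the generated TokenStream.
struct Cursor {
    tokens: Vec<Token>,
    pos: usize,
}

impl Cursor {
    fn new(tokens: Vec<Token>) -> Self {
        Cursor { tokens, pos: 0 }
    }

    // Look at the next token without consuming it (the Peek role).
    fn peek(&self) -> Option<&Token> {
        self.tokens.get(self.pos)
    }

    // Consume the next token (the Parse role for a single token struct).
    fn next(&mut self) -> Option<Token> {
        let t = self.tokens.get(self.pos).cloned();
        if t.is_some() {
            self.pos += 1;
        }
        t
    }
}

// Parse `name = value`, mirroring the three stream.parse() calls above.
fn parse_assignment(cur: &mut Cursor) -> Result<(String, i64), String> {
    let name = match cur.next() {
        Some(Token::Ident(s)) => s,
        other => return Err(format!("expected ident, got {:?}", other)),
    };
    match cur.next() {
        Some(Token::Eq) => {}
        other => return Err(format!("expected `=`, got {:?}", other)),
    }
    let value = match cur.next() {
        Some(Token::Number(n)) => n,
        other => return Err(format!("expected number, got {:?}", other)),
    };
    Ok((name, value))
}

fn main() {
    let mut cur = Cursor::new(vec![
        Token::Ident("x".into()),
        Token::Eq,
        Token::Number(42),
    ]);
    // Peek before committing, as the generated Peek trait allows.
    assert!(matches!(cur.peek(), Some(Token::Ident(_))));
    let (name, value) = parse_assignment(&mut cur).unwrap();
    assert_eq!(name, "x");
    assert_eq!(value, 42);
}
```

The real generated code adds spans, error diagnostics, and whitespace skipping on top of this shape.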
| Module | Contents |
|---|---|
| `tokens` | `Token` enum, `*Token` structs, `Tok![]` macro |
| `stream` | `TokenStream` with fork/rewind, whitespace skipping |
| `span` | `Span`, `Spanned<T>` wrappers |
| `traits` | `Parse`, `Peek`, `ToTokens`, `Diagnostic` |
| `printer` | Round-trip formatting |
| `delimiters` | `Bracket`, `Brace`, `Paren` extractors |
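The fork/rewind capability mentioned for `stream` enables speculative parsing with backtracking. A minimal, hypothetical sketch of the pattern (method names here are illustrative, not synkit's real API):

```rust
// Toy stream: fork saves a position, rewind restores it.
struct Stream {
    tokens: Vec<char>,
    pos: usize,
}

impl Stream {
    // Save the current position so we can backtrack later.
    fn fork(&self) -> usize {
        self.pos
    }

    // Restore a previously saved position.
    fn rewind(&mut self, mark: usize) {
        self.pos = mark;
    }

    fn next(&mut self) -> Option<char> {
        let c = self.tokens.get(self.pos).copied();
        if c.is_some() {
            self.pos += 1;
        }
        c
    }
}

fn main() {
    let mut s = Stream { tokens: "a=1".chars().collect(), pos: 0 };
    let mark = s.fork();
    // Speculatively consume a token...
    assert_eq!(s.next(), Some('a'));
    // ...then backtrack and parse again from the saved mark.
    s.rewind(mark);
    assert_eq!(s.next(), Some('a'));
}
```

This is the same mechanism syn's `ParseStream::fork` uses for lookahead beyond a single token.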
Incremental parsing for network data and large files:
```rust
use synkit::async_stream::{IncrementalParse, AstStream};

impl IncrementalParse for MyNode {
    fn parse_incremental(
        tokens: &[Token],
        checkpoint: &ParseCheckpoint,
    ) -> Result<(Option<Self>, ParseCheckpoint), MyError> {
        // Parse a complete node from the buffered tokens if possible,
        // returning an advanced checkpoint; None means more input is needed.
        todo!()
    }
}

// Tokens flow through channels; AST nodes are emitted as they parse
let mut parser = AstStream::<MyNode, Token>::new(token_rx, ast_tx);
parser.run().await?;
```
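The checkpoint-driven loop can be illustrated without the async plumbing. The sketch below uses toy types (`Checkpoint`, string tokens) to show the contract: parse whole nodes from whatever has arrived, and return an unchanged checkpoint when input is incomplete. None of these names are synkit's real types:

```rust
#[derive(Clone, Debug)]
struct Checkpoint {
    consumed: usize,
}

// Try to parse one complete `key = value` node from the tokens buffered
// so far. Returns None (and the same checkpoint) if more input is needed.
fn parse_incremental(
    tokens: &[&str],
    cp: &Checkpoint,
) -> (Option<(String, String)>, Checkpoint) {
    let rest = &tokens[cp.consumed..];
    if rest.len() >= 3 && rest[1] == "=" {
        let node = (rest[0].to_string(), rest[2].to_string());
        (Some(node), Checkpoint { consumed: cp.consumed + 3 })
    } else {
        (None, cp.clone())
    }
}

fn main() {
    let mut cp = Checkpoint { consumed: 0 };
    let mut tokens: Vec<&str> = vec!["x", "="];

    // First attempt: the right-hand side hasn't arrived yet.
    let (node, next) = parse_incremental(&tokens, &cp);
    assert!(node.is_none());
    cp = next;

    // More tokens arrive (over a channel, in the real AstStream).
    tokens.push("42");
    let (node, next) = parse_incremental(&tokens, &cp);
    assert_eq!(node, Some(("x".into(), "42".into())));
    cp = next;
    assert_eq!(cp.consumed, 3);
}
```

In the real framework the loop lives inside `AstStream::run`, which awaits new tokens on the receiver and sends each parsed node on the sender.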