synkit

Crates.io: synkit
lib.rs: synkit
version: 0.0.2
created_at: 2026-01-02 11:42:00.713736+00
updated_at: 2026-01-02 17:59:11.250532+00
description: A fast, syn-like incremental parser framework for Rust
homepage: https://joshua-auchincloss.github.io/synkit
repository: https://github.com/joshua-auchincloss/synkit
max_upload_size:
id: 2018414
size: 77,515
owner: Joshua Auchincloss (joshua-auchincloss)

documentation: https://docs.rs/synkit

README

synkit

Generate syn-like parsing infrastructure from token definitions. Built on logos.

Define tokens once, get: lexer, typed token structs, whitespace-skipping streams, Parse/Peek/ToTokens traits, span tracking, and round-trip formatting.

When to Use

| Use Case                    | synkit | Alternative          |
|-----------------------------|--------|----------------------|
| Custom DSL with formatting  | Yes    | -                    |
| Config file parser          | Yes    | serde + format crate |
| Code transformation         | Yes    | -                    |
| Rust source parsing         | No     | syn                  |
| Simple pattern matching     | No     | logos alone          |

Installation

[dependencies]
synkit = "0.1"
logos = "0.16"
thiserror = "2"

Features: tokio, futures, serde, std (default).
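
To pull in the optional integrations, enable the corresponding cargo features; a minimal sketch, assuming the feature names listed above map one-to-one to cargo features (std stays on by default):

[dependencies]
synkit = { version = "0.0.2", features = ["tokio", "serde"] }
logos = "0.16"
thiserror = "2"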

Example

synkit::parser_kit! {
    error: MyError,
    skip_tokens: [Space],
    tokens: {
        #[token(" ")]
        Space,
        #[token("=")]
        Eq,
        #[regex(r"[a-z]+", |lex| lex.slice().to_string())]
        Ident(String),
        #[regex(r"[0-9]+", |lex| lex.slice().parse().ok())]
        Number(i64),
    },
    delimiters: {},
    span_derives: [Debug, Clone, PartialEq],
    token_derives: [Debug, Clone, PartialEq],
}

// Generated: Token enum, EqToken/IdentToken/NumberToken structs,
// TokenStream, Tok![] macro, Parse/Peek/ToTokens/Diagnostic traits

let mut stream = TokenStream::lex("x = 42")?;
let name: Spanned<IdentToken> = stream.parse()?;
let eq: Spanned<EqToken> = stream.parse()?;
let value: Spanned<NumberToken> = stream.parse()?;
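
The generated pieces compose into parsers for larger nodes. Below is a hypothetical sketch: the Assignment type is illustrative (not part of the crate), and it assumes the generated Parse trait takes &mut TokenStream and returns Result<Self, MyError>; the exact generated signature may differ.

// Hypothetical AST node composed from the generated token types.
struct Assignment {
    name: Spanned<IdentToken>,
    eq: Spanned<EqToken>,
    value: Spanned<NumberToken>,
}

impl Parse for Assignment {
    fn parse(stream: &mut TokenStream) -> Result<Self, MyError> {
        // Parse fields in source order, reusing the generated token parsers.
        Ok(Self {
            name: stream.parse()?,
            eq: stream.parse()?,
            value: stream.parse()?,
        })
    }
}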

Generated Infrastructure

| Module     | Contents                                                            |
|------------|----------------------------------------------------------------------|
| tokens     | Token enum, *Token structs, Tok![] macro                             |
| stream     | TokenStream with fork/rewind, whitespace skipping (see sketch below) |
| span       | Span, Spanned<T> wrappers                                            |
| traits     | Parse, Peek, ToTokens, Diagnostic                                    |
| printer    | Round-trip formatting                                                |
| delimiters | Bracket, Brace, Paren extractors                                     |
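
Fork/rewind enables speculative parsing. A minimal sketch, assuming fork() returns an independent copy of the stream whose progress is discarded when it is dropped; the method name comes from the table above, but its exact signature and the rewind API are assumptions, not confirmed by the docs.

let mut stream = TokenStream::lex("x = 42")?;

// Probe on a fork; the original stream is not advanced.
let mut lookahead = stream.fork();
let is_assignment = lookahead.parse::<Spanned<IdentToken>>().is_ok()
    && lookahead.parse::<Spanned<EqToken>>().is_ok();

if is_assignment {
    // Commit by parsing on the original, un-advanced stream.
    let name: Spanned<IdentToken> = stream.parse()?;
    let eq: Spanned<EqToken> = stream.parse()?;
    let value: Spanned<NumberToken> = stream.parse()?;
}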

Async Streaming

Incremental parsing for network data and large files:

use synkit::async_stream::{IncrementalParse, AstStream};

impl IncrementalParse for MyNode {
    fn parse_incremental(tokens: &[Token], checkpoint: &ParseCheckpoint)
        -> Result<(Option<Self>, ParseCheckpoint), MyError> {
        // Return Some(node) once enough tokens have arrived, plus an updated
        // checkpoint to resume from; None means more input is needed.
        todo!()
    }
}

// Tokens flow in through a channel; AST nodes are emitted as they are parsed
let mut parser = AstStream::<MyNode, Token>::new(token_rx, ast_tx);
parser.run().await?;
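
One way to wire the stream end to end; a sketch that assumes AstStream accepts tokio mpsc endpoints (the tokio feature exists, but the exact channel types are an assumption) and that the generated Token enum carries the Ident(String) variant defined above.

use tokio::sync::mpsc;

let (token_tx, token_rx) = mpsc::channel::<Token>(64);
let (ast_tx, mut ast_rx) = mpsc::channel::<MyNode>(64);

// Run the incremental parser on its own task.
tokio::spawn(async move {
    let mut parser = AstStream::<MyNode, Token>::new(token_rx, ast_tx);
    parser.run().await
});

// Feed tokens as input arrives (e.g. from a socket), then drop token_tx.
token_tx.send(Token::Ident("x".into())).await?;
drop(token_tx);

// Consume AST nodes as soon as they are available.
while let Some(node) = ast_rx.recv().await {
    // Handle each parsed node here.
    let _ = node;
}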

Documentation

https://docs.rs/synkit