| field | value |
|---|---|
| Crates.io | combine |
| lib.rs | combine |
| version | 4.6.7 |
| source | src |
| created_at | 2015-07-17 14:19:14.002384 |
| updated_at | 2024-04-10 09:56:27.206101 |
| description | Fast parser combinators on arbitrary streams with zero-copy support. |
| homepage | |
| repository | https://github.com/Marwes/combine |
| max_upload_size | |
| id | 2624 |
| size | 719,870 |
An implementation of parser combinators for Rust, inspired by the Haskell library Parsec. As in Parsec, the parsers are LL(1) by default, but they can opt in to arbitrary lookahead using the `attempt` combinator.
```rust
use combine::parser::char::{letter, space};
use combine::{many1, sep_by, Parser};

fn main() {
    // Construct a parser that parses *many* (and at least *1*) *letter*s
    let word = many1(letter());

    // Construct a parser that parses many *word*s where each word is *separated by* a (white)*space*.
    // Combine can collect into any type implementing `Default + Extend`, so we need to assist rustc
    // by telling it that `sep_by` should collect into a `Vec` and `many1` should collect into a `String`.
    let mut parser = sep_by(word, space())
        .map(|mut words: Vec<String>| words.pop());

    let result = parser.parse("Pick up that word!");

    // `parse` returns a `Result` where `Ok` contains a tuple of the parser's output and any remaining input.
    assert_eq!(result, Ok((Some("word".to_string()), "!")));
}
```
Larger examples can be found in the `examples`, `tests`, and `benches` folders. A tutorial, as well as explanations of what goes on inside combine, can be found in the wiki.
- **Parse arbitrary streams** - Combine can parse anything from `&[u8]` and `&str` to iterators and `Read` instances. If none of the built-in streams fit your use case, you can even implement a couple of traits yourself to create your own custom stream!
- **zero-copy parsing** - When parsing in-memory data, combine can parse without copying. See the `range` module for parsers specialized for zero-copy parsing.
- **partial parsing** - Combine parsers can be stopped at any point during parsing and later resumed without losing any progress. This makes it possible to start parsing partial data coming from an io device such as a socket without worrying about whether enough data is present to complete the parse. If more data is needed, the parser will stop and may be resumed at the same point once more data is available. See the async example for an example and this post for an introduction.
A parser combinator is, broadly speaking, a function which takes several parsers as arguments and returns a new parser created by combining those parsers. For instance, the `many` parser takes one parser, `p`, as input and returns a new parser which applies `p` zero or more times. Thanks to the modularity that parser combinators give, it is possible to define parsers for a wide range of tasks without needing to implement the low-level plumbing, while still having the full power of Rust when you need it.
The library adheres to semantic versioning.
If you end up trying it, I welcome any feedback from your experience with it. I am usually reachable within a day by opening an issue, sending an email, or posting a message on Gitter.
Since combine aims to create parsers with little to no overhead, streams over `&str` and `&[T]` do not carry any extra position information; instead, they rely only on comparing pointers into the buffer to check which `Stream` is further ahead than another `Stream`. To retrieve a better position, either call `translate_position` on the `PointerOffset` which represents the position, or wrap your stream with `State`.
https://github.com/Marwes/combine/issues/73 contains discussion and links to comparisons to nom.
There is an additional crate, combine-language, which has parsers to lex and parse programming languages.
The easiest way to contribute is to open an issue about any problems you encounter using combine, but if you are interested in adding something to the library, here is a list of some of the easier things to work on to get started.