| Crates.io | r-toml |
| lib.rs | r-toml |
| version | 0.0.26 |
| created_at | 2025-03-22 20:16:31.280194+00 |
| updated_at | 2025-03-22 20:31:40.892579+00 |
| description | Regular subset of TOML |
| homepage | |
| repository | https://github.com/ieviev/r-toml |
| max_upload_size | |
| id | 1602073 |
| size | 126,644 |
Features and non-features:


I very much like the TOML format for flat key-value storage, and for the features that are supported this library is fully compatible with TOML. That means there is nothing new to learn, and existing tooling such as strongly typed schemas in the editor keeps working.

TOML has a beautiful property: the nested types it supports can be expressed in a regular grammar, and this library takes advantage of that. It has been an experimental project on my shelf for a while and there are still things to add, but if you avoid the fancier features it is already usable. Of course TOML was never meant to be a data storage format, so this bends its original purpose, but the fact that regular TOML can be parsed much faster may change your mind.
F#:

```fsharp
let toml : byte[] = "
[server]
port = 8080
hostname = 'abc'
"B

let dictionary = RToml.toDictionary(toml)
dictionary["server.port"].kind        // INT
dictionary["server.port"].ToInt(toml) // 8080

// or any of the other formats
let array = RToml.toArray(toml)
let array2 = RToml.toStructArray(toml)

let valuelist =
    use vlist = RToml.toValueList(toml)
    for v in vlist do () //.. do something

// or iterate over the key-value pairs
RToml.stream (
    toml,
    (fun key value ->
        if value.kind = Token.TRUE then
            let keystr = key.ToString toml // struct to string
            printfn $"{keystr} at pos:{key.key_begin} is set to true"
    )
)
```
Rust:

```rust
fn main() {
    let toml = b"
[server]
port = 8080
hostname = 'abc'
";
    let map = r_toml::to_map(toml).unwrap();
    dbg!(&map["server.port"].kind);         // INT
    dbg!(&map["server.port"].to_int(toml)); // Ok(8080)

    // or iterate over key-value pairs
    let mut key_buf = Vec::new();
    r_toml::stream(toml, |k, v| {
        println!("{} = {:?}", k.to_str(&mut key_buf, toml), v.kind);
        key_buf.clear();
    });
}
```
Supported values and syntax:

- booleans: `true` / `false`
- integers: `10`
- floats: `0.005`
- strings: `'string'`
- datetimes: `1979-05-27T07:32:00Z`
- tables: `[entry]`, `[entry.inner]`
- arrays of tables: `[[products]]`
- arrays: `[1, 2, 3]`
- comments: `# comment`

Nested inline tables like `person = { address = { postcode = 123, street = "abc" } }` are not supported. However, you can express the same data in any of these forms:
```toml
person.address.postcode = 123
person.address.street = "abc"

# or
[person]
address.postcode = 123
address.street = "abc"

# or
[person.address]
postcode = 123
street = "abc"
```
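As an illustration, here is a minimal sketch (reusing the `to_map` and `to_int` calls from the Rust example above; the exact return types are an assumption) showing that all three spellings resolve to the same flattened key:

```rust
fn main() {
    let toml = b"
[person.address]
postcode = 123
street = 'abc'
";
    let map = r_toml::to_map(toml).unwrap();
    // whichever of the three spellings is used, the lookup key is the
    // flattened dotted path "person.address.postcode"
    assert_eq!(map["person.address.postcode"].to_int(toml).unwrap(), 123);
}
```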
Deeply nested or mixed-type arrays like `datapoints = [[[0,1,3],"abc"],{ x = 1, y = 2}]` are also not supported, since arrays are restricted to a single element type (see below).
There is one particular type of TOML key that is not supported:
"person"."address"."name"."this"."can"."be"."very"."long"
["person"."address"."name"]
[["person"."address"."name"]]
Since quoted keys rarely appear in real TOML files, there is no support for them. The reason is that parsing such a key requires some form of conversion, collection, or book-keeping to extract the values inside the quotes. While this is still entirely regular and reading it is no problem, it violates the Streaming, Zero-Copy, Stackless property, because the keys have to be separated from the quotes somehow. A single quoted key is technically fine and may be added at some point.
Whether a string contains an escape sequence is detected and recorded as a flag on the string, but there is no implementation of the actual unescaping yet, so if you use escape sequences you have to process them yourself.
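For values flagged as ESC_STR, here is a minimal sketch of doing the unescaping yourself (a hypothetical helper, not part of the library; it assumes you have already sliced the raw string out of the input and it skips the \uXXXX / \UXXXXXXXX escapes):

```rust
// Hypothetical helper: decode the common TOML basic-string escapes by hand.
// Unicode escapes (\uXXXX, \UXXXXXXXX) are left out for brevity.
fn unescape_basic(raw: &str) -> String {
    let mut out = String::with_capacity(raw.len());
    let mut chars = raw.chars();
    while let Some(c) = chars.next() {
        if c != '\\' {
            out.push(c);
            continue;
        }
        match chars.next() {
            Some('n') => out.push('\n'),
            Some('t') => out.push('\t'),
            Some('r') => out.push('\r'),
            Some('b') => out.push('\u{0008}'),
            Some('f') => out.push('\u{000C}'),
            Some('"') => out.push('"'),
            Some('\\') => out.push('\\'),
            // unknown escape: keep it as-is
            Some(other) => {
                out.push('\\');
                out.push(other);
            }
            None => out.push('\\'),
        }
    }
    out
}

fn main() {
    assert_eq!(unescape_basic(r#"line1\nline2 \"quoted\""#), "line1\nline2 \"quoted\"");
}
```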
The string kinds after parsing are labeled as follows:

- VALID_STR: valid string, no post-processing needed
- ESC_STR: an escape sequence was detected
- EMPTY_STR: the string is empty; this is just a special case to avoid redundant work

Based on an arbitrary decision, the only arrays supported are single-type arrays. The parser returns the region where the values are, which has to be post-processed afterwards. The region itself is already validated against the TOML spec and is a token of one of the following types:
- ARR1_STR: `"asd", 'ghi', """123"""`
- ARR1_INT: `1, -2, +3`
- ARR1_FLOAT: `1.0, inf, nan`
- ARR1_BOOL: `true, false`

There is no iterator for the array values right now, but one may be added in the future.
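As an example, here is a minimal sketch of post-processing an ARR1_INT region by hand (hypothetical: it assumes the already-validated region has been sliced out of the input as UTF-8, and it does not handle underscore separators such as 1_000):

```rust
// Hypothetical post-processing of an ARR1_INT region such as "1, -2, +3".
// The region is already validated by the parser, so a plain split + parse
// is enough here; underscore separators are not handled.
fn parse_int_region(region: &str) -> Vec<i64> {
    region
        .split(',')
        .map(|item| item.trim().parse::<i64>().expect("validated int"))
        .collect()
}

fn main() {
    assert_eq!(parse_int_region("1, -2, +3"), vec![1, -2, 3]);
}
```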
For a visual example of how the input is processed into tokens, see the diagram in the repository.

Planned additions:

- proper benchmark data extracted from Cargo.toml or pyproject.toml files
- an iterator for array types
- depth-2 arrays of basic primitives: bool[][], int[][], float[][], string[][]
- code generation for other languages
- deserialization of arrays of tables into an iterator for convenience
- string to tagged-union deserialization
- performance comparisons with other serialization formats (JSON)
- AVX2/AVX-512 intrinsics for strings