| Field | Value |
| --- | --- |
| Crates.io | sequencefile |
| lib.rs | sequencefile |
| version | 0.2.0 |
| source | src |
| created_at | 2016-01-31 07:14:12.674514 |
| updated_at | 2021-11-23 04:39:27.828506 |
| description | Native Rust library for working with Hadoop sequence files. Only handles reading currently. |
| homepage | https://github.com/xorlev/rust-sequencefile.git |
| repository | https://github.com/xorlev/rust-sequencefile.git |
| max_upload_size | |
| id | 4030 |
| size | 45,879 |
# Hadoop SequenceFile library for Rust
```toml
# Cargo.toml
[dependencies]
sequencefile = "0.2.0"
```
Prototype status! I'm in the process of learning Rust. :) Feedback appreciated.
Unfortunately, that means the API will change. If you depend on this crate, please pin an exact version for now.
Currently supports reading your garden-variety sequence files. Handles uncompressed sequence files as well as block/record-compressed files (deflate, gzip, and bzip2 only). LZO and Snappy are not (yet) handled.

There's a lot more to do.
There aren't any formal benchmarks yet. However, with deflate on my early-2012 MBP, 98.4% of CPU time was spent in miniz, producing ~125 MB/s of decompressed data.
## Usage

```rust
use std::fs::File;
use std::path::Path;

let path = Path::new("/path/to/seqfile");
let file = File::open(&path).unwrap();

let seqfile = sequencefile::Reader::new(file).expect("Failed to open sequence file.");

for kv in seqfile {
    println!("{:?}", kv); // Some(([123, 123], [456, 456]))
}
```
```rust
use byteorder::{BigEndian, ByteOrder};

// Until there's automatic deserialization, you can do something like this:
// VERY hacky
let kvs = seqfile.map(|e| e.unwrap()).map(|(key, value)| {
    (BigEndian::read_i64(&key),
     String::from_utf8_lossy(&value[2..]).to_string())
});

for (k, v) in kvs {
    println!("key: {}, value: {}", k, v);
}
```
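If you'd rather not pull in the byteorder crate just for the key decoding, std can do the same big-endian conversion with `i64::from_be_bytes`. A minimal, self-contained sketch (the sample bytes here are made up for illustration, not read from a real sequence file; a Hadoop LongWritable key is serialized as 8 big-endian bytes):

```rust
fn main() {
    // Pretend this is a raw 8-byte LongWritable key from a record.
    let key: Vec<u8> = vec![0, 0, 0, 0, 0, 0, 1, 200]; // big-endian encoding of 456

    // Copy the first 8 bytes into a fixed-size array and decode with std alone.
    let mut buf = [0u8; 8];
    buf.copy_from_slice(&key[..8]);
    let k = i64::from_be_bytes(buf);

    println!("{}", k); // prints 456
}
```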
## License

rust-sequencefile is primarily distributed under the terms of both the MIT license and the Apache License (Version 2.0), with portions covered by various BSD-like licenses.

See LICENSE-APACHE and LICENSE-MIT for details.