| Crates.io | huffman-compress |
| lib.rs | huffman-compress |
| version | 0.7.0 |
| created_at | 2018-01-29 00:23:21.08549+00 |
| updated_at | 2025-12-16 21:41:56.92883+00 |
| description | Huffman compression given a probability distribution over arbitrary symbols |
| homepage | |
| repository | https://github.com/niklasf/rust-huffman-compress |
| max_upload_size | |
| id | 48719 |
| size | 39,449 |
Huffman compression given a probability distribution over arbitrary symbols.
This project has limited real-world utility. It may be useful to experiment with or learn about Huffman coding (for example, when working on bespoke chess game compression for lichess.org), but there are better entropy coders (both in terms of compression ratio and performance) and better implementations.
See constriction for composable
entropy coders, models and streams.
See arcode for a standalone implementation
of arithmetic coding.
```rust
use std::iter::FromIterator;
use std::collections::HashMap;
use bit_vec::BitVec;
use huffman_compress::{CodeBuilder, Book, Tree};

let mut weights = HashMap::new();
weights.insert("CG", 293);
weights.insert("AG", 34);
weights.insert("AT", 4);
weights.insert("CT", 4);
weights.insert("TG", 1);

// Construct a Huffman code based on the weights (e.g. counts or relative
// frequencies).
let (book, tree) = CodeBuilder::from_iter(weights).finish();

// More frequent symbols will be encoded with fewer bits.
assert!(book.get("CG").map_or(0, |cg| cg.len()) <
        book.get("AG").map_or(0, |ag| ag.len()));

// Encode some symbols using the book.
let mut buffer = BitVec::new();
let example = vec!["AT", "CG", "AT", "TG", "AG", "CT", "CT", "AG", "CG"];
for symbol in &example {
    book.encode(&mut buffer, symbol);
}

// Decode the symbols using the tree.
let decoded: Vec<&str> = tree.decoder(&buffer, example.len()).collect();
assert_eq!(decoded, example);
```
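To illustrate the principle behind the crate (not its actual internals), here is a self-contained sketch using only the standard library: it derives Huffman code lengths for the same weights by repeatedly merging the two lightest subtrees. The function name `code_lengths` and the subtree representation are illustrative assumptions, not part of the crate's API.

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Classic Huffman construction: repeatedly merge the two lightest
// subtrees; every symbol in a merged subtree gains one code bit.
// NOTE: illustrative sketch only, not the huffman-compress implementation.
fn code_lengths<'a>(weights: &[(&'a str, u64)]) -> Vec<(&'a str, u32)> {
    // Min-heap entries: (subtree weight, tie-breaker, symbol indices).
    let mut heap: BinaryHeap<Reverse<(u64, usize, Vec<usize>)>> = weights
        .iter()
        .enumerate()
        .map(|(i, &(_, w))| Reverse((w, i, vec![i])))
        .collect();
    let mut lengths = vec![0u32; weights.len()];
    let mut tie = weights.len();
    while heap.len() > 1 {
        let Reverse((wa, _, a)) = heap.pop().unwrap();
        let Reverse((wb, _, b)) = heap.pop().unwrap();
        let mut merged = a;
        merged.extend(b);
        for &i in &merged {
            lengths[i] += 1; // one level deeper in the tree
        }
        heap.push(Reverse((wa + wb, tie, merged)));
        tie += 1;
    }
    weights.iter().zip(lengths).map(|(&(s, _), l)| (s, l)).collect()
}

fn main() {
    let weights = [("CG", 293), ("AG", 34), ("AT", 4), ("CT", 4), ("TG", 1)];
    // The most frequent symbol ("CG") ends up with the shortest code.
    for (symbol, len) in code_lengths(&weights) {
        println!("{symbol}: {len} bits");
    }
}
```

The resulting lengths satisfy the Kraft inequality, which is what guarantees a decodable prefix code; `CodeBuilder` produces codes with the same optimality property, plus the `Book`/`Tree` pair for actual bit-level encoding and decoding.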
## Changelog

- Update bit-vec to 0.8. Remove `#[deny(warnings)]` (a future
  compatibility hazard in libraries).
- Update bit-vec to 0.6.
- Update bit-vec to 0.5. Rename `Tree::decoder()` to `Tree::unbounded_decoder()` to avoid
  surprises. A new `Tree::decoder()` takes the maximum number of symbols to
  decode.
- No longer use `Saturating` from num-traits.
- Introduce `CodeBuilder`. Faster code construction from a `HashMap`. Thanks
  @mernen.
- Require `K: Ord` instead of `K: Hash + Eq` for symbols and switch `Book`
  internals from `HashMap` to `BTreeMap`.

## License

huffman-compress is dual-licensed under the Apache 2.0 and MIT licenses, at your option.