Repository: https://github.com/konradhoeffner/hdt
A Rust library for the Header Dictionary Triples (HDT) compressed RDF format.
It implements loading and querying of HDT files but does not cover the full feature set of the format. For that functionality, and in acknowledgement of all the original authors, please look at the reference implementations in C++ and Java by the https://github.com/rdfhdt organisation. If you need any of those features, consider using a SPARQL endpoint instead.
use hdt::Hdt;
let file = std::fs::File::open("example.hdt").expect("error opening file");
let hdt = Hdt::new(std::io::BufReader::new(file)).expect("error loading HDT");
// query
let majors = hdt.triples_with_pattern(Some("http://dbpedia.org/resource/Leipzig"), Some("http://dbpedia.org/ontology/major"), None);
println!("{:?}", majors.collect::<Vec<_>>());
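In `triples_with_pattern`, the `Some`/`None` arguments act as a pattern: `Some` binds a position, `None` leaves it as a wildcard. The following self-contained sketch illustrates only these pattern semantics with plain Rust over an in-memory list; it is not the hdt index structures, and the `ex:` triples are made up for the example:

```rust
/// Toy pattern match: `Some(x)` must equal the corresponding position,
/// `None` matches anything. Analogous in spirit to `triples_with_pattern`.
fn matches(pattern: (Option<&str>, Option<&str>, Option<&str>), triple: (&str, &str, &str)) -> bool {
    pattern.0.map_or(true, |s| s == triple.0)
        && pattern.1.map_or(true, |p| p == triple.1)
        && pattern.2.map_or(true, |o| o == triple.2)
}

fn main() {
    // A tiny in-memory "graph" of hypothetical triples.
    let triples = [
        ("ex:Leipzig", "ex:major", "ex:BurkhardJung"),
        ("ex:Leipzig", "ex:country", "ex:Germany"),
        ("ex:Dresden", "ex:major", "ex:DirkHilbert"),
    ];
    // Subject and predicate bound, object left as a wildcard,
    // like triples_with_pattern(Some(s), Some(p), None) above.
    let hits: Vec<_> = triples
        .iter()
        .filter(|t| matches((Some("ex:Leipzig"), Some("ex:major"), None), **t))
        .collect();
    println!("{hits:?}"); // prints [("ex:Leipzig", "ex:major", "ex:BurkhardJung")]
}
```

The more positions a pattern binds, the smaller the candidate set, which is also why pattern type influences query performance below.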
You can also use the Sophia adapter, which is re-exported as hdt::sophia, to load HDT files and reduce the memory consumption of an existing application based on Sophia:
use hdt::{Hdt,HdtGraph};
use hdt::sophia::api::graph::Graph;
use hdt::sophia::api::term::{IriRef, SimpleTerm, matcher::Any};
let file = std::fs::File::open("dbpedia.hdt").expect("error opening file");
let hdt = Hdt::new(std::io::BufReader::new(file)).expect("error loading HDT");
let graph = HdtGraph::new(hdt);
let s = SimpleTerm::Iri(IriRef::new_unchecked("http://dbpedia.org/resource/Leipzig".into()));
let p = SimpleTerm::Iri(IriRef::new_unchecked("http://dbpedia.org/ontology/major".into()));
let majors = graph.triples_matching(Some(s), Some(p), Any);
If you don't want to pull in the Sophia dependency, you can exclude the adapter:
[dependencies]
hdt = { version = "...", default-features = false }
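Conversely, if you disable default features but still want the adapter, you can opt back in explicitly. This sketch assumes the Cargo feature is named "sophia", as referenced in the contributing notes:

```toml
[dependencies]
hdt = { version = "...", default-features = false, features = ["sophia"] }
```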
There is also a runnable example in the examples folder, which you can run with cargo run --example query.
See the documentation at https://docs.rs/hdt or generate it yourself with cargo doc --no-deps without disabling default features.
The performance of a query depends on the size of the graph, the type of triple pattern, and the size of the result set.
When using large HDT files, make sure to enable the release profile, for example with cargo build --release, as this can be much faster than the dev profile.
If you want to optimize the code, you can use a profiler. The provided test data is very small in order to keep the size of the crate down; locally modifying the tests to use a large HDT file returns more meaningful results.
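Note that perf's DWARF call graphs need debug symbols, which release builds omit by default. A common workaround (a general Cargo setting, not something this crate prescribes) is to enable debug info for the release profile in your local Cargo.toml:

```toml
[profile.release]
debug = true
```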
$ cargo test --release
[...]
Running unittests src/lib.rs (target/release/deps/hdt-2b2f139dafe69681)
[...]
$ perf record --call-graph=dwarf target/release/deps/hdt-2b2f139dafe69681 hdt::tests::triples
$ perf script > /tmp/test.perf
Then go to https://profiler.firefox.com/ and open /tmp/test.perf.
Run the benchmarks, which use the test data in tests/resources, with cargo bench --bench criterion and cargo bench --bench iai.
The separate benchmark suite compares the performance of this and some other RDF libraries.
If you have a problem with the software, want to report a bug or have a feature request, please use the issue tracker. If you have a different type of request, feel free to send an email to Konrad.
If you use this library in your research, please cite our paper in the Journal of Open Source Software. We also provide a CITATION.cff file.
@article{hdtrs,
doi = {10.21105/joss.05114},
year = {2023},
publisher = {The Open Journal},
volume = {8},
number = {84},
pages = {5114},
author = {Konrad Höffner and Tim Baccaert},
title = {hdt-rs: {A} {R}ust library for the {H}eader {D}ictionary {T}riples binary {RDF} compression format},
journal = {Journal of Open Source Software}
}
Höffner et al., (2023). hdt-rs: A Rust library for the Header Dictionary Triples binary RDF compression format. Journal of Open Source Software, 8(84), 5114, https://doi.org/10.21105/joss.05114
We are happy to receive pull requests. Please use cargo fmt before committing, make sure that cargo test succeeds, and that the code compiles on the stable and nightly toolchains both with and without the "sophia" feature active. cargo clippy should not report any warnings.