| field | value |
|---|---|
| Crates.io | fs-hdfs3 |
| lib.rs | fs-hdfs3 |
| version | 0.1.12 |
| source | src |
| created_at | 2022-10-21 08:25:12.870628 |
| updated_at | 2023-09-07 10:49:53.412431 |
| description | libhdfs binding library and safe Rust APIs |
| homepage | https://github.com/datafusion-contrib/fs-hdfs |
| repository | https://github.com/datafusion-contrib/fs-hdfs |
| max_upload_size | |
| id | 693290 |
| size | 323,241 |
It's based on the version 0.0.4 of hdfs-rs (http://hyunsik.github.io/hdfs-rs) and provides a libhdfs binding library together with Rust APIs that safely wrap the raw binding APIs. The bundled libhdfs is based on the version 3.1.4 of the hadoop repository. For Rust usage, a few changes are also applied.

Add this to your Cargo.toml:
[dependencies]
fs-hdfs3 = "0.1.12"
We need to specify $JAVA_HOME to make the Java shared library available for building. Since the compiled libhdfs is a JNI-based implementation, it requires Hadoop-related classes to be available through CLASSPATH. For example:

export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`
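The export above simply chains jar paths with `:` separators onto the existing CLASSPATH. As a small illustration in Rust (the `extend_classpath` helper is hypothetical, not part of the fs-hdfs3 API), the same chaining looks like this:

```rust
/// Append extra entries to a CLASSPATH-style string using ':' as the
/// Unix path-list separator. Hypothetical helper, not a crate API.
fn extend_classpath(current: &str, extra: &[&str]) -> String {
    let mut parts: Vec<&str> = Vec::new();
    if !current.is_empty() {
        parts.push(current);
    }
    parts.extend_from_slice(extra);
    parts.join(":")
}

fn main() {
    // Mirrors `export CLASSPATH=$CLASSPATH:<hadoop jars>` with dummy entries.
    let classpath = extend_classpath("/opt/app.jar", &["/opt/hadoop/lib/*"]);
    println!("{classpath}"); // → /opt/app.jar:/opt/hadoop/lib/*
}
```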
Also, we need to specify the JVM dynamic library path for the application to load the JVM shared library at runtime.
For jdk8 and macOS, it's
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server
For jdk11 (or later jdks) and macOS, it's
export DYLD_LIBRARY_PATH=$JAVA_HOME/lib/server
For jdk8 and CentOS, it's

export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server

For jdk11 (or later jdks) and CentOS, it's

export LD_LIBRARY_PATH=$JAVA_HOME/lib/server
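A missing variable usually only surfaces as a JVM load failure at runtime, so a preflight check can save debugging time. This is a minimal sketch using only the standard library; `missing_env_vars` is a hypothetical helper, not part of fs-hdfs3:

```rust
use std::env;

/// Return the names of required environment variables that are not set.
/// Hypothetical preflight helper, not part of the fs-hdfs3 API.
fn missing_env_vars(required: &[&str]) -> Vec<String> {
    required
        .iter()
        .filter(|name| env::var(name).is_err())
        .map(|name| name.to_string())
        .collect()
}

fn main() {
    // libhdfs needs JAVA_HOME and CLASSPATH before it can start a JVM,
    // plus the platform-specific loader path exported above.
    let missing = missing_env_vars(&["JAVA_HOME", "CLASSPATH"]);
    if missing.is_empty() {
        println!("environment looks ready");
    } else {
        eprintln!("missing: {}", missing.join(", "));
    }
}
```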
Running the tests also requires CLASSPATH and DYLD_LIBRARY_PATH (or LD_LIBRARY_PATH) to be set as above. In case the Java class org.junit.Assert can't be found, refine the $CLASSPATH as follows:

export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`:$HADOOP_HOME/share/hadoop/tools/lib/*

Here, $HADOOP_HOME needs to be specified and exported.
Then you can run
cargo test
A usage example:

use std::sync::Arc;

use hdfs::hdfs::{get_hdfs_by_full_path, HdfsFs};

fn main() {
    let fs: Arc<HdfsFs> = get_hdfs_by_full_path("hdfs://localhost:8020/").ok().unwrap();
    match fs.mkdir("/data") {
        Ok(_) => println!("/data has been created"),
        Err(_) => panic!("/data creation has failed"),
    }
}