It's based on version 0.0.4 of http://hyunsik.github.io/hdfs-rs, providing a libhdfs binding library and Rust APIs that safely wrap the libhdfs binding APIs.
- All libhdfs FFI APIs are ported.
- Safe Rust wrapper APIs cover most of the libhdfs APIs, except those related to zero-copy reads.
- Compared to hdfs-rs, it removes the lifetime parameter in `HdfsFs`, which makes the crate friendlier for others to depend on.
- [API documentation](https://docs.rs/crate/fs-hdfs3)
- The C-related files are from the branch 3.1.4 of the Hadoop repository. A few changes have also been applied for Rust usage.
- No need to compile the Hadoop native library yourself. However, the Hadoop jar dependencies are still required.
Add this to your Cargo.toml:
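For example (the version shown is illustrative; check crates.io for the current release):

```toml
[dependencies]
fs-hdfs3 = "0.1"
```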
We need to specify `$JAVA_HOME` to make the Java shared library available for building.
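For example (the path is illustrative; point it at your local JDK installation):

```shell
# Adjust to wherever your JDK is installed
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
```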
Since our compiled libhdfs is a JNI-based implementation, it requires the Hadoop-related classes to be available through `$CLASSPATH`. For example:
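One way to populate it, assuming the `hadoop` command is on your `PATH`, is Hadoop's own `classpath` subcommand (`--glob` expands the jar wildcards):

```shell
# Build the CLASSPATH from the local Hadoop installation
export CLASSPATH=$(hadoop classpath --glob)
```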
Also, we need to specify the JVM dynamic library path (`DYLD_LIBRARY_PATH` on macOS, `LD_LIBRARY_PATH` on Linux) so that the application can load the JVM shared library at runtime. The exact path depends on the JDK version and the platform.
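Typical locations of the JVM shared library, assuming standard JDK install layouts (verify against your own installation):

```shell
# jdk8 on macOS
export DYLD_LIBRARY_PATH=$JAVA_HOME/jre/lib/server

# jdk11 (or later jdks) on macOS
export DYLD_LIBRARY_PATH=$JAVA_HOME/lib/server

# jdk8 on CentOS
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server

# jdk11 (or later jdks) on CentOS
export LD_LIBRARY_PATH=$JAVA_HOME/lib/server
```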
Running the tests also requires the `$CLASSPATH` (and the `LD_LIBRARY_PATH`) set up above. In case the Java class `org.junit.Assert` can't be found, refine the `$CLASSPATH` as follows:
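A sketch of the refinement, assuming the junit jars live under Hadoop's tools directory (a common layout):

```shell
# Append Hadoop's bundled tool jars (including junit) to the CLASSPATH
export CLASSPATH=$CLASSPATH:$HADOOP_HOME/share/hadoop/tools/lib/*
```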
Here, `$HADOOP_HOME` needs to be specified and exported.
Then you can run the tests:
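With the environment prepared, the standard Cargo test runner works:

```shell
cargo test
```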
```rust
use hdfs::hdfs::get_hdfs_by_full_path;

// "hdfs://localhost:8020" and "/data" are illustrative values
let fs = get_hdfs_by_full_path("hdfs://localhost:8020").ok().unwrap();
match fs.mkdir("/data") {
    Ok(_) => println!("/data has been created"),
    Err(_) => panic!("/data creation failed"),
}
```