fs-hdfs
It is based on version 0.0.4 of hdfs-rs (http://hyunsik.github.io/hdfs-rs) and provides the libhdfs binding library along with Rust APIs that safely wrap the libhdfs binding APIs.
Current Status
- All libhdfs FFI APIs are ported.
- Safe Rust wrapper APIs cover most of the libhdfs APIs, except those related to zero-copy reads.
- Compared to hdfs-rs, the lifetime parameter on `HdfsFs` has been removed, which makes this crate easier for others to depend on.
Documentation
- [API documentation](https://yahonanjing.github.io/fs-hdfs)
Requirements
- Hadoop compiled with the native library (i.e., with the Maven profile `-Pnative`). Please refer to https://github.com/apache/hadoop/blob/trunk/BUILDING.txt if you need more details.
- The C-related files come from the `3.3.1` branch of the Hadoop repository, with a few changes applied for Rust usage.
Usage
Add this to your Cargo.toml:
```toml
[dependencies]
fs-hdfs = "0.1.1"
```
fs-hdfs uses libhdfs, so we first need to add a library path so that libhdfs can be found. Both `$HADOOP_HOME` and `$JAVA_HOME` need to be specified and exported. An example for macOS:
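As a sketch (the exact install locations and subdirectories depend on your Hadoop and Java distributions, so treat every path below as an assumption), the macOS setup could look like:

```shell
# Assumed install locations; adjust for your machine.
export HADOOP_HOME=/usr/local/hadoop
export JAVA_HOME=$(/usr/libexec/java_home)

# macOS uses DYLD_LIBRARY_PATH for the dynamic linker; libhdfs typically
# lives under $HADOOP_HOME/lib/native and libjvm under the JDK's server dir.
export DYLD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_HOME/lib/server:$DYLD_LIBRARY_PATH
```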
Since the dependent libhdfs is a JNI native implementation, it requires a proper `CLASSPATH`. For example:
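One way to populate it (assuming the `hadoop` binary is available under `$HADOOP_HOME/bin`) is to let Hadoop compute its own classpath; the `--glob` flag expands the wildcard entries into concrete jar paths:

```shell
# Derive the CLASSPATH from the Hadoop installation itself.
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob)
```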
Testing
The tests also require the `CLASSPATH`. In case the Java class `org.junit.Assert` can't be found, refine the `$CLASSPATH` as follows:
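A hedged sketch: append a junit jar found inside the Hadoop distribution (the jar's location and version vary between releases, so this searches for it rather than hard-coding a path):

```shell
# Locate a junit jar shipped with Hadoop and append it to the CLASSPATH,
# so that org.junit.Assert can be resolved by the embedded JVM.
export CLASSPATH=$CLASSPATH:$(find $HADOOP_HOME -name "junit-*.jar" | head -n 1)
```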
Then you can run the tests with `cargo test`.
Example
```rust
use hdfs::hdfs::HdfsFs;

// The HDFS URL and directory below are illustrative; point them at your
// own namenode and target path.
let fs: HdfsFs = HdfsFs::new("hdfs://localhost:8020/").ok().unwrap();
match fs.mkdir("/test") {
    Ok(_) => println!("/test has been created"),
    Err(_) => println!("/test creation has failed"),
}
```