fs-hdfs

It is based on version 0.0.4 of hdfs-rs (http://hyunsik.github.io/hdfs-rs) and provides the libhdfs binding library together with Rust APIs that safely wrap the libhdfs binding APIs.

Current Status

  • All libhdfs FFI APIs are ported.
  • Safe Rust wrapper APIs cover most of the libhdfs APIs, except those related to zero-copy reads.
  • Compared to hdfs-rs, the lifetime parameter on HdfsFs has been removed, which makes the crate easier for others to depend on (see the sketch after this list).
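
Because HdfsFs no longer carries a lifetime parameter, it can be stored directly in owned types. A minimal sketch (the Catalog struct here is hypothetical, not part of the crate):

use hdfs::hdfs::HdfsFs;

// HdfsFs owns its underlying handle, so a struct holding it
// needs no lifetime annotations.
struct Catalog {
    fs: HdfsFs,
}

impl Catalog {
    fn new(fs: HdfsFs) -> Self {
        Catalog { fs }
    }
}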

Documentation

Requirements

  • Hadoop compiled with the native library (i.e., built with the Maven profile -Pnative)
  • The C-related source files are taken from branch 3.3.1 of the Hadoop repository, with a few changes applied for Rust usage.

Usage

Add this to your Cargo.toml:

[dependencies]
fs-hdfs = "0.1.2"

fs-hdfs depends on libhdfs. First, we need to add a library search path so that libhdfs can be found. An example for macOS:

export DYLD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_HOME/jre/lib/server

Here, $HADOOP_HOME and $JAVA_HOME need to be specified and exported.
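
On Linux, the equivalent variable is LD_LIBRARY_PATH. The directory containing libjvm.so varies with the JDK version and layout, so treat the path below as an assumption to adapt:

export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_HOME/lib/server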

Since the libhdfs we depend on is a JNI-based native implementation, it requires a proper CLASSPATH. An example:

export CLASSPATH=$CLASSPATH:`hadoop classpath`
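
Note that libhdfs does not expand wildcard (*) entries in the CLASSPATH itself, so if the output of hadoop classpath contains wildcards, it may be safer to expand them up front:

export CLASSPATH=$CLASSPATH:`hadoop classpath --glob`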

Testing

The tests also require the CLASSPATH. In case the Java class org.junit.Assert can't be found, refine the $CLASSPATH as follows:

export CLASSPATH=$CLASSPATH:`hadoop classpath`:$HADOOP_HOME/share/hadoop/tools/lib/*

Then you can run

cargo test

Example

use hdfs::hdfs::HdfsFs;

// Connect to the namenode and create a directory.
let fs: HdfsFs = HdfsFs::new("hdfs://localhost:8020/").ok().expect("cannot connect to the namenode");
match fs.mkdir("/data") {
    Ok(_) => println!("/data has been created"),
    Err(_) => panic!("/data creation has failed"),
}
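
For a fuller round trip, the sketch below also checks for the directory and removes it. The exist and delete methods used here are assumptions carried over from the upstream hdfs-rs API, not verified against this exact version:

// Assumed API, based on upstream hdfs-rs:
//   fn exist(&self, path: &str) -> bool
//   fn delete(&self, path: &str, recursive: bool) -> Result<bool, HdfsErr>
if fs.exist("/data") {
    // Remove the (empty) directory; `false` means non-recursive.
    fs.delete("/data", false).ok();
}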