// Copyright 2022 Datafuse Labs.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//! [Hadoop Distributed File System (HDFS™)](https://hadoop.apache.org/) support.
//!
//! A distributed file system that provides high-throughput access to application data.
//!
//! # Features
//!
//! HDFS support requires enabling the `services-hdfs` feature.
//!
//! # Configuration
//!
//! - `root`: Set the working directory for the backend.
//! - `name_node`: Set the name node address for the backend.
//!
//! Refer to [`Builder`]'s public API docs for more information.
//!
//! # Environment
//!
//! - `OPENDAL_HDFS_ROOT`
//! - `OPENDAL_HDFS_NAME_NODE`
//!
//! HDFS also requires some environment variables to be set correctly.
//!
//! - `JAVA_HOME`: the path to the Java home, which can be found via `java -XshowSettings:properties -version`
//! - `HADOOP_HOME`: the path to the Hadoop home; opendal relies on this env to discover hadoop jars and set `CLASSPATH` automatically.
//!
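//! For example, they can typically be set as follows (the paths below are
//! hypothetical and shown only for illustration; adjust them to match your
//! own installation):
//!
//! ```shell
//! # Hypothetical installation paths; replace with your own.
//! export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
//! export HADOOP_HOME=/opt/hadoop-3.3.4
//! ```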
//! Most of the time, setting `JAVA_HOME` and `HADOOP_HOME` is enough. But there are some edge cases:
//!
//! - If you meet errors like the following:
//!
//! ```shell
//! error while loading shared libraries: libjvm.so: cannot open shared object file: No such file or directory
//! ```
//!
//! Java's libraries are not included in the library search path; please set `LD_LIBRARY_PATH`:
//!
//! ```shell
//! export LD_LIBRARY_PATH=${JAVA_HOME}/lib/server:${LD_LIBRARY_PATH}
//! ```
//!
//! The path of `libjvm.so` may differ between Java distributions; adjust it accordingly.
//!
//! - If you meet errors like the following:
//!
//! ```shell
//! (unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
//! ```
//!
//! then `CLASSPATH` is not set correctly, or your hadoop installation is broken.
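//!
//! One possible workaround, assuming `HADOOP_HOME` is set and the `hadoop`
//! CLI is functional, is to set `CLASSPATH` explicitly from the hadoop
//! command itself:
//!
//! ```shell
//! # `hadoop classpath --glob` prints the fully expanded jar list,
//! # which can then be exported as CLASSPATH.
//! export CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath --glob)
//! ```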
//!
//! # Example
//!
//! ### Via Environment
//!
//! Set environment correctly:
//!
//! ```shell
//! export OPENDAL_HDFS_ROOT=/path/to/dir/
//! export OPENDAL_HDFS_NAME_NODE=hdfs://127.0.0.1:9000
//! ```
//!
//! ```no_run
//! use std::sync::Arc;
//!
//! use anyhow::Result;
//! use opendal::Object;
//! use opendal::Operator;
//! use opendal::Scheme;
//!
//! #[tokio::main]
//! async fn main() -> Result<()> {
//! let op: Operator = Operator::from_env(Scheme::Hdfs)?;
//!
//! // Create an object handle to start operation on object.
//! let _: Object = op.object("test_file");
//!
//! Ok(())
//! }
//! ```
//!
//! ### Via Builder
//!
//! ```no_run
//! use std::sync::Arc;
//!
//! use anyhow::Result;
//! use opendal::services::hdfs;
//! use opendal::Accessor;
//! use opendal::Object;
//! use opendal::Operator;
//!
//! #[tokio::main]
//! async fn main() -> Result<()> {
//! // Create fs backend builder.
//! let mut builder = hdfs::Builder::default();
//! // Set the name node for hdfs.
//! builder.name_node("hdfs://127.0.0.1:9000");
//! // Set the root for hdfs, all operations will happen under this root.
//! //
//! // NOTE: the root must be absolute path.
//! builder.root("/tmp");
//!
//! // `Accessor` provides the low level APIs, we will use `Operator` normally.
//! let op: Operator = Operator::new(builder.build()?);
//!
//! // Create an object handle to start operation on object.
//! let _: Object = op.object("test_file");
//!
//! Ok(())
//! }
//! ```
mod backend;
pub use backend::Builder;
mod dir_stream;
mod error;