§depthai-rs
Experimental Rust bindings and a safe-ish wrapper for Luxonis DepthAI-Core v3.1.0+ (supports v3.1.0, v3.2.0, v3.2.1, and latest).
§API Overview
§Device platforms
depthai-rs supports multiple DepthAI hardware platforms:
- DevicePlatform::Rvc2 - RVC2-based devices (OAK-D, OAK-D-Lite, etc.)
- DevicePlatform::Rvc3 - RVC3-based devices
- DevicePlatform::Rvc4 - RVC4-based devices (latest generation)
Query the device platform:
let device = Device::new()?;
let platform = device.platform()?;
§Device features
// Query connected cameras
let cameras = device.connected_cameras()?;
// Control IR laser dot projector (on supported devices)
device.set_ir_laser_dot_projector_intensity(0.3)?;
// Check if device is still connected
if device.is_connected() {
// Device is still connected, but note that it could disconnect between
// this check and any subsequent operations, so always handle errors.
}
§Device ownership
DepthAI device connections are typically exclusive. depthai-rs mirrors the common C++ pattern of sharing one device connection:
- Device::new() opens or returns a device handle.
- Device::clone() / Device::try_clone() creates another handle to the same underlying connection.
- Pipeline::new().with_device(&device).build()? binds a pipeline to an existing device connection (recommended).
- Pipeline::start() starts the pipeline using its associated device connection.
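Putting these pieces together, a minimal sketch of the shared-connection pattern (assuming the crate is imported as depthai_rs; this requires attached DepthAI hardware and is not runnable standalone):

```rust
// Sketch only: crate path `depthai_rs` is assumed, and a device must be attached.
use depthai_rs::{Device, Pipeline, Result};

fn run() -> Result<()> {
    // Open (or reuse) the exclusive device connection.
    let device = Device::new()?;

    // A second handle to the same underlying connection.
    let same_device = device.try_clone()?;

    // Bind the pipeline to the existing connection (recommended),
    // then start it on that connection.
    let pipeline = Pipeline::new().with_device(&device).build()?;
    pipeline.start()?;

    // The cloned handle can still be used for device-level queries,
    // but the device may disconnect at any time, so handle errors.
    if same_device.is_connected() {
        println!("platform: {:?}", same_device.platform()?);
    }
    Ok(())
}
```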
§Creating nodes
depthai-rs provides multiple ways to create device nodes:
§Generic API (type-safe)
// Nodes without parameters
let stereo = pipeline.create::<StereoDepthNode>()?;
let rgbd = pipeline.create::<RgbdNode>()?;
// Nodes with parameters
let camera = pipeline.create_with::<CameraNode, _>(CameraBoardSocket::CamA)?;
§By C++ class name
let node = pipeline.create_node("dai::node::StereoDepth")?;
§Composite nodes
Use the #[depthai_composite] macro to bundle multiple nodes:
#[depthai_composite]
pub struct CameraStereoBundle {
pub left: CameraNode,
pub right: CameraNode,
pub stereo: StereoDepthNode,
}
impl CameraStereoBundle {
pub fn new(pipeline: &Pipeline) -> Result<Self> {
let left = pipeline.create_with::<CameraNode, _>(CameraBoardSocket::CamB)?;
let right = pipeline.create_with::<CameraNode, _>(CameraBoardSocket::CamC)?;
let stereo = pipeline.create::<StereoDepthNode>()?;
// Link nodes
left.raw()?.link(&stereo.left()?)?;
right.raw()?.link(&stereo.right()?)?;
Ok(Self { left, right, stereo })
}
}
// Use as a regular node
let bundle = pipeline.create::<CameraStereoBundle>()?;
§Host nodes
depthai-rs supports custom processing nodes written in Rust:
§HostNode
Synchronous processing node using the #[depthai_host_node] macro:
#[depthai_host_node]
struct FrameLogger;
impl FrameLogger {
fn process(&mut self, group: &MessageGroup) -> Option<Buffer> {
if let Ok(Some(frame)) = group.get_frame("in") {
println!("Frame: {}x{}", frame.width(), frame.height());
}
None
}
}
let host = pipeline.create_host_node(FrameLogger)?;
§ThreadedHostNode
Asynchronous processing node with its own thread using #[depthai_threaded_host_node]:
#[depthai_threaded_host_node]
struct FrameProcessor {
input: Input,
}
impl FrameProcessor {
fn run(&mut self, ctx: &ThreadedHostNodeContext) {
while ctx.is_running() {
if let Ok(frame) = self.input.get_frame() {
// Process frame
}
}
}
}
let host = pipeline.create_threaded_host_node(|node| {
let input = node.create_input(Some("in"))?;
Ok(FrameProcessor { input })
})?;
§RerunHostNode (optional rerun feature)
Visualize data streams using Rerun:
let host = pipeline.create_with::<RerunHostNode, _>(RerunHostNodeConfig {
viewer: RerunViewer::Web(RerunWebConfig {
// Don't auto-open browser in remote/container environments
open_browser: false,
..Default::default()
}),
..Default::default()
})?;
out.link(&host.input("in")?)?;
Requires the rerun feature and Tokio runtime support.
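The feature is enabled in Cargo.toml; a sketch, with the package name and version placeholder assumed rather than taken from this page:

```toml
[dependencies]
# Package name and version are assumptions; check the crate's actual
# Cargo metadata before use.
depthai-rs = { version = "0.x", features = ["rerun"] }
# The Rerun host node requires a Tokio runtime.
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
```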
§Node linking
Link nodes by output to input, with optional port names:
// Simple linking
camera_out.link(&stereo.left()?)?;
// With explicit port names
depth_out.link_to(&align, Some("input"))?;
color_out.link_to(&align, Some("inputAlignTo"))?;
§Camera configuration
Configure camera outputs with detailed options:
let out = camera.request_output(CameraOutputConfig {
size: (640, 400),
frame_type: Some(ImageFrameType::RGB888i),
resize_mode: ResizeMode::Crop,
fps: Some(30.0),
enable_undistortion: Some(true),
})?;
Supported frame types include: RGB888i, BGR888i, GRAY8, NV12, NV21, YUV420p, RAW8, RAW10, RAW12, and more.
Available camera board sockets: CamA, CamB, CamC, CamD, CamE, CamF.
§Common types and enums
The common module provides frequently used types:
- ImageFrameType: Frame pixel formats (RGB888i, GRAY8, NV12, etc.)
- ResizeMode: How to resize images (Crop, Stretch, Letterbox)
- CameraBoardSocket: Physical camera ports on the device
- CameraSensorType: Camera sensor types (Color, Mono, Thermal, ToF)
§Stereo depth
Configure stereo depth processing:
let stereo = pipeline.create::<StereoDepthNode>()?;
stereo.set_default_profile_preset(StereoPresetMode::Robotics);
stereo.set_left_right_check(true);
stereo.set_subpixel(true);
stereo.enable_distortion_correction(true);
§RGBD and point clouds
Generate aligned RGB-D data and point clouds:
let rgbd = pipeline.create::<RgbdNode>()?;
rgbd.set_depth_unit(DepthUnit::Meter);
rgbd.build()?;
// Link color and depth inputs
color_out.link_to(rgbd.as_node(), Some("inColorSync"))?;
depth_out.link_to(rgbd.as_node(), Some("inDepthSync"))?;
// Get outputs
let q_pcl = rgbd.as_node().output("pcl")?.create_queue(2, false)?;
let q_rgbd = rgbd.as_node().output("rgbd")?.create_queue(2, false)?;
// Retrieve data
if let Some(pcl) = q_pcl.try_next_pointcloud()? {
for point in pcl.points() {
// Access point.x, point.y, point.z, point.r, point.g, point.b
}
}
§Video encoding
Encode camera frames to H.264 or H.265:
let encoder = pipeline.create::<VideoEncoderNode>()?;
encoder.set_default_profile_preset(30.0, VideoEncoderProfile::H264High);
encoder.set_rate_control_mode(VideoEncoderRateControlMode::Cbr);
encoder.set_bitrate_kbps(5000);
encoder.set_keyframe_frequency(30);
// Link camera to encoder
camera_out.link(&encoder.input()?)?;
// Get encoded output
let q = encoder.bitstream()?.create_queue(30, false)?;
§Image manipulation
Transform and process images on-device:
let manip = pipeline.create::<ImageManipNode>()?;
// Configure manipulation via initial config
let mut config = manip.initial_config()?;
config.add_crop_xywh(100, 100, 640, 480)
.add_rotate_deg(90.0)
.set_frame_type(ImageFrameType::RGB888i);
// Link camera to manipulator
camera_out.link(&manip.inputImage()?)?;
§Error handling
All fallible operations return Result<T, DepthaiError>:
match Device::new() {
Ok(device) => {
println!("Device connected");
}
Err(e) => {
eprintln!("Failed to connect: {}", e);
}
}
§Pipeline introspection
Query pipeline structure and connections:
// Get all nodes in the pipeline
let nodes = pipeline.all_nodes()?;
println!("Pipeline has {} nodes", nodes.len());
// Get all connections
let connections = pipeline.connections()?;
for conn in connections {
println!("Connection: {} -> {}", conn.output_name, conn.input_name);
}
§Procedural macros
depthai-rs provides several procedural macros to simplify node creation:
§#[native_node_wrapper]
Wraps native DepthAI nodes with type-safe Rust interfaces:
#[native_node_wrapper(
native = "dai::node::Camera",
inputs(inputControl, mockIsp),
outputs(raw)
)]
pub struct CameraNode {
node: crate::pipeline::Node,
}
§#[depthai_host_node]
Creates synchronous host nodes:
#[depthai_host_node]
struct MyProcessor;
impl MyProcessor {
fn process(&mut self, group: &MessageGroup) -> Option<Buffer> {
// Process messages
None
}
}
§#[depthai_threaded_host_node]
Creates asynchronous threaded host nodes:
#[depthai_threaded_host_node]
struct MyThreadedProcessor {
input: Input,
}
impl MyThreadedProcessor {
fn run(&mut self, ctx: &ThreadedHostNodeContext) {
// Run in dedicated thread
}
}
§#[depthai_composite]
Bundles multiple nodes into a composite node:
#[depthai_composite]
pub struct MyComposite {
pub stereo: StereoDepthNode,
pub rgbd: RgbdNode,
}
impl MyComposite {
pub fn new(pipeline: &Pipeline) -> Result<Self> {
let stereo = pipeline.create::<StereoDepthNode>()?;
let rgbd = pipeline.create::<RgbdNode>()?;
Ok(Self { stereo, rgbd })
}
}
Re-exports§
pub use error::DepthaiError;
pub use error::Result;
pub use pipeline::CreateInPipeline;
pub use pipeline::CreateInPipelineWith;
pub use pipeline::DeviceNode;
pub use pipeline::DeviceNodeWithParams;
pub use device::Device;
pub use device::DevicePlatform;
pub use pipeline::Pipeline;
pub use output::Output;
pub use output::Input;
pub use pointcloud::Point3fRGBA;
pub use pointcloud::PointCloudData;
pub use queue::Datatype;
pub use queue::DatatypeEnum;
pub use queue::InputQueue;
pub use queue::MessageQueue;
pub use queue::QueueCallbackHandle;
pub use image_manip::Backend as ImageManipBackend;
pub use image_manip::Colormap;
pub use image_manip::ImageManipConfig;
pub use image_manip::ImageManipNode;
pub use image_manip::ImageManipResizeMode;
pub use image_manip::PerformanceMode as ImageManipPerformanceMode;
pub use image_align::ImageAlignNode;
pub use encoded_frame::EncodedFrame;
pub use encoded_frame::EncodedFrameProfile;
pub use encoded_frame::EncodedFrameQueue;
pub use encoded_frame::EncodedFrameType;
pub use rgbd::DepthUnit;
pub use rgbd::RgbdData;
pub use rgbd::RgbdNode;
pub use stereo_depth::PresetMode as StereoPresetMode;
pub use stereo_depth::StereoDepthNode;
pub use video_encoder::VideoEncoderNode;
pub use video_encoder::VideoEncoderProfile;
pub use video_encoder::VideoEncoderRateControlMode;
pub use host_node::HostNode;
pub use host_node::HostNodeImpl;
pub use host_node::MessageGroup;
pub use host_node::Buffer;
pub use threaded_host_node::ThreadedHostNode;
pub use threaded_host_node::ThreadedHostNodeImpl;
pub use threaded_host_node::ThreadedHostNodeContext;
pub use depthai_sys as bindings;
Modules§
- camera
- common
- device
- encoded_frame
- error
- host_node
- image_align
- image_manip
- output
- pipeline
- pointcloud
- queue
- rgbd
- stereo_depth
- threaded_host_node
- video_encoder
Attribute Macros§
- depthai_composite - Attribute macro for defining composite nodes in Rust.
- depthai_host_node - Attribute macro for defining Rust host nodes.
- depthai_threaded_host_node - Attribute macro for defining threaded host nodes in Rust.
- native_node_wrapper - Wrap a native DepthAI node that is created via Pipeline::create_node_by_name("ClassName").