// edge_impulse_runner/lib.rs

//! # Edge Impulse
//!
//! A Rust library for running inference with Edge Impulse Linux models (EIM) and for uploading
//! data to Edge Impulse. This crate provides safe, easy-to-use interfaces for:
//! - Running machine learning models on Linux and macOS
//! - Uploading training, testing, and anomaly data to Edge Impulse projects
//!
//! ## Features
//!
//! ### Inference
//! - Run Edge Impulse models (`.eim` files) on Linux and macOS
//! - Support for different model types:
//!   - Classification models
//!   - Object detection models
//!   - Visual anomaly detection models
//! - Support for different sensor types:
//!   - Camera
//!   - Microphone
//!   - Accelerometer
//!   - Positional sensors
//! - Continuous classification mode
//! - Optional debug output
//!
//! ### Data Ingestion
//! - Upload data to Edge Impulse projects
//! - Support for multiple data categories:
//!   - Training data
//!   - Testing data
//!   - Anomaly data
//! - Support for various file formats:
//!   - Images (JPG, PNG)
//!   - Audio (WAV)
//!   - Video (MP4, AVI)
//!   - Sensor data (CBOR, JSON, CSV)
//! ## Quick Start Examples
//!
//! ### Basic Classification
//! ```no_run
//! use edge_impulse_runner::{EimModel, InferenceResult};
//!
//! fn main() -> Result<(), Box<dyn std::error::Error>> {
//!     // Create a new model instance
//!     let mut model = EimModel::new("path/to/model.eim")?;
//!
//!     // Prepare normalized features (e.g., image pixels, audio samples)
//!     let features: Vec<f32> = vec![0.1, 0.2, 0.3];
//!
//!     // Run inference
//!     let result = model.infer(features, None)?;
//!
//!     // Process results
//!     match result.result {
//!         InferenceResult::Classification { classification } => {
//!             println!("Classification: {:?}", classification);
//!         }
//!         InferenceResult::ObjectDetection {
//!             bounding_boxes,
//!             classification,
//!         } => {
//!             println!("Detected objects: {:?}", bounding_boxes);
//!             if !classification.is_empty() {
//!                 println!("Classification: {:?}", classification);
//!             }
//!         }
//!         InferenceResult::VisualAnomaly {
//!             visual_anomaly_grid,
//!             visual_anomaly_max,
//!             visual_anomaly_mean,
//!             anomaly,
//!         } => {
//!             let (normalized_anomaly, normalized_max, normalized_mean, normalized_regions) =
//!                 model.normalize_visual_anomaly(
//!                     anomaly,
//!                     visual_anomaly_max,
//!                     visual_anomaly_mean,
//!                     &visual_anomaly_grid.iter()
//!                         .map(|bbox| (bbox.value, bbox.x as u32, bbox.y as u32, bbox.width as u32, bbox.height as u32))
//!                         .collect::<Vec<_>>()
//!                 );
//!             println!("Anomaly score: {:.2}%", normalized_anomaly * 100.0);
//!             println!("Maximum score: {:.2}%", normalized_max * 100.0);
//!             println!("Mean score: {:.2}%", normalized_mean * 100.0);
//!             for (value, x, y, w, h) in normalized_regions {
//!                 println!("Region: score={:.2}%, x={}, y={}, width={}, height={}",
//!                     value * 100.0, x, y, w, h);
//!             }
//!         }
//!     }
//!     Ok(())
//! }
//! ```
//!
//! ### Data Upload
//! ```no_run
//! use edge_impulse_runner::ingestion::{Category, Ingestion, UploadOptions};
//!
//! # async fn run() -> Result<(), Box<dyn std::error::Error>> {
//! // Create a client with your project API key
//! let ingestion = Ingestion::new("your-api-key".to_string());
//!
//! // Upload a file to the training category
//! let result = ingestion
//!     .upload_file(
//!         "data.jpg",
//!         Category::Training,
//!         Some("label".to_string()),
//!         Some(UploadOptions {
//!             disallow_duplicates: true,
//!             add_date_id: true,
//!         }),
//!     )
//!     .await?;
//! # Ok(())
//! # }
//! ```
//!
//! ## Architecture
//!
//! ### Inference Protocol
//! The Edge Impulse inference runner uses a Unix socket-based IPC mechanism to communicate
//! with the model process. The protocol is JSON-based and follows a request-response pattern
//! for model initialization, classification requests, and error handling.
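//!
//! For illustration, a session might look like the sketch below. The exact message shapes
//! shown here are assumptions for illustration, not a specification of the wire format:
//!
//! ```text
//! -> {"hello": 1, "id": 1}
//! <- {"success": true, "id": 1, "model_parameters": {...}, "project": {...}}
//! -> {"classify": [0.1, 0.2, 0.3], "id": 2}
//! <- {"success": true, "id": 2, "result": {"classification": {...}}}
//! ```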
//!
//! ### Ingestion API
//! The ingestion module interfaces with the Edge Impulse Ingestion API over HTTPS, supporting
//! both the data and file endpoints for uploading samples to Edge Impulse projects.
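//!
//! For illustration, a file upload roughly corresponds to an HTTP request like the sketch
//! below (endpoint and header names follow the public Edge Impulse Ingestion API; treat the
//! details as illustrative rather than exhaustive):
//!
//! ```text
//! POST /api/training/files HTTP/1.1
//! Host: ingestion.edgeimpulse.com
//! x-api-key: <your-api-key>
//! x-label: label
//! Content-Type: multipart/form-data; boundary=...
//! ```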
//!
//! ## Prerequisites
//!
//! Some functionality (notably video capture) requires GStreamer to be installed:
//! - **macOS**: install both the runtime and development packages from gstreamer.freedesktop.org
//! - **Linux**: install the required packages (`libgstreamer1.0-dev` and related packages)
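//!
//! As a rough sketch, on Debian/Ubuntu the development packages can typically be installed
//! with apt, and on macOS Homebrew is a common alternative to the official installer
//! packages (package names may vary by distribution):
//!
//! ```sh
//! # Debian/Ubuntu
//! sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
//!
//! # macOS (alternative to the packages from gstreamer.freedesktop.org)
//! brew install gstreamer
//! ```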
//!
//! ## Error Handling
//!
//! The crate uses the `EimError` type to provide detailed error information:
//! ```no_run
//! use edge_impulse_runner::{EimModel, EimError};
//!
//! // Match on model creation
//! match EimModel::new("model.eim") {
//!     Ok(mut model) => {
//!         // Match on inference
//!         match model.infer(vec![0.1, 0.2, 0.3], None) {
//!             Ok(_result) => println!("Success!"),
//!             Err(EimError::InvalidInput(msg)) => println!("Invalid input: {}", msg),
//!             Err(e) => println!("Other error: {}", e),
//!         }
//!     },
//!     Err(e) => println!("Failed to load model: {}", e),
//! }
//! ```
//!
//! ## Modules
//!
//! - `error`: Error types and handling
//! - `inference`: Model management and inference functionality
//! - `ingestion`: Data upload and project management
//! - `types`: Common types and parameters

pub mod error;
pub mod inference;
pub mod ingestion;
pub mod types;

pub use inference::messages::{InferenceResponse, InferenceResult};
pub use inference::EimModel;

pub use error::EimError;
pub use types::BoundingBox;
pub use types::ModelParameters;
pub use types::ProjectInfo;
pub use types::SensorType;
pub use types::TimingInfo;