//! # Cano: Simple & Fast Async Workflows in Rust
//!
//! Cano is an async workflow engine that makes complex data processing simple. Whether you need
//! to process one item or millions, Cano provides a clean API with minimal overhead.
//!
//! ## 🚀 Quick Start
//!
//! Choose between [`Task`] (simple) or [`Node`] (structured) for your processing logic,
//! create a [`MemoryStore`] for sharing data, then run your workflow. Every [`Node`] automatically
//! works as a [`Task`] for maximum flexibility.
//!
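//! The shape of that flow can be sketched with a simplified, synchronous stand-in for the
//! async [`Task`] trait. This is an illustration only: the real trait is async, and the
//! `run` signature and store type below are hypothetical stand-ins, not the crate's actual API.
//!
//! ```rust
//! use std::collections::HashMap;
//!
//! // Simplified, synchronous stand-in for the async `Task` trait.
//! trait Task {
//!     // Hypothetical signature: read inputs from the shared store,
//!     // write outputs back, and return the next workflow state.
//!     fn run(&self, store: &mut HashMap<String, i64>) -> Result<String, String>;
//! }
//!
//! struct Double;
//!
//! impl Task for Double {
//!     fn run(&self, store: &mut HashMap<String, i64>) -> Result<String, String> {
//!         let n = *store.get("input").ok_or("missing input")?;
//!         store.insert("output".to_string(), n * 2);
//!         Ok("done".to_string())
//!     }
//! }
//!
//! let mut store = HashMap::new();
//! store.insert("input".to_string(), 21);
//! let next = Double.run(&mut store).unwrap();
//! assert_eq!(next, "done");
//! assert_eq!(store["output"], 42);
//! ```
//!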
//! ## 🎯 Core Concepts
//!
//! ### Tasks & Nodes - Your Processing Units
//!
//! **Two approaches for implementing processing logic:**
//! - [`Task`] trait: Simple interface with a single `run()` method - perfect for prototypes and simple operations
//! - [`Node`] trait: Structured three-phase lifecycle with built-in retry strategies - ideal for production workloads
//!
//! **Every [`Node`] automatically implements [`Task`]**, providing seamless interoperability and upgrade paths.
//!
//! ### Store - Share Data Between Processing Units
//!
//! Use [`MemoryStore`] to pass data around your workflow. Store different types of data
//! using key-value pairs, and retrieve them later with type safety. All values are
//! wrapped in `std::borrow::Cow` for memory efficiency.
//!
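//! The key-value pattern can be illustrated with a tiny self-contained stand-in: heterogeneous
//! values stored by key and retrieved with their concrete type. The `put`/`get` names and
//! signatures here are hypothetical, not [`MemoryStore`]'s actual API.
//!
//! ```rust
//! use std::any::Any;
//! use std::collections::HashMap;
//!
//! // Illustrative stand-in for a typed key-value store.
//! struct Store(HashMap<String, Box<dyn Any>>);
//!
//! impl Store {
//!     fn new() -> Self {
//!         Store(HashMap::new())
//!     }
//!     fn put<T: Any>(&mut self, key: &str, value: T) {
//!         self.0.insert(key.to_string(), Box::new(value));
//!     }
//!     // Retrieval is type-checked: asking for the wrong type yields `None`.
//!     fn get<T: Any>(&self, key: &str) -> Option<&T> {
//!         self.0.get(key)?.downcast_ref::<T>()
//!     }
//! }
//!
//! let mut store = Store::new();
//! store.put("count", 3_u32);
//! store.put("name", "cano".to_string());
//! assert_eq!(store.get::<u32>("count"), Some(&3));
//! assert_eq!(store.get::<String>("name").map(String::as_str), Some("cano"));
//! assert!(store.get::<u32>("name").is_none()); // wrong type
//! ```
//!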
//! ### Custom Logic - Your Business Implementation
//!
//! **Choose the right approach for your needs:**
//! - Implement the [`Task`] trait for simple, single-method processing
//! - Implement the [`Node`] trait for structured processing with three phases:
//!   Prep (load data, validate inputs), Exec (core processing), and Post (store results, determine next action)
//!
//! ## 🏗️ Processing Lifecycle
//!
//! **Task**: Single `run()` method with full control over execution flow
//!
//! **Node**: Three-phase lifecycle for structured processing:
//!
//! 1. **Prep**: Load data, validate inputs, set up resources
//! 2. **Exec**: Core processing logic (with automatic retry support)
//! 3. **Post**: Store results, clean up, determine next action
//!
//! This structure makes nodes predictable and easy to reason about, while tasks provide maximum flexibility.
//!
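//! The three phases and exec-only retry can be sketched with a simplified synchronous trait.
//! The real [`Node`] trait is async and configured via [`TaskConfig`]; the signatures, the
//! `run_node` driver, and the string-based states here are illustrative assumptions only.
//!
//! ```rust
//! // Simplified, synchronous sketch of the three-phase lifecycle.
//! trait Node {
//!     type Prep;
//!     type Out;
//!     fn prep(&self) -> Result<Self::Prep, String>;               // load & validate
//!     fn exec(&self, input: Self::Prep) -> Result<Self::Out, String>; // core work, retryable
//!     fn post(&self, out: Self::Out) -> String;                   // store results, pick next state
//! }
//!
//! // Hypothetical driver: runs the phases in order, retrying only `exec`.
//! fn run_node<N: Node>(node: &N, retries: usize) -> Result<String, String>
//! where
//!     N::Prep: Clone,
//! {
//!     let input = node.prep()?;
//!     let mut last_err = String::new();
//!     for _ in 0..=retries {
//!         match node.exec(input.clone()) {
//!             Ok(out) => return Ok(node.post(out)),
//!             Err(e) => last_err = e,
//!         }
//!     }
//!     Err(last_err)
//! }
//!
//! struct Square(i64);
//!
//! impl Node for Square {
//!     type Prep = i64;
//!     type Out = i64;
//!     fn prep(&self) -> Result<i64, String> { Ok(self.0) }
//!     fn exec(&self, n: i64) -> Result<i64, String> { Ok(n * n) }
//!     fn post(&self, out: i64) -> String { format!("done:{out}") }
//! }
//!
//! assert_eq!(run_node(&Square(7), 2), Ok("done:49".to_string()));
//! ```
//!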
//! ## 📚 Module Overview
//!
//! - **[`task`]**: The [`Task`] trait for simple, flexible processing logic
//!   - Single `run()` method for maximum simplicity
//!   - Perfect for prototypes and straightforward operations
//!
//! - **[`node`]**: The [`Node`] trait for structured processing logic
//!   - Built-in retry logic and error handling
//!   - Three-phase lifecycle (`prep`, `exec`, `post`)
//!   - Fluent configuration API via [`TaskConfig`]
//!
//! - **[`workflow`]**: Core workflow orchestration
//!   - [`Workflow`] for state machine-based workflows
//!
//! - **[`scheduler`]** (optional `scheduler` feature): Advanced workflow scheduling
//!   - [`Scheduler`] for managing multiple flows with cron support
//!   - Time-based and event-driven scheduling
//!
//! - **[`store`]**: Thread-safe key-value storage helpers for pipeline data sharing
//!   - [`MemoryStore`] for in-memory data sharing
//!   - [`KeyValueStore`] trait for custom storage backends
//!
//! - **[`error`]**: Comprehensive error handling system
//!   - [`CanoError`] for categorized error types
//!   - [`CanoResult`] type alias for convenient error handling
//!   - Rich error context and conversion traits
//!
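//! The error-handling pattern - a categorized error enum, a result alias, and `From`
//! conversions so `?` lifts ordinary errors into workflow errors - can be sketched
//! self-contained. The variant names below are hypothetical, not [`CanoError`]'s actual set.
//!
//! ```rust
//! use std::fmt;
//!
//! // Illustrative sketch of a categorized workflow error.
//! #[derive(Debug, PartialEq)]
//! enum CanoError {
//!     Preparation(String),
//!     Execution(String),
//! }
//!
//! impl fmt::Display for CanoError {
//!     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
//!         match self {
//!             CanoError::Preparation(msg) => write!(f, "prep error: {msg}"),
//!             CanoError::Execution(msg) => write!(f, "exec error: {msg}"),
//!         }
//!     }
//! }
//!
//! // Result alias in the style of `CanoResult`.
//! type CanoResult<T> = Result<T, CanoError>;
//!
//! // A conversion trait lets `?` lift ordinary errors into workflow errors.
//! impl From<std::num::ParseIntError> for CanoError {
//!     fn from(e: std::num::ParseIntError) -> Self {
//!         CanoError::Preparation(e.to_string())
//!     }
//! }
//!
//! fn parse_input(raw: &str) -> CanoResult<i64> {
//!     Ok(raw.parse::<i64>()?) // ParseIntError -> CanoError via `From`
//! }
//!
//! assert_eq!(parse_input("42"), Ok(42));
//! assert!(matches!(parse_input("oops"), Err(CanoError::Preparation(_))));
//! ```
//!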
//! ## 📈 Getting Started
//!
//! 1. **Start with the examples**: Run `cargo run --example basic_node_usage`
//! 2. **Read the module docs**: Each module has detailed documentation and examples
//! 3. **Check the benchmarks**: Run `cargo bench --bench node_performance` to see performance
//! 4. **Join the community**: Contribute features, fixes, or feedback
//!
//! ## Performance Characteristics
//!
//! - **Low Latency**: Direct execution with minimal overhead
//! - **High Throughput**: The same direct execution path scales from one item to millions
//! - **Memory Efficient**: Memory use scales with data size, not concurrency settings
//! - **Async I/O**: Efficient async operations on the tokio runtime

pub mod error;
pub mod node;
pub mod store;
pub mod task;
pub mod workflow;

#[cfg(feature = "scheduler")]
pub mod scheduler;

#[cfg(all(test, feature = "tracing"))]
mod tracing_tests;

// Core public API - simplified imports
pub use error::{CanoError, CanoResult};
pub use node::{DefaultNodeResult, DefaultParams, DynNode, Node};
pub use store::{KeyValueStore, MemoryStore};
pub use task::{DefaultTaskParams, DynTask, RetryMode, Task, TaskConfig, TaskObject};
pub use workflow::{
    ConcurrentWorkflow, ConcurrentWorkflowBuilder, ConcurrentWorkflowStatus, WaitStrategy,
    Workflow, WorkflowBuilder, WorkflowResult,
};

#[cfg(feature = "scheduler")]
pub use scheduler::{FlowInfo, Scheduler};

// Convenience re-exports for common patterns
pub mod prelude {
    //! Simplified imports for common usage patterns
    //!
    //! Use `use cano::prelude::*;` to import the most commonly used types and traits.

    pub use crate::{
        CanoError, CanoResult, ConcurrentWorkflow, ConcurrentWorkflowBuilder,
        ConcurrentWorkflowStatus, DefaultNodeResult, DefaultParams, DefaultTaskParams,
        KeyValueStore, MemoryStore, Node, RetryMode, Task, TaskConfig, TaskObject, WaitStrategy,
        Workflow, WorkflowBuilder, WorkflowResult,
    };

    #[cfg(feature = "scheduler")]
    pub use crate::{FlowInfo, Scheduler};

    // Re-export async_trait for convenience
    pub use async_trait::async_trait;
}