Crate cano

§Cano: Type-Safe Async Workflow Engine

Cano is a high-performance orchestration engine designed for building resilient, self-healing systems in Rust. Unlike simple task queues, Cano uses finite state machines (FSMs) to define strict, type-safe transitions between processing steps.

It excels at managing complex lifecycles where state transitions matter:

  • Data Pipelines: ETL jobs with parallel processing (Split/Join) and aggregation.
  • AI Agents: Multi-step inference chains with shared context and memory.
  • Background Systems: Scheduled maintenance, periodic reporting, and distributed cron jobs.

§🚀 Quick Start

Choose between Task (simple) and Node (structured) for your processing logic, create a MemoryStore for sharing data, then run your workflow. Every Node automatically works as a Task for maximum flexibility.
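
A minimal end-to-end sketch of that flow is shown below. Treat it as the shape of a workflow rather than the crate's confirmed API: the Task signature, MemoryStore::new, Workflow::new, register, and orchestrate are illustrative assumptions, so check the task, store, and workflow module docs for the real signatures.

```rust
use cano::prelude::*;

// Workflow states: a plain enum with one registered handler per state.
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
enum State {
    Work,
    Complete,
}

struct HelloTask;

// Hypothetical Task impl: the trait's exact generics and `run` signature
// may differ; see the `task` module docs for the real definition.
impl Task<State> for HelloTask {
    async fn run(&self, store: &MemoryStore) -> Result<State, CanoError> {
        store.put("greeting", "hello from cano".to_string())?; // assumed KeyValueStore method
        Ok(State::Complete) // the returned variant is the next state
    }
}

#[tokio::main]
async fn main() -> Result<(), CanoError> {
    let store = MemoryStore::new(); // assumed constructor

    // `register` and `orchestrate` are illustrative method names.
    // State::Complete acts as the terminal state here; how exit states are
    // declared is covered in the `workflow` module docs.
    let mut workflow = Workflow::new(State::Work);
    workflow.register(State::Work, HelloTask);
    workflow.orchestrate(&store).await?;
    Ok(())
}
```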

§🎯 Core Concepts

§Finite State Machines (FSM)

Workflows in Cano are state machines. You define your states as an enum, and register handlers (Task or Node) for each state. The engine ensures type safety and manages transitions between states.
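
As a concrete example, the states of a small ETL workflow are just an enum. Under the assumption that each handler returns the variant to run next, every transition is spelled out in the type system:

```rust
// One registered Task or Node per variant. A handler finishing Extract
// would return Transform (an assumed convention), Transform returns Load,
// and Load returns Done or Failed.
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
enum EtlState {
    Extract,
    Transform,
    Load,
    Done,
    Failed,
}
```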

§Tasks & Nodes - Your Processing Units

Two approaches for implementing processing logic:

  • Task trait: Simple interface with a single run() method - perfect for prototypes and straightforward operations
  • Node trait: Structured three-phase lifecycle with built-in retry strategies - ideal for production workloads

Every Node automatically implements Task, providing seamless interoperability and upgrade paths.

§Parallel Execution (Split/Join)

Run tasks concurrently and join results with strategies like All, Any, Quorum, or PartialResults. This allows for powerful patterns like scatter-gather, redundant execution, and latency optimization.
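
Cano's join strategies map onto the familiar scatter-gather pattern. The sketch below uses plain tokio rather than the cano API to show the underlying idea: branches run concurrently, and the join strategy decides how many of them must succeed before the workflow moves on.

```rust
use tokio::task::JoinSet;

#[tokio::main]
async fn main() {
    let mut set = JoinSet::new();

    // Scatter: run three branches concurrently.
    for id in 0..3u32 {
        set.spawn(async move {
            // ...per-branch work would go here...
            Ok::<u32, String>(id * 10)
        });
    }

    // Gather: an "All"-style join keeps every successful result; an "Any"
    // or "Quorum" strategy could stop as soon as enough branches succeed,
    // and "PartialResults" would tolerate some failures.
    let mut results = Vec::new();
    while let Some(joined) = set.join_next().await {
        if let Ok(Ok(value)) = joined {
            results.push(value);
        }
    }
    println!("gathered: {results:?}");
}
```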

§Store - Share Data Between Processing Units

Use MemoryStore to pass data around your workflow. Store different types of data using key-value pairs, and retrieve them later with type safety. All values are wrapped in std::borrow::Cow for memory efficiency.
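
A hedged sketch of typical usage: MemoryStore::new, put, and get are assumed names for the KeyValueStore operations, so the store module docs have the actual methods and error behaviour.

```rust
use cano::prelude::*;

fn share_data() -> Result<(), CanoError> {
    let store = MemoryStore::new(); // assumed constructor

    // Different value types live behind string keys...
    store.put("user_id", 42u64)?;             // assumed KeyValueStore method
    store.put("name", "Ada".to_string())?;

    // ...and come back out with the type stated at the call site.
    let user_id: u64 = store.get("user_id")?; // assumed typed getter
    let name: String = store.get("name")?;
    println!("{name} ({user_id})");
    Ok(())
}
```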

ยง๐Ÿ—๏ธ Processing Lifecycle

Task: Single run() method with full control over execution flow

Node: Three-phase lifecycle for structured processing:

  1. Prep: Load data, validate inputs, setup resources
  2. Exec: Core processing logic (with automatic retry support)
  3. Post: Store results, cleanup, determine next action

This structure makes nodes predictable and easy to reason about, while tasks provide maximum flexibility.
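
A sketch of a three-phase Node follows, again with assumptions flagged: the associated types, the method signatures, and how the next state is chosen in post are guesses based on the lifecycle above, so the node module docs are the authority.

```rust
use cano::prelude::*;

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
enum State {
    Process,
    Done,
    Failed,
}

struct CountNode;

// Hypothetical Node impl mirroring the prep -> exec -> post lifecycle.
impl Node<State> for CountNode {
    type PrepResult = String; // assumed associated types
    type ExecResult = usize;

    // 1. Prep: load and validate inputs from the shared store.
    async fn prep(&self, store: &MemoryStore) -> Result<Self::PrepResult, CanoError> {
        store.get("input") // assumed typed getter
    }

    // 2. Exec: core logic; this is the phase the retry strategy wraps.
    async fn exec(&self, input: Self::PrepResult) -> Self::ExecResult {
        input.split_whitespace().count()
    }

    // 3. Post: persist results and choose the next state.
    async fn post(&self, store: &MemoryStore, count: Self::ExecResult) -> Result<State, CanoError> {
        store.put("word_count", count)?;
        Ok(if count > 0 { State::Done } else { State::Failed })
    }
}
```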

§📚 Module Overview

  • task: The Task trait for simple, flexible processing logic

    • Single run() method for maximum simplicity
    • Perfect for prototypes and straightforward operations
  • node: The Node trait for structured processing logic

    • Built-in retry logic and error handling
    • Three-phase lifecycle (prep, exec, post)
    • Fluent configuration API via TaskConfig
  • workflow: Core workflow orchestration

    • Workflow for state machine-based workflows with Split/Join support
  • scheduler (optional scheduler feature): Advanced workflow scheduling

    • Scheduler for managing multiple flows with cron support
    • Time-based and event-driven scheduling
  • store: Thread-safe key-value storage helpers for pipeline data sharing

  • error: Comprehensive error handling system

    • CanoError for categorized error types
    • CanoResult type alias for convenient error handling
    • Rich error context and conversion traits

§📈 Getting Started

  1. Start with the examples: Run cargo run --example basic_node_usage
  2. Read the module docs: Each module has detailed documentation and examples
  3. Check the benchmarks: Run cargo bench --bench node_performance to see performance
  4. Join the community: Contribute features, fixes, or feedback

§Performance Characteristics

  • Low Latency: Minimal overhead; handlers are executed directly rather than queued
  • High Throughput: Direct execution keeps the hot path free of intermediate task queues
  • Memory Efficient: Memory use scales with data size, not with concurrency settings
  • Async I/O: Efficient async operations on the tokio runtime

Re-exports§

pub use error::CanoError;
pub use error::CanoResult;
pub use node::DefaultNodeResult;
pub use node::DefaultParams;
pub use node::DynNode;
pub use node::Node;
pub use store::KeyValueStore;
pub use store::MemoryStore;
pub use task::DefaultTaskParams;
pub use task::DynTask;
pub use task::RetryMode;
pub use task::Task;
pub use task::TaskConfig;
pub use task::TaskObject;
pub use task::TaskResult;
pub use workflow::JoinConfig;
pub use workflow::JoinStrategy;
pub use workflow::SplitResult;
pub use workflow::SplitTaskResult;
pub use workflow::StateEntry;
pub use workflow::Workflow;

Modules§

error
Error Handling - Clear, Actionable Error Messages
node
Node API - Structured Workflow Processing
prelude
Simplified imports for common usage patterns
store
Key-Value Store Helpers for Processing Pipelines
task
Task API - Simplified Workflow Interface
workflow
Workflow API - Build Workflows with Split/Join Support