# oxify-model

The Brain - Domain Models for OxiFY LLM Workflow Orchestration
## Overview
oxify-model provides the core data structures for defining and managing LLM workflows as directed acyclic graphs (DAGs). These models are the foundation of OxiFY's Type-Safe Workflow Engine, leveraging Rust's type system to guarantee workflow correctness at compile time.
**Status:** ✅ Production Ready with Advanced Optimization Features

**Part of:** OxiFY Enterprise Architecture (Codename: Absolute Zero)
## Features
- 🎯 Type-Safe Workflow Definition: DAG-based workflows with compile-time guarantees
- 💰 Cost Estimation: Predict LLM execution costs before running
- ⏱️ Time Prediction: Estimate workflow execution time with confidence scores
- 🔍 Workflow Optimization: Automatic detection of optimization opportunities
- ⚡ Batch Analysis: Identify parallelization opportunities for faster execution
- 🧠 Variable Optimization: Minimize memory usage and eliminate unnecessary copies
- 📊 Comprehensive Analytics: Track execution stats, performance metrics, and trends
- 🔐 Enterprise Features: Versioning, checkpointing, secrets management, webhooks
## Key Types
### Workflow

Represents a complete workflow definition:
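The exact definition lives in the crate source; the sketch below is only an approximation of its shape, and the field and type names are assumptions:

```rust
// Illustrative sketch, not the published definition.
pub struct Workflow {
    pub id: String,
    pub name: String,
    pub nodes: Vec<Node>, // DAG vertices (see Node Types below)
    pub edges: Vec<Edge>, // directed connections between nodes
}

// An edge points from one node id to another.
pub struct Edge {
    pub from: String,
    pub to: String,
}
```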
### Node Types
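The variants aren't reproduced here; judging from the usage example and the config types documented below, they look roughly like the following (variant names and payloads are assumptions):

```rust
// Illustrative sketch, inferred from the Start/LLM/End usage example
// and the LlmConfig/VectorConfig/ScriptConfig sections below.
pub struct Node {
    pub id: String,
    pub node_type: NodeType,
}

pub enum NodeType {
    Start,                // entry point; a valid workflow has at least one
    Llm(LlmConfig),       // calls an LLM with the configured model/prompt
    Vector(VectorConfig), // retrieval / vector-search step
    Script(ScriptConfig), // runs a user-supplied script
    End,                  // terminal node
}
```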
### Execution Context

Tracks workflow execution state:
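A rough sketch of the kind of state that gets tracked (field names are illustrative assumptions):

```rust
use std::collections::HashMap;
use std::time::Instant;

// Illustrative sketch, not the published definition.
pub struct ExecutionContext {
    pub workflow_id: String,
    pub status: ExecutionStatus,            // where the run currently stands
    pub variables: HashMap<String, String>, // values produced by completed nodes
    pub completed_nodes: Vec<String>,       // ids of nodes that have finished
    pub started_at: Instant,
}

pub enum ExecutionStatus {
    Pending,
    Running,
    Completed,
    Failed,
}
```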
## Usage

### Creating a Workflow
```rust
use oxify_model::*; // exact import paths may differ

// Create an empty workflow (constructor arguments shown are illustrative)
let mut workflow = Workflow::new("my-workflow");

// Add start node
let start = Node::new("start", NodeType::Start);
workflow.add_node(start);

// Add LLM node
let llm = Node::new("generate", NodeType::Llm(LlmConfig::default()));
workflow.add_node(llm);

// Add end node
let end = Node::new("end", NodeType::End);
workflow.add_node(end);

// Connect nodes
workflow.add_edge("start", "generate");
workflow.add_edge("generate", "end");

// Validate workflow (start node present, valid edges, no cycles)
workflow.validate()?;
```
### Serialization

All types are serializable to/from JSON (via `serde`):

```rust
// To JSON
let json = serde_json::to_string_pretty(&workflow)?;

// From JSON
let workflow: Workflow = serde_json::from_str(&json)?;
```
## Node Configuration Details
### LlmConfig
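The field list isn't reproduced here; a hedged sketch of the kind of settings an LLM node might carry (all names below are assumptions):

```rust
// Illustrative sketch, not the actual fields.
pub struct LlmConfig {
    pub model: String,   // which provider/model the node calls
    pub prompt: String,  // prompt template for the node
    pub temperature: f32,
    pub max_tokens: u32, // feeds the token counts used by cost estimation
}
```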
### VectorConfig
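Likewise, a hedged sketch of what a retrieval node's configuration might include:

```rust
// Illustrative sketch, not the actual fields.
pub struct VectorConfig {
    pub collection: String, // which vector store / collection to query
    pub query: String,      // query template
    pub top_k: usize,       // number of results to retrieve
}
```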
### ScriptConfig
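And a hedged sketch for script nodes:

```rust
// Illustrative sketch, not the actual fields.
pub struct ScriptConfig {
    pub source: String,  // script body to execute
    pub timeout_ms: u64, // guard against runaway scripts
}
```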
## Validation

Calling `validate()` checks that:
- At least one Start node exists
- All edge references point to valid nodes
- No cycles in the graph (DAG property)
```rust
// Handle validation failures (error handling shown is illustrative)
if let Err(e) = workflow.validate() {
    eprintln!("invalid workflow: {e}");
}
```
## Optimization Features
### Cost Estimation

Predict workflow execution costs before running:
```rust
use oxify_model::*; // exact import paths may differ

// Build a small workflow with the builder API
// (`WorkflowBuilder` name and method arguments are approximate)
let workflow = WorkflowBuilder::new()
    .start("start")
    .llm("generate")
    .retriever("search")
    .end("end")
    .build();

// Estimate cost before execution (`CostEstimator` name is approximate)
let estimate = CostEstimator::estimate(&workflow);
println!("{estimate}");
// Output:
// Total Cost: $0.0125
// LLM: $0.0120 | Vector: $0.0005
// Tokens: 1250 input, 1000 output (2250 total)
```
### Time Prediction

Estimate execution time with confidence scores:
```rust
use oxify_model::TimePredictor; // exact import path may differ

let predictor = TimePredictor::new();
let estimate = predictor.predict(&workflow);
println!("{estimate}");
// Output:
// Estimated Time: 2s - 30s (avg: 8s)
// Critical Path: Start → Generate → Search → End
// Confidence: 75%
```
### Workflow Optimization

Automatically detect optimization opportunities:
```rust
use oxify_model::WorkflowOptimizer; // exact path/signature may differ

let report = WorkflowOptimizer::analyze(&workflow);
println!("{report}");
// Output:
// Optimization Score: 65%
// Potential Savings: $0.0083 | 2500ms
// Opportunities: 2 parallelization, 1 redundant nodes

// Walk the high-priority suggestions (exact accessor may differ)
for suggestion in report.high_priority_suggestions() {
    println!("{suggestion}");
}
```
### Batch Analysis

Identify parallelization opportunities:
```rust
use oxify_model::BatchAnalyzer; // exact path/signature may differ

let plan = BatchAnalyzer::analyze(&workflow);
println!("{plan}");
// Output:
// Batch Execution Plan:
// Total Nodes: 10 | Batches: 5 | Max Parallelism: 4
// Speedup Factor: 2.5x | Efficiency: 60%
```
### Variable Optimization

Minimize memory usage and eliminate unnecessary copies:
```rust
use oxify_model::VariableOptimizer; // exact path/signature may differ

let analysis = VariableOptimizer::analyze(&workflow);
println!("{analysis}");
// Output:
// Variable Optimization Analysis:
// Total Variable Flows: 8 | Tracked Variables: 5
// Optimization Opportunities: 3 | Unnecessary Copies: 1
// Estimated Memory Savings: 12 KB
```
## Performance

All optimization features are benchmarked:
- Cost Estimation: ~4.7μs for 10 nodes, ~63μs for 100 nodes
- Time Prediction: ~4.3μs for 10 nodes, ~52μs for 100 nodes
- Optimization Analysis: ~20μs for 10 nodes, ~192μs for 100 nodes
## Testing

- 177 tests covering all modules
- Property-based tests with `proptest`
- Comprehensive integration tests
- Zero warnings with `cargo clippy`
## See Also

- `oxify-engine`: DAG execution engine
- `oxify-api`: REST API for workflow management
- `oxify-cli`: Local workflow runner and development tool