AI Workbench Library
A comprehensive Rust library for AI workbench operations including file processing, intelligent splitting, and AWS Bedrock integration.
Features
- File Discovery: S3-based file discovery and processing
- Intelligent File Splitting: Type-aware chunking for various file formats (text, CSV, JSON, code)
- AWS Bedrock Integration: Model runner for seamless AI model interactions
- Job Processing: Complete workflow orchestration for AI processing tasks
Installation
Add this to your Cargo.toml:
```toml
[dependencies]
# Crate name below is assumed from the library title; use the name the crate is published under.
ai-workbench = "0.1.0"
```
Quick Start
```rust
use std::sync::Arc;

use aws_config::BehaviorVersion;
use aws_sdk_bedrockruntime::Client as BedrockClient;
use aws_sdk_s3::Client as S3Client;
```
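A minimal setup sketch based on these imports, assuming a Tokio runtime and default AWS credential resolution; the library components built from the two clients are covered under Components below.

```rust
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load region and credentials from the environment / AWS config files.
    let config = aws_config::defaults(BehaviorVersion::latest()).load().await;

    // Shared clients: S3 for file discovery, Bedrock for model invocation.
    let s3_client = Arc::new(S3Client::new(&config));
    let bedrock_client = Arc::new(BedrockClient::new(&config));

    // Wire the clients into FileDiscovery, FileSplitter, and ModelRunner (see Components).
    Ok(())
}
```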
Components
File Discovery
Discover and process files from S3 buckets:
```rust
use ai_workbench::FileDiscovery; // crate path assumed

// Constructor arguments (S3 client, bucket, prefix) are assumptions; check the crate docs.
let discovery = FileDiscovery::new(s3_client.clone(), "my-bucket", "input/");
let files = discovery.discover_files().await?;
```
File Splitting
Intelligent file splitting with type-aware chunking:
```rust
use ai_workbench::{FileSplitter, SplitConfig}; // type names and crate path assumed

let splitter = FileSplitter::with_config(SplitConfig::default());
// `file` is one of the entries returned by the discovery step above.
let chunks = splitter.split_file(&file)?;
```
Model Runner
Direct AWS Bedrock model interactions:
```rust
use ai_workbench::ModelRunner; // crate path assumed

// Constructor and request arguments are assumptions; check the crate docs for exact signatures.
let model_runner = ModelRunner::new(bedrock_client.clone());
let response = model_runner
    .invoke_model(request)
    .await?;
```
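Tying the components together, a rough sketch of the discover → split → invoke flow; the constructor arguments and the shape of the returned values are assumptions, not the crate's documented API.

```rust
// Hypothetical end-to-end pipeline; every argument below is an assumption.
let discovery = FileDiscovery::new(s3_client.clone(), "my-bucket", "input/");
let splitter = FileSplitter::with_config(SplitConfig::default());
let model_runner = ModelRunner::new(bedrock_client.clone());

for file in discovery.discover_files().await? {
    for chunk in splitter.split_file(&file)? {
        // Send each chunk to Bedrock and keep the model's response.
        let response = model_runner.invoke_model(chunk).await?;
        // ... aggregate or store `response` as the job's output ...
    }
}
```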
File Type Support
The library automatically detects and optimally processes:
- Text files: Line-based chunking with context preservation
- CSV/TSV files: Row-based chunking with header preservation (see the sketch after this list)
- JSON files: Object/array-based intelligent splitting
- Code files: Syntax-aware chunking (Rust, Python, JavaScript, etc.)
- Binary files: Basic size-based chunking
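For illustration, here is a standalone sketch (not the library's internal code) of the header-preserving strategy the CSV/TSV bullet describes: each chunk starts with a copy of the header row.

```rust
/// Split CSV text into chunks of at most `rows_per_chunk` data rows,
/// repeating the header line at the start of each chunk.
/// Illustrative only; the library's actual chunking logic may differ.
fn chunk_csv(csv: &str, rows_per_chunk: usize) -> Vec<String> {
    let mut lines = csv.lines();
    let header = match lines.next() {
        Some(h) => h,
        None => return Vec::new(),
    };

    let rows: Vec<&str> = lines.collect();
    rows.chunks(rows_per_chunk.max(1))
        .map(|chunk| {
            let mut out = String::from(header);
            for row in chunk {
                out.push('\n');
                out.push_str(row);
            }
            out
        })
        .collect()
}
```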
Requirements
- Rust 1.70+
- AWS credentials configured
- Tokio async runtime
License
MIT License - see LICENSE file for details.
Contributing
Contributions welcome! Please read our contributing guidelines and submit pull requests to our GitHub repository.