# FACT - Fast Augmented Context Tools
FACT (Fast Augmented Context Tools) is a high-performance context processing engine for Rust, designed for AI applications that require intelligent caching, cognitive templates, and blazing-fast data transformation.
## Features
- 🚀 High Performance: Sub-100ms processing with intelligent caching
- 🧠 Cognitive Templates: Pre-built templates for common AI patterns
- 💾 Smart Caching: Multi-tier caching with automatic eviction
- 🔧 Flexible Processing: Transform, analyze, filter, and aggregate data
- 🛡️ Type Safe: Full Rust type safety with serde integration
- ⚡ Async First: Built on Tokio for concurrent processing
- 📊 Built-in Benchmarking: Performance monitoring and optimization
## Installation

Add FACT to your `Cargo.toml` (the crate name `fact` is assumed from this project's title):

```toml
[dependencies]
fact = "1.0.0"
```
Or install the CLI tool:
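Assuming the binary is published under the crate name, installation would look like:

```shell
# Install the FACT command-line tool from crates.io
cargo install fact
```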
## Quick Start
### Library Usage

```rust
// Sketch of basic usage; exact module paths and method
// signatures are assumptions based on this README.
use fact::Fact;
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let fact = Fact::new();
    let data = json!({ "values": [1, 2, 3, 4, 5] });

    // Process the data with a built-in template.
    let result = fact.process("analysis-basic", data).await?;
    println!("{:?}", result);
    Ok(())
}
```
### CLI Usage

```bash
# (Subcommand names below are illustrative.)

# Initialize FACT configuration
fact init

# Process data with a template
fact process --template analysis-basic --input data.json

# List available templates
fact templates

# Run performance benchmark
fact benchmark

# Show cache statistics
fact cache-stats
```
## Built-in Templates
FACT comes with several pre-configured templates:
- analysis-basic: Statistical and pattern analysis
- pattern-detection: Detect patterns in structured data
- data-aggregation: Aggregate numerical data
- quick-transform: Fast data transformation for caching
## Creating Custom Templates

```rust
// Builder-style sketch; method names follow the fragments in
// this README, and the import path is an assumption.
use fact::TemplateBuilder;

let template = TemplateBuilder::new("my-template")
    .name("My Template")
    .description("A custom processing template")
    .add_tag("custom")
    .add_step("transform")
    .build();
```
## Performance
FACT is designed for high-performance scenarios:
- Cache hit latency: < 25ms
- Cache miss latency: < 100ms
- Memory efficient with automatic eviction
- Concurrent processing with Tokio
Run benchmarks to test on your system:
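With standard Cargo tooling (assuming the crate defines benchmark targets):

```shell
# Run the crate's benchmark suite
cargo bench
```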
## Advanced Features
### Custom Cache Configuration

```rust
// Field names on FactConfig are assumptions based on this README.
use fact::{Fact, FactConfig};
use std::time::Duration;

let config = FactConfig {
    cache_capacity: 10_000,
    cache_ttl: Duration::from_secs(300),
    ..Default::default()
};
let fact = Fact::with_config(config);
```
### Async Processing Pipeline

```rust
// Inside an async fn: process items concurrently with
// bounded parallelism via futures' buffer_unordered.
use fact::Fact;
use futures::stream::{self, StreamExt};
use serde_json::json;

let fact = Fact::new();
let items = vec![json!({"a": 1}), json!({"b": 2}), json!({"c": 3})];

let results: Vec<_> = stream::iter(items)
    .map(|item| fact.process("quick-transform", item))
    .buffer_unordered(8)
    .collect::<Vec<_>>()
    .await;
```
## Integration with AI Systems
FACT is designed to work seamlessly with AI and LLM applications:
```rust
// Sketch; `user_query` and the cache methods are illustrative.

// Use FACT as a preprocessing layer for LLM input
let processed = fact.process("analysis-basic", raw_data).await?;
let llm_input = format!("Context: {}\n\nQuery: {}", processed, user_query);

// Cache LLM responses with FACT
let cache_key = "llm_response_xyz";
if let Some(cached) = fact.get_cached(cache_key).await {
    return Ok(cached);
}
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
FACT (Fast Augmented Context Tools) is designed to revolutionize how AI applications handle context and caching, providing a fast, efficient alternative to traditional RAG systems.