# fx-durable-ga
A durable, auditable genetic algorithm optimization library built on PostgreSQL.
## What is this?
fx-durable-ga is designed for long-running genetic algorithm optimizations where durability and auditability matter more than framework speed. It's built for scenarios where fitness evaluations are expensive (seconds to hours) and you need:
- Crash recovery: Resume optimizations exactly where they left off
- Full audit trails: Every evaluation, generation, and decision is recorded
- Concurrent execution: Multiple workers can contribute to the same optimization
- Parameter tracking: Complete history of what was tried
## When to use this
Perfect for:
- AI model hyperparameter optimization
- Neural architecture search
- Feature selection for ML models
- Any optimization where evaluation takes much longer than the GA framework overhead
Not ideal for:
- Fast, in-memory optimizations (use traditional GA libraries)
- Real-time applications requiring sub-second responses
- Simple parameter sweeps (use grid search)
## How it works
The library uses PostgreSQL as both storage and coordination layer:
- Durable state: All populations, genotypes, and evaluations persist in the database
- Event-driven: Optimizations progress through database events, enabling crash recovery
- Deduplication: Identical genomes are never evaluated twice for the same request
- Smart initialization: Latin Hypercube Sampling for better space coverage
- Fitness-based termination: Automatic stopping when target fitness thresholds are reached
Network latency to the database is the primary overhead, but this is negligible when fitness evaluations take seconds or longer.
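To make the "smart initialization" point concrete, here is a minimal sketch of Latin Hypercube Sampling over the unit cube. This is an illustration only, not fx-durable-ga's actual implementation; the tiny xorshift PRNG is just there to keep the example dependency-free.

```rust
// Latin Hypercube Sampling: n points in d dimensions, each coordinate axis
// split into n strata with exactly one sample per stratum.
fn latin_hypercube(n: usize, d: usize, seed: u64) -> Vec<Vec<f64>> {
    let mut state = seed | 1; // xorshift64 state must be nonzero
    let mut next = move || {
        state ^= state << 13;
        state ^= state >> 7;
        state ^= state << 17;
        state
    };

    let mut points = vec![vec![0.0f64; d]; n];
    for dim in 0..d {
        // Assign one stratum per sample, shuffled with Fisher–Yates so each
        // dimension covers all n strata exactly once.
        let mut strata: Vec<usize> = (0..n).collect();
        for i in (1..n).rev() {
            let j = (next() % (i as u64 + 1)) as usize;
            strata.swap(i, j);
        }
        for (i, point) in points.iter_mut().enumerate() {
            // Jitter uniformly within the assigned stratum.
            let jitter = (next() % 1_000) as f64 / 1_000.0;
            point[dim] = (strata[i] as f64 + jitter) / n as f64;
        }
    }
    points
}
```

Compared with plain uniform initialization, this guarantees even coverage of every parameter's range even for small populations, which is why it is a common choice for expensive fitness evaluations.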
## Quick start

```rust
use futures::future::BoxFuture;

// 1. Define your optimization target
// 2. Implement genetic encoding
// 3. Implement fitness evaluation

// 4. Start optimization
let service = bootstrap.await?
    /* builder configuration elided */
    .await?
    .build();

service.new_optimization_request(/* request */).await?;
```
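The three numbered steps above can be sketched concretely. The `Point` target, `Encoding` trait, and `fitness` function below are hypothetical illustrations of the shapes involved, not the crate's actual traits:

```rust
// Hypothetical example types — not fx-durable-ga's real API.

// Step 1: the optimization target.
#[derive(Clone, Debug, PartialEq)]
struct Point {
    x: f64,
    y: f64,
}

// Step 2: a genetic encoding — map between the target and a flat genome.
trait Encoding: Sized {
    fn encode(&self) -> Vec<f64>;
    fn decode(genes: &[f64]) -> Self;
}

impl Encoding for Point {
    fn encode(&self) -> Vec<f64> {
        vec![self.x, self.y]
    }
    fn decode(genes: &[f64]) -> Self {
        Point { x: genes[0], y: genes[1] }
    }
}

// Step 3: a fitness function — here, negated distance to a known target,
// so higher is better and 0.0 is a perfect match.
fn fitness(p: &Point) -> f64 {
    let (tx, ty) = (3.0, 4.0);
    -((p.x - tx).powi(2) + (p.y - ty).powi(2)).sqrt()
}
```

A fitness-based termination threshold (see "How it works") would then be expressed in the same units as `fitness`, e.g. stop once a genome scores above `-0.01`.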
## Documentation and examples
- API documentation: run `cargo doc --open` for comprehensive API docs
- Examples: see `examples/point_search.rs` for a complete working example
- Code documentation: all public APIs include detailed usage examples
## Development setup
- Set up PostgreSQL and configure `DATABASE_URL` with your database URL
- Run migrations using the SQLx CLI: `sqlx migrate run`. If you run into issues with missing relations from jobs or events, use offline mode and prepare the query cache.
- Generate the SQLx cache: `cargo sqlx prepare` (optional, for offline compilation)
- Run the example: `cargo run --example point-search`
- Run tests: `cargo test`
- Generate test coverage: `cargo llvm-cov --html --output-dir coverage`
## Migrations
Running `cargo sqlx prepare` may require setting search-path options in the `DATABASE_URL` variable. For example:

```
DATABASE_URL="postgres://postgres:postgres@localhost:5432/fx-durable-ga?options=-c%20search_path%3Dfx_mq_jobs%2Cfx_event_bus%2Cfx_durable_ga"
```

Keeping the search-path parameters may, however, cause tests to fail, so some juggling between the two settings is currently required.
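One way to handle that juggling, assuming the local database from the example above, is to keep two URL variants and export whichever one the command at hand needs (the variable names here are illustrative):

```shell
# Plain URL — use this for running tests.
DB_BASE="postgres://postgres:postgres@localhost:5432/fx-durable-ga"
# Same URL with the schemas on the search path — use this for prepare.
DB_PREPARE="${DB_BASE}?options=-c%20search_path%3Dfx_mq_jobs%2Cfx_event_bus%2Cfx_durable_ga"

DATABASE_URL="$DB_PREPARE" cargo sqlx prepare
DATABASE_URL="$DB_BASE" cargo test
```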
## Contributing
Contributions are welcome! Please feel free to submit pull requests for bug fixes, improvements, or new features.