# omega-meta-sona

Self-Optimizing Neural Architecture (META-SONA) with evolutionary search, MCTS-based architecture discovery, and multi-objective fitness evaluation.

Part of the ExoGenesis-Omega cognitive architecture.
## Overview
omega-meta-sona is the intelligence design engine for ExoGenesis Omega. While SONA optimizes weights within a fixed architecture, META-SONA optimizes the architecture itself. It discovers, evaluates, and evolves cognitive architectures using Monte Carlo Tree Search (MCTS), Proximal Policy Optimization (PPO), and multi-objective fitness functions.
META-SONA enables AI systems to design better AI systems—a key capability for recursive self-improvement and open-ended intelligence evolution.
## Features
- Architecture Search: MCTS for exploring the space of possible architectures
- Hyperparameter Optimization: PPO for fine-tuning architecture parameters
- Multi-Objective Fitness: Evaluate capability, efficiency, alignment, and novelty
- Intelligence Factory: High-level API for creating and evolving intelligences
- Lineage Tracking: Full ancestry and evolution history
- Parallel Evaluation: Concurrent fitness assessment for speed
- Type-Safe Design: Strongly typed architecture representation
## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
omega-meta-sona = "0.1.0"
```
## Quick Start

```rust
use omega_meta_sona::{MetaSONA, IntelligenceSpec};

// Assumes a Tokio runtime; field and method shapes follow the examples below.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut meta_sona = MetaSONA::new();

    // Create an intelligence from a default specification
    let spec = IntelligenceSpec::default();
    let intelligence = meta_sona.create_intelligence(&spec).await?;

    println!("Created intelligence: {}", intelligence.id);
    Ok(())
}
```
## Core Concepts

### Architecture Space
META-SONA explores a vast space of possible architectures defined by:
- Paradigms: Neural, Symbolic, Quantum, Biological, Hybrid
- Substrates: Digital, Biological, Social, Cosmic
- Components: Layers, attention mechanisms, memory modules
- Connections: Skip connections, recurrent loops, hierarchies
- Parameters: Learning rates, layer sizes, activation functions
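To make this concrete, here is one way such a search space could be encoded in Rust. This is an illustrative sketch; the type and field names are not the crate's actual API.

```rust
// Illustrative encoding of the architecture search space (not the crate's API).

#[derive(Debug, Clone, PartialEq)]
enum Paradigm { Neural, Symbolic, Quantum, Biological, Hybrid }

#[derive(Debug, Clone, PartialEq)]
enum Substrate { Digital, Biological, Social, Cosmic }

#[derive(Debug, Clone)]
struct Component {
    kind: String,     // e.g. "attention", "memory", "layer"
    params: Vec<f64>, // learning rates, layer sizes, ...
}

#[derive(Debug, Clone)]
struct CandidateArchitecture {
    paradigm: Paradigm,
    substrate: Substrate,
    components: Vec<Component>,
    connections: Vec<(usize, usize)>, // directed edges between components
}

fn main() {
    let arch = CandidateArchitecture {
        paradigm: Paradigm::Hybrid,
        substrate: Substrate::Digital,
        components: vec![
            Component { kind: "attention".into(), params: vec![0.001, 512.0] },
            Component { kind: "memory".into(), params: vec![1024.0] },
        ],
        connections: vec![(0, 1)], // attention feeds the memory module
    };
    assert_eq!(arch.components.len(), 2);
    println!("{:?}", arch.paradigm);
}
```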
### MCTS Architecture Search
Monte Carlo Tree Search explores the architecture space:
1. Selection: Pick a promising architecture branch using UCB1
2. Expansion: Generate new architecture variations
3. Simulation: Evaluate architecture fitness
4. Backpropagation: Update node statistics
The search balances exploration (novel architectures) vs. exploitation (refining promising designs).
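The exploration/exploitation balance comes from the UCB1 rule used in the selection step, which can be sketched as follows. This is a minimal illustration, not the crate's internals; the constant `c` trades exploration against exploitation.

```rust
// UCB1 score for a search-tree node: mean value plus an exploration bonus
// that shrinks as the node accumulates visits.
fn ucb1(total_value: f64, visits: u64, parent_visits: u64, c: f64) -> f64 {
    if visits == 0 {
        return f64::INFINITY; // unvisited children are explored first
    }
    let mean = total_value / visits as f64;
    mean + c * ((parent_visits as f64).ln() / visits as f64).sqrt()
}

fn main() {
    // A rarely visited child can outscore a well-explored one even with a
    // similar mean, which is how the search keeps probing novel architectures.
    let explored = ucb1(50.0, 100, 200, 1.4); // mean 0.5, small bonus
    let novel = ucb1(3.0, 5, 200, 1.4);       // mean 0.6, large bonus
    assert!(novel > explored);
}
```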
### Multi-Objective Fitness

Architectures are evaluated across four dimensions: capability, efficiency, alignment, and novelty.
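A common way to collapse these four scores into a single fitness value is a weighted sum. The sketch below assumes that aggregation; the crate's actual combination rule and field names may differ.

```rust
// Illustrative fitness record and weighted-sum aggregation (names assumed).
struct Fitness {
    capability: f64,
    efficiency: f64,
    alignment: f64,
    novelty: f64,
}

// Weights order: [capability, efficiency, alignment, novelty].
fn scalar_fitness(f: &Fitness, w: &[f64; 4]) -> f64 {
    w[0] * f.capability + w[1] * f.efficiency + w[2] * f.alignment + w[3] * f.novelty
}

fn main() {
    let f = Fitness { capability: 0.9, efficiency: 0.4, alignment: 0.8, novelty: 0.3 };
    // Equal weights average the four dimensions.
    let balanced = scalar_fitness(&f, &[0.25, 0.25, 0.25, 0.25]);
    assert!((balanced - 0.6).abs() < 1e-9);
}
```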
### PPO Optimization
After MCTS finds promising architectures, PPO fine-tunes hyperparameters:
- Continuous optimization of architecture parameters
- Generalized Advantage Estimation (GAE) for gradient estimation
- Clipped surrogate objective prevents destructive updates
- Adaptive learning rates for stable convergence
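The clipped surrogate objective mentioned above can be written out for a single sample as follows. This is the standard PPO formulation, not crate-specific code.

```rust
// PPO clipped surrogate for one sample, where ratio = pi_new(a|s) / pi_old(a|s).
// Clipping the ratio to [1 - eps, 1 + eps] bounds how far a single update can
// move the policy, which is what prevents destructive updates.
fn clipped_surrogate(ratio: f64, advantage: f64, eps: f64) -> f64 {
    let unclipped = ratio * advantage;
    let clipped = ratio.clamp(1.0 - eps, 1.0 + eps) * advantage;
    unclipped.min(clipped) // pessimistic bound: take the smaller objective
}

fn main() {
    // Ratio 1.5 exceeds the clip range [0.8, 1.2], so the objective is capped
    // at 1.2 * 2.0 = 2.4 instead of 3.0.
    assert!((clipped_surrogate(1.5, 2.0, 0.2) - 2.4).abs() < 1e-12);
    // A ratio inside the range is left untouched: 0.9 * 2.0 = 1.8.
    assert!((clipped_surrogate(0.9, 2.0, 0.2) - 1.8).abs() < 1e-12);
}
```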
## Use Cases

### 1. Creating a Custom Intelligence
```rust
use omega_meta_sona::{MetaSONA, IntelligenceSpec};

let mut meta_sona = MetaSONA::new();

// Specification for a fast, efficient agent
// (weight field names are illustrative; they mirror the four fitness dimensions)
let spec = IntelligenceSpec {
    efficiency_weight: 0.6,
    capability_weight: 0.2,
    alignment_weight: 0.1,
    novelty_weight: 0.1,
    ..Default::default()
};

let intelligence = meta_sona.create_intelligence(&spec).await?;
println!("Intelligence ID: {}", intelligence.id);
println!("Fitness: {:?}", intelligence.fitness);
```
### 2. Evolving an Architecture
```rust
use omega_meta_sona::{MetaSONA, Architecture};
use chrono::Utc;

let mut meta_sona = MetaSONA::new();

// Start with a base architecture (field names illustrative)
let base = Architecture {
    id: "base-v1".into(),
    created_at: Utc::now(),
    ..Default::default()
};

// Evolve for 5 generations
let evolved = meta_sona.evolve_architecture(&base, 5).await?;

println!("Evolved ID: {}", evolved.id);
println!("Generation: {}", evolved.generation);
println!("Lineage: {:?}", evolved.lineage);
println!("Fitness: {:?}", evolved.fitness);
```
### 3. Multi-Objective Optimization
```rust
use omega_meta_sona::{MetaSONA, IntelligenceSpec};

let mut meta_sona = MetaSONA::new();

// Balanced across all objectives (weight field names illustrative)
let balanced_spec = IntelligenceSpec {
    capability_weight: 0.25, efficiency_weight: 0.25,
    alignment_weight: 0.25, novelty_weight: 0.25,
    ..Default::default()
};
let balanced = meta_sona.create_intelligence(&balanced_spec).await?;

// Capability-focused
let capability_spec = IntelligenceSpec { capability_weight: 0.7, ..balanced_spec.clone() };
let powerful = meta_sona.create_intelligence(&capability_spec).await?;

// Efficiency-focused
let efficiency_spec = IntelligenceSpec { efficiency_weight: 0.7, ..balanced_spec };
let efficient = meta_sona.create_intelligence(&efficiency_spec).await?;
```
### 4. Hybrid Architecture Discovery
```rust
use omega_meta_sona::{MetaSONA, IntelligenceSpec, Paradigm};

let mut meta_sona = MetaSONA::new();

// Search for a hybrid neural-symbolic architecture (field names illustrative)
let spec = IntelligenceSpec {
    paradigm: Paradigm::Hybrid,
    ..Default::default()
};

let hybrid = meta_sona.create_intelligence(&spec).await?;
println!("Paradigm: {:?}", hybrid.architecture.paradigm);
println!("Components: {}", hybrid.architecture.components.len());
```
### 5. Lineage Tracking and Analysis
```rust
use omega_meta_sona::{MetaSONA, IntelligenceSpec};

let mut meta_sona = MetaSONA::new();

// Create base architecture
let spec = IntelligenceSpec::default();
let gen0 = meta_sona.create_intelligence(&spec).await?;

// Evolve through multiple generations (one generation at a time)
let gen1_arch = meta_sona.evolve_architecture(&gen0.architecture, 1).await?;
let gen2_arch = meta_sona.evolve_architecture(&gen1_arch, 1).await?;
let gen3_arch = meta_sona.evolve_architecture(&gen2_arch, 1).await?;

// Analyze lineage
println!("Lineage of {}:", gen3_arch.id);
for (i, ancestor) in gen3_arch.lineage.iter().enumerate() {
    println!("  Generation {}: {}", i, ancestor);
}
println!("Total generations: {}", gen3_arch.generation);
println!("Fitness: {:?}", gen3_arch.fitness);
```
## Examples

### Intelligence Factory Workflow
```rust
use omega_meta_sona::{IntelligenceFactory, IntelligenceSpec};

let mut factory = IntelligenceFactory::new();

// Create multiple specialized intelligences (specs illustrative)
let specs = vec![IntelligenceSpec::default(), IntelligenceSpec::default()];

let mut intelligences = Vec::new();
for spec in specs {
    let intelligence = factory.create_intelligence(&spec).await?;
    println!("Created: {}", intelligence.id);
    intelligences.push(intelligence);
}
```
### Custom Fitness Evaluation
```rust
use omega_meta_sona::FitnessEvaluator;

let mut evaluator = FitnessEvaluator::new();

// Customize metric weights: capability, efficiency, alignment, novelty
// (argument shape illustrative)
evaluator.set_weights(0.4, 0.3, 0.2, 0.1);

// Evaluate architecture
let fitness = evaluator.evaluate(&architecture).await?;

println!("Capability: {:.3}", fitness.capability);
println!("Efficiency: {:.3}", fitness.efficiency);
println!("Alignment: {:.3}", fitness.alignment);
println!("Novelty: {:.3}", fitness.novelty);
println!("Overall: {:.3}", fitness.overall);
```
## Architecture

META-SONA's internal structure:

```text
┌──────────────────────────────────────────┐
│                MetaSONA                  │
│  - High-level orchestration              │
│  - Intelligence creation API             │
└────────────┬─────────────────────────────┘
             │
             ▼
┌──────────────────────────────────────────┐
│          IntelligenceFactory             │
│  - Specification processing              │
│  - Architecture assembly                 │
│  - Evolution coordination                │
└──┬─────────┬──────────────┬──────────────┘
   │         │              │
   ▼         ▼              ▼
┌──────┐ ┌───────┐ ┌──────────────┐
│ MCTS │ │  PPO  │ │   Fitness    │
│Search│ │  Opt. │ │  Evaluator   │
└──────┘ └───────┘ └──────────────┘
   │         │              │
   ▼         ▼              ▼
┌──────────────────────────────────────────┐
│            ArchitectureSpace             │
│  - Encoding/decoding                     │
│  - Mutation operators                    │
│  - Crossover operators                   │
└──────────────────────────────────────────┘
```
## Performance
META-SONA performance characteristics:
- MCTS Search: ~100-1000 iterations for good results
- Architecture Evaluation: ~10-100ms per candidate
- PPO Optimization: ~50-200 steps for convergence
- Total Creation Time: 1-10 seconds per intelligence
Parallelization:
- Multiple architecture evaluations in parallel
- Batch PPO updates for efficiency
- Async MCTS simulations
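The parallel evaluation pattern can be sketched with plain threads as below. The crate itself uses async tasks, and `evaluate` here is a placeholder fitness function, not the real evaluator.

```rust
use std::thread;

// Placeholder fitness: any pure function of the candidate id stands in for
// the real (and much slower) architecture evaluation.
fn evaluate(candidate: u64) -> f64 {
    ((candidate as f64) * 0.1).sin().abs()
}

fn main() {
    // Evaluate eight candidate architectures concurrently, one thread each.
    let handles: Vec<_> = (0..8u64)
        .map(|c| thread::spawn(move || (c, evaluate(c))))
        .collect();

    // Collect (candidate, score) pairs as the workers finish.
    let scores: Vec<(u64, f64)> = handles.into_iter().map(|h| h.join().unwrap()).collect();

    assert_eq!(scores.len(), 8);
    assert!(scores.iter().all(|(_, s)| (0.0..=1.0).contains(s)));
}
```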
## Related Crates
- `omega-core` - Core architecture types
- `omega-loops` - Transformative loop for long-term evolution
- `omega-memory` - Memory of successful architectures
- `omega-agentdb` - Storage for architecture variants
- `omega-persistence` - Persisting evolved architectures
- `omega-runtime` - Runtime deployment of created intelligences
- `omega-brain` - Unified cognitive architecture
- `omega-snn` - Neural substrate for evolved architectures
## License
Licensed under the MIT License. See LICENSE for details.