§FEAGI - Framework for Evolutionary Artificial General Intelligence
FEAGI is a pure neural computation framework for building artificial general intelligence through evolutionary principles. This crate provides the core algorithms without any I/O dependencies.
§Quick Start
[dependencies]
feagi = "0.0.1" # Umbrella crate (default: std + full features)§Feature Flags
§Platform Targets
- std (default): Standard Rust (Linux, macOS, Windows, Docker)
- no_std: RTOS/embedded targets (FreeRTOS, Zephyr, bare-metal); see the sketch below
- wasm: WebAssembly support
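For an embedded/no_std build, the usual Cargo pattern is to disable the default std feature and select only the compute components. A minimal sketch, assuming no_std is reached by turning off default features rather than via a dedicated flag:

[dependencies]
feagi = { version = "0.0.1", features = ["compute"], default-features = false }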
§Component Selection
- full (default): All components
- compute: Just NPU + state (no I/O)
- io: PNS + agent SDK (requires compute)
§Individual Components
- burst-engine: NPU execution
- brain-development: Neurogenesis
- plasticity: Synaptic learning
- state-manager: Runtime state
- serialization: Connectome I/O
- pns: ZMQ/UDP transport
- agent-sdk: Rust agent library
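Individual component features can also be combined directly, mirroring the inference example further below; the exact feature set here is illustrative:

[dependencies]
feagi = { version = "0.0.1", features = ["burst-engine", "plasticity", "state-manager"] }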
§Usage Examples
§Full FEAGI (all features)
[dependencies]
feagi = "0.0.1"use feagi::burst_engine::{backend::CPUBackend, RustNPU};
use feagi_npu_runtime::StdRuntime;
// Create NPU
let mut npu = RustNPU::<StdRuntime, f32, CPUBackend>::new(
    StdRuntime, CPUBackend::new(), 100_000, 1_000_000, 20,
)?;
// Run burst
let result = npu.process_burst()?;

§Inference Only (no neurogenesis)
[dependencies]
feagi = { version = "0.0.1", features = ["burst-engine", "serialization"] }

use feagi::burst_engine::{backend::CPUBackend, DynamicNPU};
use feagi::serialization::load_connectome;
use feagi_npu_runtime::StdRuntime;
// Load pre-trained brain (snapshot usage pending import API refactor)
let _snapshot = load_connectome("brain.connectome")?;
// Create NPU for inference
let mut npu = DynamicNPU::new_f32(StdRuntime, CPUBackend::new(), 100_000, 1_000_000, 20)?;
// Run inference
loop {
    let result = npu.process_burst()?;
    // ... process results
}

§WASM Deployment
[dependencies]
feagi = { version = "0.0.1", features = ["wasm", "compute"], default-features = false }

§Architecture
┌─────────────────────────────────────────────────────────┐
│ Foundation: feagi-types │
│ (Neuron, Synapse, CorticalArea) │
└─────────────────────────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ Infrastructure: feagi-state-manager │
│ (Runtime state, lock-free operations) │
└─────────────────────────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ Algorithms: burst-engine, brain-development, plasticity │
│ (Pure neural computation, no I/O) │
└─────────────────────────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ I/O: feagi-io, feagi-agent-sdk │
│ (ZMQ/UDP transport, agent communication) │
└─────────────────────────────────────────────────────────┘
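The layers in the diagram map onto this crate's re-exports (see the Re-exports section at the bottom of this page). A quick orientation sketch; which paths are actually compiled in depends on the enabled features:

#![allow(unused_imports)]

// Orientation only: the architecture layers expressed as umbrella-crate paths.
use feagi::types;         // Foundation: Neuron, Synapse, CorticalArea
use feagi::state_manager; // Infrastructure: runtime state
use feagi::burst_engine;  // Algorithms: NPU execution
use feagi::bdu;           // Algorithms: neurogenesis (brain development)
use feagi::plasticity;    // Algorithms: synaptic learning
use feagi::io;            // I/O: PNS transport
use feagi::agent;         // I/O: agent SDK

fn main() {}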
§Platform Support
- ✅ Linux (x86_64, ARM64)
- ✅ macOS (Intel, Apple Silicon)
- ✅ Windows (x86_64)
- ✅ Docker / Kubernetes
- ✅ RTOS (FreeRTOS, Zephyr) via no_std
- ✅ WebAssembly via wasm
§Performance
- State reads: 5-20 nanoseconds (lock-free atomic; see the sketch below)
- Burst cycle: 100-1000 Hz (depends on genome size)
- Neurons: Tested up to 10M neurons
- Synapses: Tested up to 100M synapses
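The nanosecond-scale state reads come from lock-free atomics rather than mutexes. Purely as an illustration of that pattern, using std atomics and hypothetical names (this is not the feagi-state-manager API):

use std::sync::atomic::{AtomicU64, Ordering};

// Hypothetical stand-in for a piece of runtime state; readers never block.
struct BurstCounter {
    bursts: AtomicU64,
}

impl BurstCounter {
    const fn new() -> Self {
        Self { bursts: AtomicU64::new(0) }
    }

    // Reader side: a single atomic load, no lock taken.
    fn read(&self) -> u64 {
        self.bursts.load(Ordering::Acquire)
    }

    // Writer side (e.g. the burst engine): atomic increment, also lock-free.
    fn record_burst(&self) {
        self.bursts.fetch_add(1, Ordering::Release);
    }
}

fn main() {
    let state = BurstCounter::new();
    state.record_burst();
    assert_eq!(state.read(), 1);
}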
§Related Crates
- feagi-data-processing: Foundation data structures
- feagi-io: I/O layer (PNS, agent SDK) - separate repo
- feagi-py: Python bindings - separate repo
- feagi-connector: Python agent SDK - separate repo
§License
Apache-2.0
Re-exports§
pub use feagi_state_manager as state_manager;
pub use feagi_npu_burst_engine as burst_engine;
pub use feagi_brain_development as bdu;
pub use feagi_npu_plasticity as plasticity;
pub use feagi_io as io;
pub use feagi_agent as agent;
Modules§
- prelude: Prelude - commonly used types and traits
- serialization: FEAGI Connectome I/O
- types: Neural Types Module