# noid
Neural boids. A small neural network replaces Reynolds' three hand-tuned flocking rules (separation, alignment, cohesion) with 1,922 learned parameters.
Each agent perceives its 5 nearest neighbors (topological, not metric) and feeds a 24-number observation into an MLP with two hidden layers that outputs a 2D steering acceleration. No explicit rules. The behavior emerges from the weights.
## Usage
```rust
use noid::Flock;

// Type and method names below are reconstructed and may not match
// the crate's exact API; 1024 is an arbitrary flock size.
let mut flock = Flock::new(1024);
loop {
    flock.tick();
}
```
Training is built in:
```rust
use noid::Trainer;

// Reconstructed sketch; the step method name and the file path
// are illustrative.
let mut trainer = Trainer::new();
for _ in 0..5000 {
    trainer.step();
}
trainer.brain.save_json("brain.json");
```
Weight interpolation:
```rust
use noid::Brain;

// Reconstructed sketch: blend between random and trained weights.
// Method arguments and the file path are illustrative.
let chaos = Brain::random();
let order = Brain::load_json("brain.json");
let halfway = chaos.lerp(&order, 0.5);
```
## Architecture
```
obs(24) → Linear(24×32) → SiLU → Linear(32×32) → SiLU → Linear(32×2) → tanh×60
```
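The diagram above can be sketched as a plain forward pass. Weight layout (row-major), function names, and signatures here are illustrative, not the crate's API:

```rust
// SiLU (swish) activation: x * sigmoid(x).
fn silu(x: f32) -> f32 {
    x * (1.0 / (1.0 + (-x).exp()))
}

// Dense layer y = Wx + b, with W stored row-major as out_dim x in_dim.
fn linear(w: &[f32], b: &[f32], x: &[f32], out_dim: usize, in_dim: usize) -> Vec<f32> {
    (0..out_dim)
        .map(|i| b[i] + (0..in_dim).map(|j| w[i * in_dim + j] * x[j]).sum::<f32>())
        .collect()
}

// obs(24) -> 32 -> 32 -> 2, SiLU between layers, tanh output scaled by 60
// (the maximum steering acceleration from the diagram).
fn forward(
    obs: &[f32; 24],
    w1: &[f32], b1: &[f32],
    w2: &[f32], b2: &[f32],
    w3: &[f32], b3: &[f32],
) -> [f32; 2] {
    let h1: Vec<f32> = linear(w1, b1, obs, 32, 24).into_iter().map(silu).collect();
    let h2: Vec<f32> = linear(w2, b2, &h1, 32, 32).into_iter().map(silu).collect();
    let out = linear(w3, b3, &h2, 2, 32);
    [out[0].tanh() * 60.0, out[1].tanh() * 60.0]
}
```

The layer sizes account for the full parameter budget: (24×32 + 32) + (32×32 + 32) + (32×2 + 2) = 1,922.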
The observation vector:
- Own velocity: [vx, vy]
- Own heading: [sin θ, cos θ]
- 5 nearest neighbors, each: [Δx, Δy, Δvx, Δvy]
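Packing that vector is mechanical: 2 + 2 + 5×4 = 24 numbers. A minimal sketch, assuming heading is derived from the velocity direction and all names are hypothetical:

```rust
// Build the 24-number observation for one agent. `neighbors` holds the
// 5 nearest neighbors as (relative position, relative velocity) pairs.
// Ordering and normalization here are assumptions, not the crate's layout.
fn observe(vel: (f32, f32), neighbors: &[((f32, f32), (f32, f32)); 5]) -> [f32; 24] {
    let mut obs = [0.0f32; 24];
    let (vx, vy) = vel;
    obs[0] = vx;
    obs[1] = vy;
    // Heading as (sin θ, cos θ), derived from the velocity direction.
    let theta = vy.atan2(vx);
    obs[2] = theta.sin();
    obs[3] = theta.cos();
    // Five nearest neighbors: relative position and relative velocity.
    for (i, &((dx, dy), (dvx, dvy))) in neighbors.iter().enumerate() {
        let base = 4 + i * 4;
        obs[base] = dx;
        obs[base + 1] = dy;
        obs[base + 2] = dvx;
        obs[base + 3] = dvy;
    }
    obs
}
```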
## Training
Imitation learning from classic boids: generate random flock configurations, compute what Reynolds' three rules would do, and train the network to match. 3,000 steps is enough.
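The imitation target is the combined steering of the three rules. A single-agent sketch, where the gains (1.5 / 1.0 / 1.0) are placeholder assumptions rather than the crate's tuned values:

```rust
// Reynolds' three rules for one agent, given absolute neighbor positions
// and velocities. Returns the target 2D steering acceleration that the
// network is trained to imitate. Assumes `neighbors` is non-empty.
fn reynolds_target(
    pos: (f32, f32),
    vel: (f32, f32),
    neighbors: &[((f32, f32), (f32, f32))],
) -> (f32, f32) {
    let n = neighbors.len() as f32;
    let mut sep = (0.0f32, 0.0f32);   // separation: push away from close neighbors
    let mut avg_v = (0.0f32, 0.0f32); // alignment: sum of neighbor velocities
    let mut center = (0.0f32, 0.0f32); // cohesion: sum of neighbor positions
    for &((nx, ny), (nvx, nvy)) in neighbors {
        let (dx, dy) = (pos.0 - nx, pos.1 - ny);
        let d2 = (dx * dx + dy * dy).max(1e-6); // avoid division by zero
        sep.0 += dx / d2; // inverse-distance weighting: closer pushes harder
        sep.1 += dy / d2;
        avg_v.0 += nvx;
        avg_v.1 += nvy;
        center.0 += nx;
        center.1 += ny;
    }
    // Alignment steers toward the mean neighbor velocity; cohesion toward
    // the neighbor centroid. Gains are illustrative.
    (
        1.5 * sep.0 + 1.0 * (avg_v.0 / n - vel.0) + 1.0 * (center.0 / n - pos.0),
        1.5 * sep.1 + 1.0 * (avg_v.1 / n - vel.1) + 1.0 * (center.1 / n - pos.1),
    )
}
```

Training then reduces to regression: sample configurations, call a function like this for each agent, and minimize the squared error between the network's output and the target.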
## Performance
tick_fast() uses spatial hashing for O(n) amortized neighbor search and a batched forward pass. The neural inference is a matrix multiplication — one GPU dispatch for the entire flock.
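The spatial-hash idea can be sketched as follows (an assumed structure, not the crate's implementation): bucket agents into grid cells sized to the query radius, then the candidate neighbors for any agent come from the 3×3 block of cells around it, giving O(n) amortized search.

```rust
use std::collections::HashMap;

// Bucket agent indices by integer grid cell of side length `cell`.
fn build_grid(positions: &[(f32, f32)], cell: f32) -> HashMap<(i32, i32), Vec<usize>> {
    let mut grid: HashMap<(i32, i32), Vec<usize>> = HashMap::new();
    for (i, &(x, y)) in positions.iter().enumerate() {
        let key = ((x / cell).floor() as i32, (y / cell).floor() as i32);
        grid.entry(key).or_default().push(i);
    }
    grid
}

// Candidate neighbors: every agent in the 3x3 block of cells around `pos`.
// The 5 nearest are then picked from this small set instead of all n agents.
fn candidates(grid: &HashMap<(i32, i32), Vec<usize>>, pos: (f32, f32), cell: f32) -> Vec<usize> {
    let (cx, cy) = ((pos.0 / cell).floor() as i32, (pos.1 / cell).floor() as i32);
    let mut out = Vec::new();
    for dx in -1..=1 {
        for dy in -1..=1 {
            if let Some(ids) = grid.get(&(cx + dx, cy + dy)) {
                out.extend_from_slice(ids);
            }
        }
    }
    out
}
```

With roughly constant flock density, each 3×3 block holds a bounded number of agents, so the per-agent cost no longer scales with n.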
| | noid | boid |
|---|---|---|
| steering (per agent) | 527 ns | 13 ns |
| full tick, n=1024 | 1.9 ms | 1.1 ms |
| full tick, n=4096 | 17 ms | 14 ms |
The steering itself is 40x more expensive, but neighbor search dominates at scale, so the total gap is ~1.2x on CPU. On GPU, the matmul maps directly to shader cores.
## License
MIT