neuromod 0.1.0


neuromod

Reward-modulated spiking neural networks with biologically plausible learning.

neuromod gives you LIF and Izhikevich neuron models, Poisson spike encoding, spike-timing-dependent plasticity (STDP), and a full neuromodulator system (dopamine, cortisol, acetylcholine) — all in a zero-unsafe, serde-ready Rust crate.


Quick start

Add neuromod to your `Cargo.toml`:

```toml
[dependencies]
neuromod = "0.1"
```

Then feed the engine one telemetry frame per tick:

```rust
use neuromod::engine::SpikingInferenceEngine;
use neuromod::modulators::TelemetryFrame;

fn main() {
    let mut engine = SpikingInferenceEngine::new();

    // Feed one tick of hardware (or simulated) telemetry
    let frame = TelemetryFrame {
        vddcr_gfx_v: 1.0,        // core voltage (V)
        power_w: 250.0,          // board power draw (W)
        hashrate_mh: 0.010,      // work output (MH/s or any normalised unit)
        gpu_temp_c: 72.0,        // die temperature (°C)
        gpu_clock_mhz: 2640.0,   // core clock (MHz)
    };
    engine.step(&frame);

    println!("dopamine      = {:.3}", engine.modulators.dopamine);
    println!("cortisol      = {:.3}", engine.modulators.cortisol);
    println!("acetylcholine = {:.3}", engine.modulators.acetylcholine);
}
```
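The exact telemetry-to-modulator mapping is internal to `SpikingInferenceEngine`, but the general pattern — each level decaying exponentially toward a baseline while telemetry-derived signals push it up — can be sketched standalone. Everything below (the struct, constants, and the specific mapping from hashrate and temperature) is an illustrative assumption, not the crate's actual code:

```rust
/// Illustrative modulator state: each level decays per tick and is
/// nudged by telemetry-derived reward/stress signals. NOT the crate's
/// real implementation — constants and mappings are assumptions.
struct Modulators {
    dopamine: f64,
    cortisol: f64,
    acetylcholine: f64,
}

impl Modulators {
    fn new() -> Self {
        Self { dopamine: 0.0, cortisol: 0.0, acetylcholine: 0.0 }
    }

    /// One tick: work output drives dopamine (reward), die temperature
    /// drives cortisol (stress), acetylcholine drifts toward attention.
    fn step(&mut self, hashrate_mh: f64, gpu_temp_c: f64) {
        const DECAY: f64 = 0.9; // per-tick exponential decay factor
        let reward = (hashrate_mh / 0.010).min(2.0);           // normalised work
        let stress = ((gpu_temp_c - 60.0) / 30.0).clamp(0.0, 1.0);
        self.dopamine = (self.dopamine * DECAY + 0.1 * reward).min(1.0);
        self.cortisol = (self.cortisol * DECAY + 0.1 * stress).min(1.0);
        self.acetylcholine = (self.acetylcholine * DECAY + 0.05).min(1.0);
    }
}
```

With this shape, a single hot, productive tick raises both dopamine and cortisol, and sustained input saturates each level at its cap rather than growing without bound.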

Core types

| Type | Module | Description |
|------|--------|-------------|
| `LifNeuron` | `neurons` | Leaky Integrate-and-Fire — the fast, reactive workhorse |
| `IzhikevichNeuron` | `neurons` | Two-variable biophysical model; supports bursting, chattering, etc. |
| `PoissonEncoder` | `neurons` | Converts a scalar intensity into a stochastic spike train |
| `apply_stdp` | `stdp` | Exponential STDP window, gated by a `dopamine_lr` reward signal |
| `synaptic_scaling` | `stdp` | L1-norm weight normalization (homeostatic plasticity) |
| `NeuroModulators` | `modulators` | Dopamine / cortisol / acetylcholine / tempo state |
| `TelemetryFrame` | `modulators` | Hardware snapshot that drives modulator levels each tick |
| `RewardEvent` | `modulators` | Discrete `WorkAccepted` / `BreakthroughFound` / `SourceSwitch` events |
| `SpikingInferenceEngine` | `engine` | Full 8-LIF + 5-Izhikevich SNN with STDP and homeostatic adaptation |
| `FaultClass` | `diagnostics` | Hardware fault classification enum with canonical error codes |
| `FpgaMetrics` | `diagnostics` | WNS parser for Vivado timing summary reports |
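To make the table concrete: a leaky integrate-and-fire neuron like the `LifNeuron` above takes only a few lines. This is a generic textbook LIF sketch — the field names, constants, and update rule here are assumptions, not neuromod's actual struct:

```rust
/// Minimal leaky integrate-and-fire neuron (generic textbook sketch,
/// not neuromod's actual `LifNeuron`).
struct Lif {
    v: f64,          // membrane potential
    tau: f64,        // leak time constant (ticks)
    threshold: f64,  // spike threshold
    v_reset: f64,    // post-spike reset potential
}

impl Lif {
    fn new() -> Self {
        Self { v: 0.0, tau: 10.0, threshold: 1.0, v_reset: 0.0 }
    }

    /// Integrate one tick of input current; returns true on a spike.
    fn step(&mut self, input: f64) -> bool {
        self.v += -self.v / self.tau + input; // leak, then integrate
        if self.v >= self.threshold {
            self.v = self.v_reset; // fire and reset
            true
        } else {
            false
        }
    }
}
```

Driven with a steady input of 0.2 per tick, this neuron climbs toward a steady-state potential of 2.0 and spikes within a handful of ticks; an input of 0.05 asymptotes at 0.5 and never crosses threshold.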

How STDP works

Classic exponential STDP implements Hebb's rule with a temporal window:

```text
Δw =  A⁺ · exp(−Δt / τ⁺)   if pre fires before post (Δt > 0) → LTP (potentiate)
Δw = −A⁻ · exp( Δt / τ⁻)   if post fires before pre (Δt < 0) → LTD (depress)
```

where Δt = t_post − t_pre.

neuromod multiplies every Δw by a dopamine_lr scalar so that reward (high dopamine) gates how much the network learns on each step — zero dopamine means zero weight change regardless of timing.

```rust
use neuromod::neurons::LifNeuron;
use neuromod::stdp::apply_stdp;

let mut neurons = vec![LifNeuron::new()];
neurons[0].weights = vec![0.5];
neurons[0].last_spike_time = 10;

let pre_times = vec![5_i64]; // pre fired before post → LTP
apply_stdp(&mut neurons, &pre_times, 0.8); // 80% dopamine gate
assert!(neurons[0].weights[0] > 0.5);
```
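The gated update can also be written standalone. Here is a minimal sketch of the exponential window with the dopamine gate multiplied in — the amplitudes and time constants are illustrative placeholders, not the crate's defaults:

```rust
/// Reward-gated exponential STDP (illustrative constants, not the
/// crate's defaults). dt = t_post - t_pre, in ticks: positive dt
/// (pre before post) potentiates, negative dt depresses, and
/// dopamine_lr scales the whole update.
fn stdp_dw(dt: i64, dopamine_lr: f64) -> f64 {
    const A_PLUS: f64 = 0.01;    // LTP amplitude
    const A_MINUS: f64 = 0.012;  // LTD amplitude
    const TAU_PLUS: f64 = 20.0;  // LTP time constant (ticks)
    const TAU_MINUS: f64 = 20.0; // LTD time constant (ticks)
    let dw = if dt >= 0 {
        A_PLUS * (-(dt as f64) / TAU_PLUS).exp()
    } else {
        -A_MINUS * ((dt as f64) / TAU_MINUS).exp()
    };
    dopamine_lr * dw // zero dopamine → zero weight change
}
```

Note how `stdp_dw(5, 0.0)` is exactly zero regardless of timing, and the magnitude of the update decays as |dt| grows in either direction.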

Persistence (save / load)

All public types derive serde::Serialize and serde::Deserialize, so you can checkpoint and restore a running engine in a few lines:

```rust
// Save
engine.save_parameters("checkpoint.json")?;

// Restore
let mut engine2 = SpikingInferenceEngine::new();
engine2.load_parameters("checkpoint.json")?;
```

Or serialize individual neurons directly:

```rust
let json = serde_json::to_string_pretty(&engine.neurons)?;
let restored: Vec<neuromod::neurons::LifNeuron> = serde_json::from_str(&json)?;
```

vs. spiking_neural_networks

spiking_neural_networks (v0.24, ~29k downloads) focuses on biophysical fidelity — Hodgkin-Huxley conductance models, ion channels, detailed compartmental neurons.

neuromod focuses on production-ready reinforcement learning:

  • Reward-modulated STDP — dopamine gates every weight update
  • Full neuromodulator state — dopamine, cortisol, acetylcholine, tempo derived from hardware telemetry each tick
  • Homeostatic plasticity — threshold adaptation + L1 synaptic scaling prevent runaway excitation out of the box
  • First-class persistence — serde + JSON round-trip on every public type

License

Licensed under either of the Apache License, Version 2.0 or the MIT license, at your option.