
Module attention


Attention Mechanism — dynamic weighting of consciousness layers.

Computes an attention vector over the 7 consciousness layers from six query-context signals: query length, emotional energy, session depth, pattern confidence, cache hit rate, and archetype match score.
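A minimal sketch of that computation. The actual fields of `AttentionSignals` and the scoring coefficients are not shown in these docs, so the struct layout, the per-layer linear scores, and the softmax normalization below are all illustrative assumptions:

```rust
// Hypothetical sketch: field names and layer coefficients are assumptions,
// not the module's real implementation.
const NUM_LAYERS: usize = 7;

struct AttentionSignals {
    query_length: f32,       // all signals assumed normalized to 0..1
    emotional_energy: f32,
    session_depth: f32,
    pattern_confidence: f32,
    cache_hit_rate: f32,
    archetype_match: f32,
}

/// Score each layer from the signals, then softmax so the weights sum to 1.
fn compute_attention(s: &AttentionSignals) -> [f32; NUM_LAYERS] {
    // Illustrative linear scoring: each layer weighs the signals differently.
    let scores = [
        0.5 * s.pattern_confidence + 0.2 * s.session_depth, // Hebbian
        0.6 * s.emotional_energy,                           // Mirror
        0.4 * s.query_length + 0.3 * s.session_depth,       // Resonance
        0.8 * s.archetype_match,                            // Archetype
        0.7 * s.emotional_energy,                           // Emotional
        0.5 * s.pattern_confidence,                         // ThoughtGraph
        0.9 * s.cache_hit_rate,                             // PredictiveCache
    ];
    // Numerically stable softmax over the layer scores.
    let max = scores.iter().cloned().fold(f32::MIN, f32::max);
    let exps: Vec<f32> = scores.iter().map(|x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    let mut out = [0.0; NUM_LAYERS];
    for (o, e) in out.iter_mut().zip(exps) {
        *o = e / sum;
    }
    out
}
```

Whatever the real scoring looks like, the output is a probability distribution: every layer gets a positive weight and the weights sum to 1.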

Tracks which attention distributions lead to good outcomes (the user stops searching) versus bad ones (the user immediately re-queries), and over time learns which layer weights work best.
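One plausible shape for that feedback loop, assuming the docs' good/bad signal and a learned baseline distribution; the field names, learning rates, and update rule here are assumptions, not the module's actual algorithm:

```rust
// Hypothetical sketch of outcome-based learning: nudge a learned baseline
// toward weight vectors that worked, away from ones that were re-queried.
const NUM_LAYERS: usize = 7;

struct AttentionOutcome {
    weights: [f32; NUM_LAYERS], // the attention vector that was used
    good: bool,                 // true = user stopped searching
}

struct AttentionState {
    baseline: [f32; NUM_LAYERS], // learned prior over layer weights
}

impl AttentionState {
    /// Move the baseline toward good distributions, away from bad ones.
    fn record(&mut self, outcome: &AttentionOutcome) {
        let lr = if outcome.good { 0.1 } else { -0.05 }; // assumed rates
        for (b, w) in self.baseline.iter_mut().zip(outcome.weights) {
            *b += lr * (w - *b);
        }
        // Renormalize so the baseline stays a valid distribution.
        let sum: f32 = self.baseline.iter().map(|x| x.max(0.0)).sum();
        for b in self.baseline.iter_mut() {
            *b = b.max(0.0) / sum;
        }
    }
}
```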

Binary format: attention.bin (ATT1)
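The docs only state the file name and the `ATT1` magic; everything else in this encode/decode sketch (field order, little-endian `f32` weights, fixed length) is an assumed layout, not the actual on-disk format:

```rust
// Assumed layout: 4-byte magic "ATT1" followed by 7 little-endian f32 weights.
// Only the magic itself is documented; the rest is illustrative.
const MAGIC: &[u8; 4] = b"ATT1";
const NUM_LAYERS: usize = 7;

/// Serialize a weight vector behind the ATT1 magic.
fn encode(weights: &[f32; NUM_LAYERS]) -> Vec<u8> {
    let mut buf = Vec::with_capacity(4 + NUM_LAYERS * 4);
    buf.extend_from_slice(MAGIC);
    for w in weights {
        buf.extend_from_slice(&w.to_le_bytes());
    }
    buf
}

/// Validate the magic and decode the weights; None for a foreign file.
fn decode(buf: &[u8]) -> Option<[f32; NUM_LAYERS]> {
    if buf.len() != 4 + NUM_LAYERS * 4 || &buf[..4] != MAGIC {
        return None;
    }
    let mut out = [0.0f32; NUM_LAYERS];
    for (i, chunk) in buf[4..].chunks_exact(4).enumerate() {
        out[i] = f32::from_le_bytes(chunk.try_into().unwrap());
    }
    Some(out)
}
```

Checking the magic up front lets the loader reject an unrelated or corrupted `attention.bin` before interpreting any weights.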

Structs§

AttentionOutcome
Recorded outcome for learning.
AttentionSignals
Input signals for computing attention weights.
AttentionState
AttentionVector
Computed attention weights for a single recall.

Constants§

LAYER_NAMES
NUM_LAYERS
Layer indices: Hebbian, Mirror, Resonance, Archetype, Emotional, ThoughtGraph, PredictiveCache
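The docs give the layer order but not the exact declarations; a sketch of what these constants plausibly look like, with the array literal assumed:

```rust
// Assumed declarations: the layer order is documented, the exact types are not.
pub const NUM_LAYERS: usize = 7;
pub const LAYER_NAMES: [&str; NUM_LAYERS] = [
    "Hebbian",
    "Mirror",
    "Resonance",
    "Archetype",
    "Emotional",
    "ThoughtGraph",
    "PredictiveCache",
];
```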