Attention Mechanism — dynamic weighting of consciousness layers.
Computes an attention vector over 7 consciousness layers from six query-context signals: query length, emotional energy, session depth, pattern confidence, cache hit rate, and archetype match score.
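As a rough illustration of the idea above, the sketch below maps the six signals to per-layer scores and normalizes them with a softmax. The struct fields mirror the signals listed in the description, but the scoring function and all names here are hypothetical, not the crate's actual API:

```rust
const NUM_LAYERS: usize = 7;

/// Input signals, mirroring the six listed in the module docs.
struct AttentionSignals {
    query_length: f32,
    emotional_energy: f32,
    session_depth: f32,
    pattern_confidence: f32,
    cache_hit_rate: f32,
    archetype_match: f32,
}

/// Turn raw per-layer scores into a probability distribution (softmax).
fn softmax(scores: [f32; NUM_LAYERS]) -> [f32; NUM_LAYERS] {
    let max = scores.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = scores.iter().map(|s| (s - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    let mut out = [0.0; NUM_LAYERS];
    for (o, e) in out.iter_mut().zip(exps) {
        *o = e / sum;
    }
    out
}

/// Illustrative scoring only: which signal drives which layer is an
/// assumption, not taken from the crate.
fn compute_attention(sig: &AttentionSignals) -> [f32; NUM_LAYERS] {
    let scores = [
        sig.pattern_confidence, // Hebbian
        sig.session_depth,      // Mirror
        sig.emotional_energy,   // Resonance
        sig.archetype_match,    // Archetype
        sig.emotional_energy,   // Emotional
        sig.query_length,       // ThoughtGraph
        sig.cache_hit_rate,     // PredictiveCache
    ];
    softmax(scores)
}
```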
Tracks which attention distributions lead to good outcomes (the user stops searching) versus bad outcomes (the user immediately re-queries). Over time, the mechanism learns which layer weights work best.
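One simple way such outcome feedback can update layer weights is a signed nudge toward (or away from) the distribution that was used, followed by renormalization. This is a minimal sketch under that assumption; the crate's actual learning rule may differ:

```rust
const NUM_LAYERS: usize = 7;

/// Hypothetical learning rule: move a baseline weight vector toward
/// attention distributions that produced good outcomes and away from
/// ones that produced bad outcomes, then renormalize to sum to 1.
fn record_outcome(
    baseline: &mut [f32; NUM_LAYERS],
    used: &[f32; NUM_LAYERS],
    good: bool,
    rate: f32,
) {
    let sign = if good { 1.0 } else { -1.0 };
    for (b, u) in baseline.iter_mut().zip(used) {
        *b = (*b + sign * rate * u).max(1e-6); // keep weights positive
    }
    let sum: f32 = baseline.iter().sum();
    for b in baseline.iter_mut() {
        *b /= sum;
    }
}
```

A "good" outcome shifts future weight toward the layers that dominated the successful recall; a "bad" one shifts weight away.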
Binary format: `attention.bin` (magic `ATT1`)
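To show what a magic-tagged binary file like `attention.bin` typically looks like, here is a hedged round-trip sketch: a 4-byte `ATT1` magic followed by the layer weights as little-endian `f32`s. The real on-disk layout almost certainly carries more fields (e.g. recorded outcomes); only the magic is taken from the docs:

```rust
use std::io::{self, Read, Write};

const MAGIC: &[u8; 4] = b"ATT1";
const NUM_LAYERS: usize = 7;

/// Write the magic header plus NUM_LAYERS little-endian f32 weights.
fn write_state<W: Write>(w: &mut W, weights: &[f32; NUM_LAYERS]) -> io::Result<()> {
    w.write_all(MAGIC)?;
    for &x in weights {
        w.write_all(&x.to_le_bytes())?;
    }
    Ok(())
}

/// Validate the magic, then read the weights back.
fn read_state<R: Read>(r: &mut R) -> io::Result<[f32; NUM_LAYERS]> {
    let mut magic = [0u8; 4];
    r.read_exact(&mut magic)?;
    if &magic != MAGIC {
        return Err(io::Error::new(io::ErrorKind::InvalidData, "bad magic"));
    }
    let mut out = [0.0f32; NUM_LAYERS];
    for slot in out.iter_mut() {
        let mut buf = [0u8; 4];
        r.read_exact(&mut buf)?;
        *slot = f32::from_le_bytes(buf);
    }
    Ok(out)
}
```

Checking the magic up front lets a reader reject stale or foreign files before deserializing anything else.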
Structs§
- AttentionOutcome - Recorded outcome for learning.
- AttentionSignals - Input signals for computing attention weights.
- AttentionState
- AttentionVector - Computed attention weights for a single recall.
Constants§
- LAYER_NAMES
- NUM_LAYERS - Layer indices: Hebbian, Mirror, Resonance, Archetype, Emotional, ThoughtGraph, PredictiveCache