Crate omega_attention

§Omega Attention - Brain-Like Selective Processing

Implements 39 attention mechanisms inspired by transformer architectures and neuroscience research, plus brain-like attention control systems.

§Features

  • 39 Attention Mechanisms: Flash, Linear, Sparse, Hyperbolic, Graph, Memory-augmented, and more
  • Top-Down Attention: Goal-driven, task-relevant selection
  • Bottom-Up Attention: Stimulus-driven, salience-based capture
  • Working Memory Gating: Input/output/forget gates for WM control
  • Attention Spotlight: Winner-take-all competition
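
To make the mechanism family concrete, here is a minimal scaled dot-product attention in plain Rust. This is an illustrative sketch only: `scaled_dot_product_attention` is a hypothetical free function, not the crate's `ScaledDotProductAttention` type or its API.

```rust
/// Scaled dot-product attention over a single query (illustrative sketch,
/// not the crate's API). `keys` and `values` are row-major: one Vec per position.
fn scaled_dot_product_attention(
    query: &[f64],
    keys: &[Vec<f64>],
    values: &[Vec<f64>],
) -> Vec<f64> {
    let d_k = query.len() as f64;
    // Attention scores: q . k / sqrt(d_k) for each key.
    let scores: Vec<f64> = keys
        .iter()
        .map(|k| query.iter().zip(k).map(|(q, ki)| q * ki).sum::<f64>() / d_k.sqrt())
        .collect();
    // Softmax with max-subtraction for numerical stability.
    let max = scores.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = scores.iter().map(|s| (s - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    let weights: Vec<f64> = exps.iter().map(|e| e / sum).collect();
    // Output is the attention-weighted sum of the value vectors.
    let dim = values[0].len();
    (0..dim)
        .map(|j| weights.iter().zip(values).map(|(w, v)| w * v[j]).sum::<f64>())
        .collect()
}

fn main() {
    let query = vec![1.0, 0.0];
    let keys = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let values = vec![vec![10.0, 0.0], vec![0.0, 10.0]];
    // The query matches the first key more strongly, so the output
    // is pulled toward the first value vector.
    let out = scaled_dot_product_attention(&query, &keys, &values);
    println!("{:?}", out);
}
```

The Flash, Linear, and Sparse variants listed above change how this same weighted sum is computed (tiling, kernel approximation, restricted key sets), not what it means.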

§Architecture

┌─────────────────────────────────────────────────────────────┐
│                   ATTENTION SYSTEM                          │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌────────────────────┐    ┌────────────────────┐           │
│  │   TOP-DOWN         │    │   BOTTOM-UP        │           │
│  │   (Goal-driven)    │    │   (Salience)       │           │
│  │                    │    │                    │           │
│  │  • Task relevance  │    │  • Novelty         │           │
│  │  • Expected value  │    │  • Contrast        │           │
│  │  • Memory match    │    │  • Motion          │           │
│  └────────┬───────────┘    └────────┬───────────┘           │
│           │                         │                       │
│           └───────────┬─────────────┘                       │
│                       ▼                                     │
│           ┌───────────────────────┐                         │
│           │   ATTENTION CONTROL   │                         │
│           │   (Priority Map)      │                         │
│           └───────────┬───────────┘                         │
│                       ▼                                     │
│           ┌───────────────────────┐                         │
│           │  ATTENTION MECHANISMS │                         │
│           │   (39 types)          │                         │
│           └───────────┬───────────┘                         │
│                       ▼                                     │
│           ┌───────────────────────┐                         │
│           │   WORKING MEMORY      │                         │
│           │   (Gated Access)      │                         │
│           └───────────────────────┘                         │
│                                                             │
└─────────────────────────────────────────────────────────────┘
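
The convergence of top-down and bottom-up signals into a priority map can be sketched as follows. This is a minimal illustration under assumed names and an assumed 0.6/0.4 mixing weight: `priority_map` and `spotlight` are hypothetical functions, not the crate's `PriorityMap` or `AttentionController` API.

```rust
/// Combine goal-driven (top-down) weights with stimulus-driven (bottom-up)
/// salience into a priority map. The 0.6/0.4 mix is an illustrative choice,
/// not the crate's actual policy.
fn priority_map(top_down: &[f64], bottom_up: &[f64]) -> Vec<f64> {
    top_down
        .iter()
        .zip(bottom_up)
        .map(|(t, b)| 0.6 * t + 0.4 * b)
        .collect()
}

/// Winner-take-all spotlight: index of the highest-priority location.
fn spotlight(priorities: &[f64]) -> usize {
    priorities
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .map(|(i, _)| i)
        .unwrap()
}

fn main() {
    let goal = [0.9, 0.1, 0.2];      // task relevance per location
    let salience = [0.1, 0.3, 0.95]; // novelty/contrast per location
    let pri = priority_map(&goal, &salience);
    // Goal bias keeps location 0 ahead despite location 2's high salience.
    println!("winner = {}", spotlight(&pri));
}
```

With the top-down signal zeroed out, the same spotlight falls back to pure salience capture, which is the bottom-up path in the diagram.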

Re-exports§

pub use mechanisms::AttentionMechanism;
pub use mechanisms::AttentionType;
pub use mechanisms::AttentionOutput;
pub use mechanisms::ScaledDotProductAttention;
pub use mechanisms::FlashAttention;
pub use mechanisms::LinearAttention;
pub use mechanisms::SparseAttention;
pub use mechanisms::HyperbolicAttention;
pub use mechanisms::GraphAttention;
pub use mechanisms::MemoryAugmentedAttention;
pub use mechanisms::MultiHeadAttention;
pub use controller::AttentionController;
pub use controller::AttentionConfig;
pub use controller::PriorityMap;
pub use working_memory::WorkingMemory;
pub use working_memory::WMGate;
pub use working_memory::WorkingMemoryItem;
pub use salience::SalienceMap;
pub use salience::SalienceComputer;
pub use salience::SalienceFeature;

Modules§

controller
Attention Controller
mechanisms
Attention Mechanisms
salience
Salience Computation
working_memory
Working Memory with Gating
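
The gate-based control described for the working_memory module can be sketched with a leaky-integrator update rule. This is an assumption-laden illustration: `WmSlot` is a hypothetical type, not the crate's `WorkingMemory` or `WMGate`, and the update rule `new = forget * old + input * candidate` is a common gating convention, not necessarily the one this crate uses.

```rust
/// Minimal working-memory slot with input/forget/output gating
/// (illustrative only; not the crate's WorkingMemory API).
/// Gate values lie in [0, 1].
struct WmSlot {
    content: f64,
}

impl WmSlot {
    /// Input gate admits the candidate; forget gate retains old content:
    /// new = forget_gate * old + input_gate * candidate.
    fn update(&mut self, candidate: f64, input_gate: f64, forget_gate: f64) {
        self.content = forget_gate * self.content + input_gate * candidate;
    }

    /// Output gate scales what downstream processing can read.
    fn read(&self, output_gate: f64) -> f64 {
        output_gate * self.content
    }
}

fn main() {
    let mut slot = WmSlot { content: 1.0 };
    slot.update(0.5, 1.0, 0.0); // open input gate, forget the old item
    println!("{}", slot.read(1.0)); // 0.5
    slot.update(9.9, 0.0, 1.0); // closed input gate: content persists
    println!("{}", slot.read(1.0)); // still 0.5
}
```

Closing the input gate shields the slot from distractors, while the output gate controls when stored content influences processing, matching the "Gated Access" stage in the architecture diagram.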

Structs§

AttentionState
Current state of attention system
AttentionSystem
Main attention system orchestrating all components

Enums§

AttentionError
Errors that can occur in the attention module

Type Aliases§

Result