
Module attention_visualizer


Attention pattern visualization for transformer models

This module provides tools to visualize and analyze attention patterns in transformer models, including multi-head attention, cross-attention, and self-attention mechanisms.

Structs§

AttentionAnalysis
Attention pattern analysis results
AttentionFlow
Attention flow between token positions
AttentionHeatmap
Heatmap data for attention visualization
AttentionVisualizer
Attention pattern visualizer for transformer models
AttentionVisualizerConfig
Configuration for attention visualization
AttentionWeights
Attention weights for a specific layer

Enums§

AttentionType
Type of attention mechanism
ColorScheme
Color scheme options for visualization
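To make the heatmap data concrete, the sketch below shows how raw per-head attention scores are typically normalized into the row-stochastic weight matrix a structure like AttentionHeatmap would render: a numerically stable softmax over each query position's row. This is an illustrative, self-contained example, not the crate's actual API; the function name `softmax_rows` and the plain `Vec<Vec<f64>>` representation are assumptions for the sketch.

```rust
/// Row-wise softmax: converts raw query-key attention scores into
/// weights that sum to 1.0 across each query position's row.
/// (Illustrative helper; not part of the attention_visualizer API.)
fn softmax_rows(scores: &[Vec<f64>]) -> Vec<Vec<f64>> {
    scores
        .iter()
        .map(|row| {
            // Subtract the row maximum before exponentiating for
            // numerical stability with large score magnitudes.
            let max = row.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
            let exps: Vec<f64> = row.iter().map(|s| (s - max).exp()).collect();
            let sum: f64 = exps.iter().sum();
            exps.iter().map(|e| e / sum).collect()
        })
        .collect()
}

fn main() {
    // Raw scores for 3 query tokens attending over 3 key tokens
    // (one head of a self-attention layer).
    let scores = vec![
        vec![2.0, 1.0, 0.1],
        vec![0.5, 0.5, 0.5],
        vec![0.0, 3.0, 1.0],
    ];
    let weights = softmax_rows(&scores);
    for (i, row) in weights.iter().enumerate() {
        let sum: f64 = row.iter().sum();
        println!("query {i}: {row:?} (row sum = {sum:.3})");
    }
}
```

Each row of the resulting matrix is a probability distribution over key positions, which is what a heatmap cell color or an attention-flow edge weight would be derived from.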