Advanced Neural Network Debugging Utilities
This module provides specialized debugging utilities for modern neural network architectures, with particular focus on transformer models, attention mechanisms, and large-scale training scenarios.
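To illustrate the kind of analysis these utilities perform (the exact crate API is not shown here, so this is a self-contained sketch, not the module's real signature): a common attention diagnostic is the Shannon entropy of each attention-weight row, which distinguishes peaked heads (attending to a single position) from diffuse, near-uniform ones.

```rust
/// Shannon entropy (natural log) of one attention-weight row.
/// Entropy near 0 means the head attends to a single position (peaked);
/// entropy near ln(n) means the weights are close to uniform (diffuse).
fn attention_entropy(weights: &[f64]) -> f64 {
    weights
        .iter()
        .filter(|&&w| w > 0.0) // 0 * ln(0) is taken as 0
        .map(|&w| -w * w.ln())
        .sum()
}

fn main() {
    let peaked = [1.0, 0.0, 0.0, 0.0];
    let uniform = [0.25; 4];
    println!("peaked entropy:  {:.4}", attention_entropy(&peaked)); // 0.0000
    println!("uniform entropy: {:.4}", attention_entropy(&uniform)); // ln(4) ≈ 1.3863
}
```

Statistics like this per-head entropy are the sort of values a type such as `AttentionDistribution` below would aggregate across layers and heads.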
Structs
- AttentionDebugConfig - Configuration for attention debugging
- AttentionDebugger - Advanced attention mechanism debugger for transformer architectures
- AttentionDistribution - Distribution characteristics of attention weights
- AttentionEvolution - Attention evolution across layers
- AttentionHeadAnalysis - Analysis of individual attention head behavior
- AttentionMap - Attention map for a specific layer and head
- CrossLayerAnalysis - Cross-layer attention analysis
- HeadConsistency - Head consistency analysis across layers
- LayerAttentionAnalysis - Analysis results for a transformer layer's attention mechanism
- ModelAttentionSummary - Model-level attention summary
- PatternProgression - Pattern progression analysis
- RedundancyAnalysis - Analysis of attention head redundancy
- TransformerAttentionAnalysis - Complete transformer attention analysis results
- TransformerDebugConfig - Configuration for transformer debugging
- TransformerDebugger - Transformer-specific debugging utilities
Enums
- AttentionHealthStatus - Overall attention health status
- AttentionPattern - Attention pattern types
- EvolutionType - Types of attention evolution patterns
- HeadSpecializationType - Types of attention head specializations commonly found in transformers
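As a sketch of what an attention-pattern classification might look like (the `Pattern` enum and `classify` function here are hypothetical illustrations, not this crate's `AttentionPattern` API): one robust signal is how much attention mass falls near the diagonal, separating local heads from global ones.

```rust
// Hypothetical two-way pattern split: is a head's attention mass
// concentrated near the diagonal (local) or spread across positions (global)?

#[derive(Debug, PartialEq)]
enum Pattern {
    Local,  // most mass within `window` positions of the diagonal
    Global, // mass spread broadly across positions
}

/// Classify a square attention map by the fraction of total weight that
/// lies within `window` positions of the diagonal.
fn classify(map: &[Vec<f64>], window: usize, threshold: f64) -> Pattern {
    let mut near_diag = 0.0;
    let mut total = 0.0;
    for (i, row) in map.iter().enumerate() {
        for (j, &w) in row.iter().enumerate() {
            total += w;
            if i.abs_diff(j) <= window {
                near_diag += w;
            }
        }
    }
    if near_diag / total >= threshold { Pattern::Local } else { Pattern::Global }
}

fn main() {
    // Identity-like map: each token attends only to itself.
    let local: Vec<Vec<f64>> = (0..4)
        .map(|i| (0..4).map(|j| if i == j { 1.0 } else { 0.0 }).collect())
        .collect();
    // Uniform map: each token attends equally to every position.
    let global = vec![vec![0.25; 4]; 4];
    println!("{:?}", classify(&local, 1, 0.9));  // Local
    println!("{:?}", classify(&global, 1, 0.9)); // Global
}
```

A real classifier would also cover patterns such as previous-token, first-token (sink), and block-sparse attention, but the near-diagonal mass ratio shown here is the typical building block.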