Self-Optimizing Neural Architecture Search (NAS) System
This module provides an advanced Neural Architecture Search framework that automatically designs neural network architectures tailored to different tasks. It includes multiple search strategies, multi-objective optimization, and meta-learning capabilities for production-ready deployment.
Features:
- Evolutionary search with advanced mutation operators
- Differentiable architecture search (DARTS)
- Progressive search with early stopping
- Multi-objective optimization (accuracy, latency, memory, energy); see the scalarization sketch after this list
- Meta-learning for transfer across domains
- Hardware-aware optimization
- Automated hyperparameter tuning
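As a concrete illustration of the multi-objective point above, one common approach is weighted-sum scalarization: normalize each cost objective against a budget and fold everything into a single score. The sketch below is illustrative only; `CandidateMetrics`, `ObjectiveWeights`, `scalarize`, and the budget values are hypothetical names, not this module's API (the module's actual targets live in `OptimizationObjectives`, listed under Structs below).

```rust
/// Measured metrics for one candidate architecture (hypothetical type).
struct CandidateMetrics {
    accuracy: f64,   // higher is better, in [0, 1]
    latency_ms: f64, // lower is better
    memory_mb: f64,  // lower is better
    energy_mj: f64,  // lower is better
}

/// Relative importance of each objective (hypothetical type).
struct ObjectiveWeights {
    accuracy: f64,
    latency: f64,
    memory: f64,
    energy: f64,
}

/// Collapse the four objectives into one fitness value. Cost-type
/// objectives are normalized against a budget and subtracted, so a
/// higher score is always better.
fn scalarize(
    m: &CandidateMetrics,
    w: &ObjectiveWeights,
    latency_budget_ms: f64,
    memory_budget_mb: f64,
    energy_budget_mj: f64,
) -> f64 {
    w.accuracy * m.accuracy
        - w.latency * (m.latency_ms / latency_budget_ms)
        - w.memory * (m.memory_mb / memory_budget_mb)
        - w.energy * (m.energy_mj / energy_budget_mj)
}

fn main() {
    let metrics = CandidateMetrics {
        accuracy: 0.92,
        latency_ms: 14.0,
        memory_mb: 96.0,
        energy_mj: 42.0,
    };
    let weights = ObjectiveWeights {
        accuracy: 1.0,
        latency: 0.25,
        memory: 0.15,
        energy: 0.10,
    };
    let score = scalarize(&metrics, &weights, 20.0, 128.0, 50.0);
    println!("candidate fitness: {score:.4}");
}
```

Fixed weights are the simplest form to drop into an evolutionary fitness function; a Pareto-front formulation avoids committing to weights but is more involved to rank.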
Structs§
- Architecture - Neural architecture representation
- ArchitectureMetadata - Architecture metadata
- ArchitecturePerformance - Performance metrics for an architecture
- BestPractice
- Connection - Connection between layers
- GlobalConfig - Global architecture configuration
- HardwareConstraints - Hardware constraints for architecture search
- LayerConfig - Configuration for a single layer
- LayerParameters - Layer-specific parameters
- MetaKnowledgeBase
- NeuralArchitectureSearch - Neural Architecture Search engine
- OptimizationObjectives - Multi-objective optimization targets
- PerformancePredictor
- ProgressiveSearchController
- ResourceUsage
- SearchConfig
- SearchHistory
- SearchProgress
- SearchResults - Search results
- SearchSpace - Search space configuration for neural architectures
- SearchStatistics
- TransferMapping
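To show how these pieces fit together conceptually, here is a minimal, dependency-free sketch of the evolutionary loop a `NeuralArchitectureSearch`-style engine runs: evaluate a population, keep the fittest, and refill with mutated survivors. Every type and function in it (`Candidate`, `fitness`, `mutate`, the toy PRNG) is a hypothetical stand-in, not this module's API.

```rust
/// A candidate "architecture": just a list of hidden-layer widths.
#[derive(Clone, Debug)]
struct Candidate {
    layer_widths: Vec<usize>,
}

/// Tiny xorshift PRNG so the sketch needs no external crates.
struct Rng(u64);
impl Rng {
    fn next(&mut self) -> u64 {
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        self.0
    }
    fn below(&mut self, n: usize) -> usize {
        (self.next() % n as u64) as usize
    }
}

/// Toy fitness: reward tapering widths, penalize parameter count. A real
/// engine would train and evaluate each candidate (or consult a
/// surrogate such as a performance predictor) instead.
fn fitness(c: &Candidate) -> f64 {
    let params: usize = c.layer_widths.windows(2).map(|w| w[0] * w[1]).sum();
    let taper: f64 = c.layer_widths.windows(2).filter(|w| w[0] >= w[1]).count() as f64;
    taper - params as f64 / 10_000.0
}

/// One mutation operator: resize a randomly chosen layer.
fn mutate(c: &Candidate, rng: &mut Rng) -> Candidate {
    let mut child = c.clone();
    let i = rng.below(child.layer_widths.len());
    child.layer_widths[i] = 8 + rng.below(120);
    child
}

fn main() {
    let mut rng = Rng(0x9E37_79B9_7F4A_7C15);
    // Initial population: sixteen random four-layer candidates.
    let mut population: Vec<Candidate> = (0..16)
        .map(|_| Candidate { layer_widths: (0..4).map(|_| 8 + rng.below(120)).collect() })
        .collect();

    for generation in 0..20 {
        // Rank by descending fitness, keep the top half as survivors.
        population.sort_by(|a, b| fitness(b).partial_cmp(&fitness(a)).unwrap());
        population.truncate(8);
        // Refill the population with mutated copies of the survivors.
        let mut offspring: Vec<Candidate> =
            population.iter().map(|p| mutate(p, &mut rng)).collect();
        population.append(&mut offspring);
        println!("gen {generation}: best fitness {:.3}", fitness(&population[0]));
    }
    population.sort_by(|a, b| fitness(b).partial_cmp(&fitness(a)).unwrap());
    println!("best architecture: {:?}", population[0]);
}
```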
Enums§
- ActivationType - Activation function types
- ArchitecturePattern - Architecture patterns extracted from meta-knowledge
- ConnectionType - Connection pattern types
- HardwarePlatform - Target hardware platforms
- LayerType - Neural network layer types
- NASStrategy - Neural Architecture Search strategies
- OptimizerType - Optimizer types for training
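The layer and activation enums lend themselves to cheap static analysis of a candidate before any training happens. The sketch below uses hypothetical mirrors of `LayerType` and `ActivationType` (the variants and fields are assumptions, not the real definitions) to estimate a layer's trainable-parameter count and evaluate an activation.

```rust
#[derive(Debug)]
enum LayerType {
    Dense { inputs: usize, outputs: usize },
    Conv2d { in_ch: usize, out_ch: usize, kernel: usize },
    Dropout { rate: f64 },
}

#[derive(Debug)]
enum ActivationType {
    ReLU,
    Sigmoid,
    Tanh,
}

/// Trainable-parameter count for a single layer (weights + biases).
fn param_count(layer: &LayerType) -> usize {
    match layer {
        LayerType::Dense { inputs, outputs } => inputs * outputs + outputs,
        LayerType::Conv2d { in_ch, out_ch, kernel } => in_ch * out_ch * kernel * kernel + out_ch,
        LayerType::Dropout { .. } => 0, // dropout has no trainable parameters
    }
}

/// Apply an activation function to a single pre-activation value.
fn apply(activation: &ActivationType, x: f64) -> f64 {
    match activation {
        ActivationType::ReLU => x.max(0.0),
        ActivationType::Sigmoid => 1.0 / (1.0 + (-x).exp()),
        ActivationType::Tanh => x.tanh(),
    }
}

fn main() {
    let layers = [
        LayerType::Conv2d { in_ch: 3, out_ch: 16, kernel: 3 },
        LayerType::Dense { inputs: 16 * 8 * 8, outputs: 10 },
        LayerType::Dropout { rate: 0.2 },
    ];
    let total: usize = layers.iter().map(param_count).sum();
    for layer in &layers {
        println!("{layer:?}: {} params", param_count(layer));
    }
    println!("total trainable parameters: {total}");
    println!("ReLU(-1.5) = {}", apply(&ActivationType::ReLU, -1.5));
}
```

Keeping such estimates enum-driven means hardware-aware filtering (against `HardwareConstraints`-style budgets) can prune infeasible candidates without touching a training loop.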