Module neural_architecture_search

Self-Optimizing Neural Architecture Search (NAS) System

This module provides a Neural Architecture Search framework that automatically designs neural network architectures for different tasks. It includes multiple search strategies, multi-objective optimization, and meta-learning capabilities for production-ready deployment.

Features:

  • Evolutionary search with advanced mutation operators
  • Differentiable architecture search (DARTS)
  • Progressive search with early stopping
  • Multi-objective optimization (accuracy, latency, memory, energy)
  • Meta-learning for transfer across domains
  • Hardware-aware optimization
  • Automated hyperparameter tuning
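
The evolutionary search with multi-objective scoring listed above can be sketched in a few lines. The following is a minimal, self-contained illustration of the idea, not the crate's actual API: an architecture is encoded as a vector of layer widths, one mutation operator perturbs a random layer, and fitness scalarizes two objectives (a toy capacity proxy for accuracy and a parameter-count penalty for latency). All names (`Arch`, `fitness`, `mutate`, `Lcg`) are hypothetical.

```rust
/// Tiny deterministic LCG so the sketch needs no external crates.
struct Lcg(u64);
impl Lcg {
    fn next(&mut self) -> u64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.0
    }
    fn below(&mut self, n: u64) -> u64 {
        self.next() % n
    }
}

/// Hypothetical architecture encoding: widths of successive layers.
type Arch = Vec<u32>;

/// Toy multi-objective fitness: reward capacity (accuracy proxy),
/// penalize parameter count (latency/memory proxy).
fn fitness(a: &Arch) -> f64 {
    let capacity: f64 = a.iter().map(|&w| (w as f64).ln()).sum();
    let params: f64 = a.windows(2).map(|p| p[0] as f64 * p[1] as f64).sum();
    capacity - 1e-5 * params
}

/// One mutation operator: grow or shrink a random layer's width.
fn mutate(a: &Arch, rng: &mut Lcg) -> Arch {
    let mut child = a.clone();
    let i = rng.below(child.len() as u64) as usize;
    let delta = [8u32, 16, 32][rng.below(3) as usize];
    child[i] = if rng.below(2) == 0 {
        child[i].saturating_add(delta).min(512)
    } else {
        child[i].saturating_sub(delta).max(8)
    };
    child
}

fn main() {
    let mut rng = Lcg(42);
    // Start from a population of identical 3-layer MLP encodings.
    let mut pop: Vec<Arch> = (0..8).map(|_| vec![64, 64, 64]).collect();
    for _gen in 0..50 {
        // Mutate every individual, then keep the best half
        // (truncation selection on scalarized fitness).
        let children: Vec<Arch> = pop.iter().map(|a| mutate(a, &mut rng)).collect();
        pop.extend(children);
        pop.sort_by(|a, b| fitness(b).partial_cmp(&fitness(a)).unwrap());
        pop.truncate(8);
    }
    println!("best architecture: {:?} (fitness {:.3})", pop[0], fitness(&pop[0]));
}
```

The real engine replaces the toy `fitness` with trained-model measurements against the configured `OptimizationObjectives` and `HardwareConstraints`, but the select-mutate-truncate loop is the same shape.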

Structs§

Architecture
Neural architecture representation
ArchitectureMetadata
Architecture metadata
ArchitecturePerformance
Performance metrics for an architecture
BestPractice
Connection
Connection between layers
GlobalConfig
Global architecture configuration
HardwareConstraints
Hardware constraints for architecture search
LayerConfig
Configuration for a single layer
LayerParameters
Layer-specific parameters
MetaKnowledgeBase
NeuralArchitectureSearch
Neural Architecture Search engine
OptimizationObjectives
Multi-objective optimization targets
PerformancePredictor
ProgressiveSearchController
ResourceUsage
SearchConfig
SearchHistory
SearchProgress
SearchResults
Search results
SearchSpace
Search space configuration for neural architectures
SearchStatistics
TransferMapping

Enums§

ActivationType
Activation function types
ArchitecturePattern
Architecture patterns extracted from meta-knowledge
ConnectionType
Connection pattern types
HardwarePlatform
Target hardware platforms
LayerType
Neural network layer types
NASStrategy
Neural Architecture Search strategies
OptimizerType
Optimizer types for training