PyTorch-like API for quantum machine learning models
This module provides a familiar PyTorch-style interface for building, training, and deploying quantum ML models, making it easier for classical ML practitioners to adopt quantum algorithms.
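As a rough illustration of the intended workflow, the sketch below assembles and trains a small model from types listed in this module (QuantumSequential, QuantumLinear, QuantumActivation, QuantumMSELoss, QuantumTrainer, MemoryDataLoader). The import path `quantum_nn`, the constructors, the builder-style `add` chaining, and the `fit` method are assumptions made for illustration; the crate's actual signatures may differ.

```rust
// Hypothetical usage sketch: type names come from this module's index,
// but every constructor and method signature below is an assumption.
use quantum_nn::{
    ActivationType, MemoryDataLoader, QuantumActivation, QuantumLinear,
    QuantumMSELoss, QuantumSequential, QuantumTrainer,
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Stack layers PyTorch-style: linear -> activation -> linear.
    let model = QuantumSequential::new()
        .add(QuantumLinear::new(4, 8)?)
        .add(QuantumActivation::new(ActivationType::ReLU))
        .add(QuantumLinear::new(8, 1)?);

    // Wrap some in-memory training data (32 samples, batches of 8).
    let inputs = vec![vec![0.1, 0.2, 0.3, 0.4]; 32];
    let targets = vec![vec![1.0]; 32];
    let loader = MemoryDataLoader::new(inputs, targets, 8)?;

    // Train against a mean squared error objective for 10 epochs.
    let mut trainer = QuantumTrainer::new(model, QuantumMSELoss::default());
    let history = trainer.fit(&loader, 10)?;
    println!("training history: {:?}", history);
    Ok(())
}
```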
Structs§
- CosineAnnealingLR - Cosine annealing learning rate scheduler
- ExponentialLR - Exponential learning rate scheduler
- LSTMState - LSTM cell state
- MemoryDataLoader - Simple in-memory data loader
- Parameter - Quantum parameter wrapper
- PositionalEncoding - Positional encoding for transformers
- QuantumActivation - Quantum activation functions
- QuantumAdaptiveAvgPool2d - Adaptive average pooling 2D
- QuantumAvgPool2d - Average pooling 2D
- QuantumBCELoss - Binary Cross Entropy Loss
- QuantumBCEWithLogitsLoss - Binary Cross Entropy with Logits Loss
- QuantumBatchNorm1d - Batch normalization layer
- QuantumConv1d - 1D Convolution layer
- QuantumConv2d - Quantum convolutional layer
- QuantumConv3d - 3D Convolution layer
- QuantumCrossEntropyLoss - Cross Entropy loss
- QuantumDropout - Dropout layer
- QuantumDropout2d - Dropout2d for convolutional layers
- QuantumEmbedding - Embedding layer for discrete inputs
- QuantumExtendedActivation - Extended activation layer
- QuantumGRU - GRU layer
- QuantumKLDivLoss - Kullback-Leibler Divergence Loss
- QuantumL1Loss - L1 Loss (Mean Absolute Error)
- QuantumLSTM - LSTM layer
- QuantumLayerNorm - Layer normalization
- QuantumLinear - Quantum linear layer
- QuantumMSELoss - Mean Squared Error loss
- QuantumMaxPool2d - Max pooling 2D
- QuantumMultiheadAttention - Multi-head attention layer
- QuantumNLLLoss - Negative Log Likelihood Loss
- QuantumSequential - Sequential container for quantum modules
- QuantumSmoothL1Loss - Smooth L1 Loss (Huber Loss)
- QuantumTrainer - Training utilities
- QuantumTransformerEncoderLayer - Transformer encoder layer
- ReduceLROnPlateau - ReduceLROnPlateau scheduler
- StepLR - Step learning rate scheduler
- TrainingHistory - Training history
Enums§
- ActivationType - Activation function types
- ExtendedActivation - Extended activation function types
- InitType - Parameter initialization types
Traits§
- DataLoader - Data loader trait
- LRScheduler - Learning rate scheduler trait
- QuantumLoss - Loss functions for quantum ML
- QuantumModule - Base trait for all quantum ML modules
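To show how a custom layer might plug into this hierarchy, here is a self-contained sketch that implements a QuantumModule-like trait for a toy phase-shift layer. The trait shape shown (forward, parameters, set_training) is an assumption inferred from the PyTorch-style design described above, not the crate's actual definition.

```rust
// Self-contained sketch: an assumed QuantumModule-like trait and a toy layer.
// The real trait in this module may use different method names and signatures.
trait QuantumModule {
    /// Apply the module to a batch of feature values.
    fn forward(&self, input: &[f64]) -> Vec<f64>;
    /// Expose trainable parameters so an optimizer can update them.
    fn parameters(&mut self) -> Vec<&mut f64>;
    /// Switch between training and evaluation behaviour (e.g. dropout).
    fn set_training(&mut self, training: bool);
}

/// Toy layer that adds a learnable phase offset to every input value.
struct PhaseShift {
    offset: f64,
    training: bool,
}

impl QuantumModule for PhaseShift {
    fn forward(&self, input: &[f64]) -> Vec<f64> {
        input
            .iter()
            .map(|x| {
                let shifted = x + self.offset;
                // In this toy example, evaluation mode skips the nonlinearity.
                if self.training { shifted.sin() } else { shifted }
            })
            .collect()
    }

    fn parameters(&mut self) -> Vec<&mut f64> {
        vec![&mut self.offset]
    }

    fn set_training(&mut self, training: bool) {
        self.training = training;
    }
}

fn main() {
    let mut layer = PhaseShift { offset: 0.25, training: true };
    layer.set_training(false);
    let out = layer.forward(&[0.0, 0.5, 1.0]);
    println!("{out:?}");
}
```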
Functions§
- init_weights - Initialize parameters with specified method