Module quantum_transformer

Quantum Transformer Architectures

This module implements quantum transformer models, combining quantum attention mechanisms, quantum position encoding, and multi-head attention to process both quantum and classical data in transformer-style architectures.

Structs§

AttentionOutput: Attention computation result
QuantumAttentionInfo: Quantum attention information
QuantumFeedForward: Quantum feedforward network
QuantumMultiHeadAttention: Quantum multi-head attention module
QuantumPositionEncoding: Quantum position encoding module
QuantumTransformer: Main quantum transformer model
QuantumTransformerConfig: Quantum transformer model configuration
QuantumTransformerLayer: Single quantum transformer layer
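
The quantum attention structs above build on the same core operation as classical transformers. As a point of reference, here is a minimal classical scaled dot-product attention sketch, the operation that `QuantumMultiHeadAttention` generalizes; the matrix representation (`Vec<Vec<f64>>`) and function name are assumptions for illustration, not this module's API:

```rust
// Classical scaled dot-product attention over seq_len x d_k matrices.
// Illustrative sketch only; not the quantum implementation in this module.
fn scaled_dot_product_attention(
    q: &[Vec<f64>],
    k: &[Vec<f64>],
    v: &[Vec<f64>],
) -> Vec<Vec<f64>> {
    let d_k = q[0].len() as f64;
    q.iter()
        .map(|qi| {
            // scores_j = (q_i . k_j) / sqrt(d_k)
            let scores: Vec<f64> = k
                .iter()
                .map(|kj| {
                    qi.iter().zip(kj).map(|(a, b)| a * b).sum::<f64>() / d_k.sqrt()
                })
                .collect();
            // Numerically stable softmax over the scores.
            let m = scores.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
            let exps: Vec<f64> = scores.iter().map(|s| (s - m).exp()).collect();
            let z: f64 = exps.iter().sum();
            // Output row = softmax-weighted sum of the value vectors.
            (0..v[0].len())
                .map(|d| exps.iter().zip(v).map(|(e, vj)| e / z * vj[d]).sum::<f64>())
                .collect()
        })
        .collect()
}

fn main() {
    let x = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    println!("{:?}", scaled_dot_product_attention(&x, &x, &x));
}
```

Each output row is a convex combination of the value rows, so with one-hot values each row's entries sum to 1.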

Enums§

ActivationType: Activation function types for quantum networks
PositionEncodingType: Position encoding types for quantum transformers
QuantumAttentionType: Types of quantum attention mechanisms
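
For context on `PositionEncodingType`, the classical baseline is the sinusoidal scheme, which one of the variants would typically correspond to; the actual variants and their signatures are not shown on this page, so the function below is a hedged classical sketch, not this module's implementation:

```rust
// Classical sinusoidal position encoding: dimension pair (2k, 2k+1)
// shares the frequency 10000^(-2k / d_model), with sin on even and
// cos on odd dimensions. Illustrative sketch only.
fn sinusoidal_encoding(seq_len: usize, d_model: usize) -> Vec<Vec<f64>> {
    (0..seq_len)
        .map(|pos| {
            (0..d_model)
                .map(|i| {
                    let freq = 10000f64.powf(-((2 * (i / 2)) as f64) / d_model as f64);
                    let angle = pos as f64 * freq;
                    if i % 2 == 0 { angle.sin() } else { angle.cos() }
                })
                .collect()
        })
        .collect()
}

fn main() {
    let pe = sinusoidal_encoding(4, 8);
    println!("{:?}", pe[0]);
}
```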

Functions§

create_causal_mask: Helper function to create a causal attention mask
create_padding_mask: Helper function to create a padding mask
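
The signatures of `create_causal_mask` and `create_padding_mask` are not shown on this page; the sketch below shows what such helpers conventionally compute, with the `Vec<Vec<bool>>` mask representation and parameter lists assumed for illustration:

```rust
// Causal mask: position i may attend only to positions j <= i.
// Signature assumed for illustration; not this module's actual API.
fn create_causal_mask(seq_len: usize) -> Vec<Vec<bool>> {
    (0..seq_len)
        .map(|i| (0..seq_len).map(|j| j <= i).collect())
        .collect()
}

// Padding mask: `true` marks real tokens, `false` marks padding,
// given each sequence's true length and the padded maximum length.
fn create_padding_mask(lengths: &[usize], max_len: usize) -> Vec<Vec<bool>> {
    lengths
        .iter()
        .map(|&len| (0..max_len).map(|j| j < len).collect())
        .collect()
}

fn main() {
    println!("{:?}", create_causal_mask(3));
    println!("{:?}", create_padding_mask(&[2, 3], 4));
}
```

In attention, masked-out positions are typically set to negative infinity before the softmax so they receive zero weight.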