Graph Self-Supervised Learning (SSL) methods.
Provides contrastive learning and masked autoencoder approaches for learning graph representations without labels.
| Sub-module | Method | Reference |
|---|---|---|
| `contrastive` | GraphCL, SimGRACE, NT-Xent loss | You et al. 2020; Xia et al. 2022 |
| `masked_autoencoder` | GraphMAE with SCE loss | Hou et al. 2022 |
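The crate's actual `nt_xent_loss` signature is not shown on this page, but the objective it implements can be sketched standalone. The following is a minimal, illustrative Rust implementation of the NT-Xent (normalized temperature-scaled cross-entropy) loss used by GraphCL, where `z1[i]` and `z2[i]` are embeddings of two augmented views of the same graph; all names here are hypothetical, not the crate's API:

```rust
/// Cosine similarity between two embedding vectors.
fn cosine_sim(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let nb: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (na * nb)
}

/// NT-Xent loss over a batch of paired views: for each embedding, its
/// positive is the other view of the same graph; all remaining 2N - 2
/// embeddings in the batch act as negatives. `tau` is the temperature.
fn nt_xent(z1: &[Vec<f64>], z2: &[Vec<f64>], tau: f64) -> f64 {
    let n = z1.len();
    // Stack both views: indices i and i + n are positives of each other.
    let all: Vec<&Vec<f64>> = z1.iter().chain(z2.iter()).collect();
    let mut loss = 0.0;
    for i in 0..2 * n {
        let pos = (i + n) % (2 * n); // index of i's positive pair
        let num = (cosine_sim(all[i], all[pos]) / tau).exp();
        let denom: f64 = (0..2 * n)
            .filter(|&k| k != i) // exclude self-similarity
            .map(|k| (cosine_sim(all[i], all[k]) / tau).exp())
            .sum();
        loss += -(num / denom).ln();
    }
    loss / (2.0 * n as f64)
}
```

Lower temperatures sharpen the softmax over negatives, penalizing hard negatives more strongly; GraphCL typically pairs this loss with two stochastic augmentations such as edge dropping and feature masking.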
Re-exports

pub use contrastive::augment_edges;
pub use contrastive::augment_features;
pub use contrastive::nt_xent_loss;
pub use contrastive::simgrace_perturb;
pub use contrastive::GraphClConfig;
pub use contrastive::ProjectionHead;
pub use masked_autoencoder::GraphMae;
pub use masked_autoencoder::GraphMaeConfig;
pub use pretrain::infonce_loss;
pub use pretrain::AttrReconConfig;
pub use pretrain::AttrReconConfig as AttributeReconConfig;
pub use pretrain::AttributeReconstructionObjective;
pub use pretrain::GraphContextConfig;
pub use pretrain::GraphContextPretrainer;
pub use pretrain::NodeMaskingConfig;
pub use pretrain::NodeMaskingPretrainer;
Modules
- contrastive
- Graph Contrastive Learning: GraphCL (You et al. 2020) and SimGRACE (Xia et al. 2022).
- masked_autoencoder
- Graph Masked Autoencoder (GraphMAE, Hou et al. 2022).
- pretrain
- Graph pre-training strategies for self-supervised learning: node masking, graph-context contrastive, attribute reconstruction.
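The SCE loss used by the `masked_autoencoder` module's GraphMAE can also be sketched independently of the crate's API. GraphMAE replaces MSE reconstruction with a Scaled Cosine Error: the mean over masked nodes of `(1 - cos(x_i, z_i))^gamma`, where `gamma >= 1` down-weights already-well-reconstructed (easy) nodes. A minimal illustrative sketch, with all names hypothetical:

```rust
/// Cosine similarity between two feature vectors.
fn cosine_sim(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm = |v: &[f64]| v.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (norm(a) * norm(b))
}

/// Scaled Cosine Error (SCE): mean over (masked) nodes of
/// (1 - cos(x_i, z_i))^gamma. `features` holds the original node
/// attributes, `recon` the decoder's reconstructions for the same nodes.
fn sce_loss(features: &[Vec<f64>], recon: &[Vec<f64>], gamma: f64) -> f64 {
    let n = features.len() as f64;
    features
        .iter()
        .zip(recon)
        .map(|(x, z)| (1.0 - cosine_sim(x, z)).powf(gamma))
        .sum::<f64>()
        / n
}
```

With `gamma = 1` this reduces to plain cosine error; larger `gamma` focuses the gradient on hard-to-reconstruct nodes, which Hou et al. 2022 report stabilizes training on high-dimensional sparse features.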