//! Graph Self-Supervised Learning (SSL) methods.
//!
//! Provides contrastive learning and masked autoencoder approaches for
//! learning graph representations without labels.
//!
//! | Sub-module | Method | Reference |
//! |---|---|---|
//! | [`contrastive`] | GraphCL, SimGRACE, NT-Xent loss | You et al. 2020; Xia et al. 2022 |
//! | [`masked_autoencoder`] | GraphMAE with SCE loss | Hou et al. 2022 |

// Contrastive learning
pub use contrastive::*;
// Masked autoencoder
pub use masked_autoencoder::*;
// Pre-training strategies: node masking, graph-context contrastive, attribute reconstruction
pub use pretraining::*; // `pretraining` module name assumed; the path is missing from the doc table
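To make the two losses named in the table concrete, here is a minimal, standalone sketch of NT-Xent (the contrastive objective from You et al. 2020) and GraphMAE's scaled cosine error (SCE, Hou et al. 2022). This is an illustrative re-implementation over plain `Vec<f64>` embeddings, not this crate's API; the function names `cosine`, `nt_xent`, and `sce` are chosen here for illustration.

```rust
/// Cosine similarity between two dense embedding vectors.
fn cosine(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let nb: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (na * nb)
}

/// NT-Xent loss for one positive pair (i, j) within a batch of embeddings:
/// -log( exp(sim(z_i, z_j)/t) / sum_{k != i} exp(sim(z_i, z_k)/t) ).
/// `temp` is the temperature hyper-parameter.
fn nt_xent(batch: &[Vec<f64>], i: usize, j: usize, temp: f64) -> f64 {
    let pos = (cosine(&batch[i], &batch[j]) / temp).exp();
    let denom: f64 = batch
        .iter()
        .enumerate()
        .filter(|(k, _)| *k != i)
        .map(|(_, z)| (cosine(&batch[i], z) / temp).exp())
        .sum();
    -(pos / denom).ln()
}

/// Scaled cosine error used by GraphMAE:
/// mean over masked nodes of (1 - cos(x, x_hat))^gamma.
fn sce(x: &[Vec<f64>], x_hat: &[Vec<f64>], gamma: f64) -> f64 {
    let n = x.len() as f64;
    x.iter()
        .zip(x_hat)
        .map(|(a, b)| (1.0 - cosine(a, b)).powf(gamma))
        .sum::<f64>()
        / n
}

fn main() {
    let z = vec![
        vec![1.0, 0.0],  // anchor view
        vec![0.9, 0.1],  // positive (augmented) view of z[0]
        vec![-1.0, 0.0], // negative example
    ];
    // Small loss: the positive pair is far more similar than the negative.
    println!("NT-Xent: {:.4}", nt_xent(&z, 0, 1, 0.5));
    // Perfect reconstruction gives zero SCE.
    println!("SCE: {:.4}", sce(&[vec![1.0, 0.0]], &[vec![1.0, 0.0]], 2.0));
}
```

The temperature and the SCE exponent `gamma` are the usual tuning knobs: lower temperature sharpens the contrast against hard negatives, and `gamma > 1` down-weights nodes that are already reconstructed well.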