//! OPT (Open Pre-trained Transformer) model family.
//!
//! Meta's OPT models are decoder-only transformers with:
//! - Learned positional embeddings (position indices are offset by 2 before
//!   indexing the embedding table).
//! - Pre-norm LayerNorm in most variants (OPT-350M is the post-norm exception).
//! - ReLU FFN without gating.
//! - Biases in all projections.
//!
//! # References
//!
//! Zhang et al. (2022) — "OPT: Open Pre-trained Transformer Language Models"
//! <https://arxiv.org/abs/2205.01068>
// `config` is an assumed submodule name; the other re-export paths were
// unrecoverable.
mod config;

pub use config::OptConfig;
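// The offset-2 positional convention above can be sketched as follows. This
// is an illustrative sketch, not part of this module's API: a plain
// `Vec`-backed table stands in for the tensor-backed embedding a real model
// would use, and `LearnedPositionalEmbedding` is a hypothetical name.

/// OPT reserves the first two rows of the positional table, so every
/// position index is shifted by 2 before the lookup.
const POSITION_OFFSET: usize = 2;

pub struct LearnedPositionalEmbedding {
    /// `(max_positions + POSITION_OFFSET)` rows, each of length `hidden_size`.
    weight: Vec<Vec<f32>>,
}

impl LearnedPositionalEmbedding {
    pub fn new(max_positions: usize, hidden_size: usize) -> Self {
        // Placeholder init: row `r` holds the constant `r`, which makes the
        // offset easy to observe; a trained model loads learned weights here.
        let weight = (0..max_positions + POSITION_OFFSET)
            .map(|r| vec![r as f32; hidden_size])
            .collect();
        Self { weight }
    }

    /// Embeddings for positions `past_len .. past_len + seq_len`, applying
    /// the offset-2 shift to each index.
    pub fn forward(&self, past_len: usize, seq_len: usize) -> Vec<Vec<f32>> {
        (past_len..past_len + seq_len)
            .map(|pos| self.weight[pos + POSITION_OFFSET].clone())
            .collect()
    }
}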