//! # Entrenar 0.7.8: Training & Optimization Library
//!
//! Entrenar provides a tape-based autograd engine with optimizers, LoRA/QLoRA,
//! quantization (QAT/PTQ), model merging (TIES/DARE/SLERP), and knowledge distillation.
//!
//! ## Architecture
//!
//! - **autograd**: Tape-based automatic differentiation
//! - **optim**: Optimizers (SGD, Adam, AdamW)
//! - **lora**: Low-rank adaptation with QLoRA support
//! - **quant**: Quantization-aware training and post-training quantization
//! - **merge**: Model merging methods
//! - **distill**: Knowledge distillation
//! - **config**: Declarative YAML configuration
//! - **train**: High-level training loop
//! - **io**: Model saving and loading (JSON, YAML formats)
//! - **hf_pipeline**: HuggingFace model fetching and distillation
//! - **citl**: Compiler-in-the-Loop training with RAG-based fix suggestions (feature-gated)
//! - **efficiency**: Cost tracking, device detection, and performance benchmarking
//! - **eval**: Model evaluation framework with metrics, comparison, and drift detection
//! - **sovereign**: Air-gapped deployment and distribution packaging
//! - **research**: Academic research artifacts, citations, and archive deposits
//! - **ecosystem**: PAIML stack integrations (Batuta, Realizar, Ruchy)
//! - **dashboard**: Real-time training monitoring and WASM bindings
//! - **yaml_mode**: Declarative YAML Mode Training (v1.0 spec)
//! - **transformer**: Transformer layers with autograd support
//! - **moe**: Mixture of Experts sparse routing layer
//! - **decision**: Decision pattern storage and CITL trainer (GH-28, GH-29)
//! - **cli**: Command-line interface handlers
//! - **finetune**: Fine-tuning pipeline with Popperian QA (SPEC-FT-001)

// Contract assertions from YAML (pv codegen)
#[macro_use]
#[allow(unused_macros)]
mod generated_contracts;

// Fallback macros for contracts not yet in build.rs codegen
// (embedding-lookup-v1 was added in provable-contracts 0.2 but
// entrenar's build.rs hasn't been updated to generate it yet)
#[cfg(not(feature = "__has_embedding_contract"))]
macro_rules! contract_pre_embedding_lookup {
    () => {{}};
    ($input:expr) => {{
        let _ = &$input;
    }};
}
#[cfg(not(feature = "__has_embedding_contract"))]
#[allow(unused_macros)]
macro_rules! contract_post_embedding_lookup {
    ($result:expr) => {{
        let _ = &$result;
    }};
}

pub mod aprender_compat;
pub mod autograd;
#[cfg(feature = "citl")]
pub mod citl;
pub mod cli;
pub mod config;
pub mod dashboard;
pub mod decision;
pub mod distill;
pub mod ecosystem;
pub mod efficiency;
pub mod eval;
#[cfg(not(target_arch = "wasm32"))]
pub mod finetune;
pub mod generative;
#[cfg(not(target_arch = "wasm32"))]
pub mod gpu;
#[cfg(all(not(target_arch = "wasm32"), feature = "hub"))]
pub mod hf_pipeline;
pub mod inference;
pub mod integrity;
pub mod io;
pub mod lora;
pub mod merge;
pub mod moe;
pub mod monitor;
pub mod numerical;
pub mod optim;
pub mod pipeline;
pub mod prune;
pub mod quality;
pub mod quant;
pub mod research;
pub mod run;
pub mod safety;
pub mod search;
pub mod server;
pub mod sovereign;
pub mod sovereign_array;
pub mod staging;
pub mod storage;
pub mod tokenizer;
pub mod trace;
pub mod tracking;
pub mod train;
pub mod training;
pub mod transformer;
pub mod yaml_mode;

pub mod error;

// Re-export commonly used types
pub use autograd::{backward, Context, Tensor};
pub use error::{Error, Result};