pub struct UniversalModel { /* private fields */ }
Use UniversalModel::auto() to let TreeBoost analyze your data and pick the best mode.
The analysis result is stored and can be retrieved with UniversalModel::analysis().
Implementations

impl UniversalModel
pub fn with_feature_extractor(self, extractor: FeatureExtractor) -> Self
Set feature extractor for LinearThenTree inference
This is called after training to store the feature extraction configuration (which columns to exclude, auto-exclusion settings). This ensures consistent feature extraction during inference for the linear component.
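§Example
A minimal sketch; the FeatureExtractor construction assumes the default constructor shown elsewhere in these docs:
// Attach the extraction configuration to a freshly trained model
let extractor = FeatureExtractor::new();
let model = UniversalModel::train(&dataset, config, &MseLoss)?
    .with_feature_extractor(extractor);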
pub fn feature_extractor(&self) -> Option<&FeatureExtractor>
Get the feature extractor (if set)
pub fn train(
    dataset: &BinnedDataset,
    config: UniversalConfig,
    loss_fn: &dyn LossFunction,
) -> Result<Self>
Train a UniversalModel on binned data
§Arguments
dataset - Binned training data
config - Training configuration (fully serializable and persisted)
loss_fn - Loss function for computing gradients during training
§Important Notes
- The loss function is NOT persisted in the saved model. It is used only during training to compute gradients. The trained model is loss-function-agnostic and works correctly with any data that went through the same preprocessing.
- The config is fully persisted and can be exported via config() or saved to JSON for inspection and reuse.
- For reproducibility, store your loss function choice separately if needed.
§Example
// Train with MSE loss
let model = UniversalModel::train(&dataset, config, &MseLoss)?;
// Save config and model
let config_json = serde_json::to_string_pretty(model.config())?;
std::fs::write("config.json", config_json)?;
model.save("model.rkyv")?;
// Later: Load and use (loss function is already baked into the model)
let loaded = UniversalModel::load("model.rkyv")?;
let preds = loaded.predict(&test_dataset); // No need to specify loss again

pub fn train_with_raw_features(
    dataset: &BinnedDataset,
    raw_features: &[f32],
    config: UniversalConfig,
    loss_fn: &dyn LossFunction,
) -> Result<Self>
Train LinearThenTree with raw features (recommended for best accuracy)
For LinearThenTree mode, passing raw (unbinned) features significantly improves the linear model’s accuracy. Without raw features, LTT uses bin-center approximations which lose precision that linear models need.
§Arguments
dataset - Binned dataset (for tree training)
raw_features - Original features, row-major f32 array (num_rows * num_features)
config - Training configuration
loss_fn - Loss function
§Example
let model = UniversalModel::train_with_raw_features(
&binned_dataset,
&scaled_features, // Original StandardScaler'd features
config,
&MseLoss,
)?;

pub fn train_with_linear_feature_selection(
    dataset: &BinnedDataset,
    raw_features: &[f32],
    linear_feature_indices: &[usize],
    config: UniversalConfig,
    loss_fn: &dyn LossFunction,
) -> Result<Self>
Train LinearThenTree with feature selection for linear model
This allows using a curated subset of features for the linear model while the trees use all features. Excluding meaningless features (such as row IDs) from the linear model can improve its generalization.
§Arguments
dataset - Binned dataset (for tree training with all features)
raw_features - All features, row-major f32 array
linear_feature_indices - Which feature indices to use for the linear model
config - Training configuration
loss_fn - Loss function
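§Example
A sketch under an assumed feature layout (indices 0 and 2 carry the linear signal; index 1 is a row ID the linear model should skip):
let model = UniversalModel::train_with_linear_feature_selection(
    &binned_dataset,
    &scaled_features, // all features, row-major
    &[0, 2],          // linear model sees only these columns
    config,
    &MseLoss,
)?;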
pub fn auto(dataset: &BinnedDataset, loss_fn: &dyn LossFunction) -> Result<Self>
Train with automatic mode selection
This is TreeBoost’s “smart” entry point. It:
- Analyzes your dataset (lightweight probes on subsamples)
- Picks the best boosting mode with confidence score
- Trains the model with optimal settings
- Stores the analysis for inspection
§Example
use treeboost::{UniversalModel, MseLoss};
let model = UniversalModel::auto(&dataset, &MseLoss)?;
// See what mode was selected and why
println!("Mode: {:?}", model.mode());
println!("Confidence: {:?}", model.selection_confidence());
println!("{}", model.analysis_report().unwrap());§When to Use
Use auto() when:
- You’re not sure which mode is best for your data
- You want TreeBoost to explain its decision
- You want a simple one-liner that “just works”
Use train() when:
- You know the best mode for your data
- You need fine-grained control over configuration
- You’re running benchmarks and want deterministic mode
pub fn auto_with_config(
    dataset: &BinnedDataset,
    config: UniversalConfig,
    loss_fn: &dyn LossFunction,
) -> Result<Self>
Train with automatic mode selection and custom configuration
Like auto(), but lets you customize other settings (num_rounds, tree config, etc.).
Any mode set in the config will be overridden by the analysis recommendation.
§Example
let config = UniversalConfig::new()
.with_num_rounds(200)
.with_learning_rate(0.05);
let model = UniversalModel::auto_with_config(&dataset, config, &MseLoss)?;

pub fn auto_with_analysis_config(
    dataset: &BinnedDataset,
    config: UniversalConfig,
    analysis_config: AnalysisConfig,
    loss_fn: &dyn LossFunction,
) -> Result<Self>
Train with automatic mode selection and custom analysis configuration
Full control over both model config and analysis settings.
§Example
let config = UniversalConfig::new().with_num_rounds(200);
let analysis_config = AnalysisConfig::fast(); // Quick analysis
let model = UniversalModel::auto_with_analysis_config(
&dataset, config, analysis_config, &MseLoss
)?;

pub fn train_with_selection(
    dataset: &BinnedDataset,
    config: UniversalConfig,
    selection: ModeSelection,
    loss_fn: &dyn LossFunction,
) -> Result<Self>
Train using a ModeSelection strategy
This is the most flexible entry point, supporting:
ModeSelection::Auto - Automatic analysis and selection
ModeSelection::AutoWithConfig(config) - Auto with a custom analysis configuration
ModeSelection::Fixed(mode) - Explicit mode specification
§Example
// Auto mode
let model = UniversalModel::train_with_selection(
&dataset, config, ModeSelection::Auto, &MseLoss
)?;
// Fixed mode
let model = UniversalModel::train_with_selection(
&dataset, config, ModeSelection::Fixed(BoostingMode::LinearThenTree), &MseLoss
)?;

pub fn extract_raw_features(dataset: &BinnedDataset) -> Vec<f32>
Extract raw feature values from BinnedDataset using bin-center approximation
⚠️ Note: This is a fallback method with accuracy limitations. For best results,
use FeatureExtractor with the original DataFrame instead. See the
feature_extractor module documentation for a detailed comparison.
This method approximates raw values from bin boundaries (see the Historical Note below for the exact scheme). While functional, this loses precision compared to extracting from the original DataFrame with FeatureExtractor.
Returns row-major f32 array for linear model training.
§When to Use
- Only when you have a BinnedDataset but not the original DataFrame
- When using UniversalModel directly (advanced usage)
- When slight accuracy loss is acceptable
§Alternative (Recommended)
use treeboost::dataset::feature_extractor::FeatureExtractor;
let extractor = FeatureExtractor::new();
let (raw_features, num_features) = extractor.extract(&df, "target")?;
// Use raw_features with train_with_raw_features()

§Deprecated Approach
This method is kept for compatibility but delegates to BinnedDataset::extract_raw_features_from_bins() for consistency across the codebase. For best accuracy in LinearThenTree mode, pass actual raw features to training instead of relying on bin-center approximation.
§Historical Note
Previous implementation used bin-center approximation (midpoint between boundaries) for improved accuracy. Current implementation uses bin split values for consistency. The difference is negligible in practice, and users needing high accuracy should pass raw features directly rather than relying on either approximation.
pub fn predict(&self, dataset: &BinnedDataset) -> Vec<f32>
Predict for all rows in dataset
pub fn predict_with_raw_features(
    &self,
    dataset: &BinnedDataset,
    raw_features: &[f32],
) -> Vec<f32>
Predict for all rows using raw features (recommended for LinearThenTree)
For LinearThenTree mode, using raw (unbinned) features for the linear model gives significantly better accuracy than the bin-center approximation.
§Arguments
dataset - Binned dataset (used for tree predictions)
raw_features - Original features, row-major f32 array (num_rows * num_features)
§Note
For PureTree and RandomForest, raw_features is ignored (trees use binned data).
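§Example
A sketch, assuming test_features went through the same preprocessing as the training features:
let preds = model.predict_with_raw_features(&test_dataset, &test_features);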
pub fn predict_linear_only(
    &self,
    dataset: &BinnedDataset,
    raw_features: &[f32],
) -> Result<Vec<f32>>
Predict using only linear component (LinearThenTree mode only)
Returns predictions from base + linear model, without tree contribution. Useful for comparing linear-only vs full LinearThenTree performance.
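§Example
A sketch comparing the linear component alone against the full model (test_features is assumed):
let linear_preds = model.predict_linear_only(&test_dataset, &test_features)?;
let full_preds = model.predict_with_raw_features(&test_dataset, &test_features);
// The elementwise difference is the tree contribution.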
pub fn predict_row(&self, dataset: &BinnedDataset, row_idx: usize) -> f32
Predict for a single row
pub fn mode(&self) -> BoostingMode
Get the boosting mode
pub fn config(&self) -> &UniversalConfig
Get training configuration
pub fn base_prediction(&self) -> f32
Get base prediction
- PureTree: delegates to GBDTModel.
- LinearThenTree: returns the original base prediction (the GBDTModel was trained on residuals).
- RandomForest: returns the stored base prediction.
pub fn has_linear(&self) -> bool
Check if model has linear component
pub fn linear_booster(&self) -> Option<&LinearBooster>
Get linear booster reference (if present)
pub fn gbdt_model(&self) -> Option<&GBDTModel>
Get underlying GBDTModel (for PureTree and LinearThenTree modes)
pub fn trees(&self) -> &[Tree]
Get trees (only for RandomForest mode; PureTree/LinearThenTree use GBDTModel)
pub fn num_features(&self) -> usize
Get number of features
pub fn analysis(&self) -> Option<&DatasetAnalysis>
Get the dataset analysis that led to mode selection (if auto mode was used)
Returns Some(analysis) if the model was trained with auto() or train_with_selection(Auto).
Returns None if a fixed mode was specified.
§Example
let model = UniversalModel::auto(&dataset, &MseLoss)?;
if let Some(analysis) = model.analysis() {
println!("Linear R²: {:.2}", analysis.linear_r2);
println!("Tree gain: {:.2}", analysis.tree_gain);
println!("Noise floor: {:.2}", analysis.noise_floor);
}

pub fn selection_confidence(&self) -> Option<Confidence>
Get the confidence in the mode selection (if auto mode was used)
Returns the confidence level from the analysis:
High - Very clear signal; strongly recommend this mode
Medium - Reasonable signal; this mode is likely best
Low - Weak signal; consider validating with cross-validation
Returns None if a fixed mode was specified.
pub fn was_auto_selected(&self) -> bool
Check if the mode was automatically selected
pub fn analysis_report(&self) -> Option<AnalysisReport<'_>>
Get a formatted analysis report (if auto mode was used)
Returns a human-readable report explaining:
- Dataset characteristics (linear signal, tree gain, noise floor)
- Mode scores for each option
- Why the selected mode was chosen
- Alternative modes to consider
§Example
let model = UniversalModel::auto(&dataset, &MseLoss)?;
if let Some(report) = model.analysis_report() {
println!("{}", report);
}

pub fn analysis_summary(&self) -> Option<String>
Get a short textual summary of the analysis (if auto mode was used).
pub fn predict_with_intervals(
    &self,
    dataset: &BinnedDataset,
) -> Result<(Vec<f32>, Vec<f32>, Vec<f32>)>
Predict with conformal prediction intervals
Returns (predictions, lower_bounds, upper_bounds) for uncertainty quantification.
§Note
Only supported in PureTree mode (delegates to GBDTModel). LinearThenTree and RandomForest modes return an error.
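§Example
A sketch for a PureTree model with conformal calibration available:
let (preds, lower, upper) = model.predict_with_intervals(&test_dataset)?;
for ((p, lo), hi) in preds.iter().zip(&lower).zip(&upper) {
    println!("{p:.3} in [{lo:.3}, {hi:.3}]");
}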
pub fn conformal_quantile(&self) -> Option<f32>
Get the calibrated conformal quantile (if available)
pub fn predict_proba(&self, dataset: &BinnedDataset) -> Result<Vec<f32>>
Binary classification: predict probabilities
Returns probabilities in [0, 1] for binary classification. Requires model trained with binary log loss.
pub fn predict_class(
    &self,
    dataset: &BinnedDataset,
    threshold: f32,
) -> Result<Vec<u32>>
Binary classification: predict classes
Returns 0 or 1 for each row based on threshold (0.5 is the conventional choice).
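§Example
A sketch for a model trained with binary log loss:
let probs = model.predict_proba(&test_dataset)?;
let classes = model.predict_class(&test_dataset, 0.5)?;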
pub fn is_multiclass(&self) -> bool
Check if model is multi-class
pub fn get_num_classes(&self) -> usize
Get number of classes (1 for regression, 2+ for classification)
pub fn predict_proba_multiclass(
    &self,
    dataset: &BinnedDataset,
) -> Result<Vec<Vec<f32>>>
Multi-class classification: predict probabilities for each class
Returns a Vec<Vec<f32>>: one inner vector of per-class probabilities for each row.
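§Example
A sketch taking the argmax class per row:
let proba = model.predict_proba_multiclass(&test_dataset)?;
for (row, p) in proba.iter().enumerate() {
    let best = p.iter().enumerate().max_by(|a, b| a.1.total_cmp(b.1)).map(|(i, _)| i);
    println!("row {row}: argmax class {best:?}");
}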
pub fn predict_class_multiclass(
    &self,
    dataset: &BinnedDataset,
) -> Result<Vec<u32>>
Multi-class classification: predict class labels
pub fn predict_raw_multiclass(
    &self,
    dataset: &BinnedDataset,
) -> Result<Vec<Vec<f32>>>
Multi-class classification: predict raw logits
pub fn feature_importance(&self) -> Vec<f32>
Get feature importance scores
Returns importance scores for each feature based on gain/split frequency. For PureTree/LinearThenTree, delegates to GBDTModel. For RandomForest, computes importance from split frequencies across all trees.
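§Example
A sketch printing each feature's importance score:
let importance = model.feature_importance();
for (idx, score) in importance.iter().enumerate() {
    println!("feature {idx}: {score:.4}");
}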
pub fn predict_raw(&self, features: &[f64]) -> Result<Vec<f32>>
Predict from raw (unbinned) feature values
Useful when you have raw feature values and don’t want to create a BinnedDataset. Only supported in PureTree mode.
§Arguments
features - Raw feature values for one or more rows, flattened row-major
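§Example
A sketch, assuming a PureTree model trained on 3 features and predicting two rows:
let rows: Vec<f64> = vec![
    0.5, 1.2, -0.3, // row 0
    0.1, 0.9, 2.4, // row 1
];
let preds = model.predict_raw(&rows)?;
assert_eq!(preds.len(), 2);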
pub fn predict_raw_with_intervals(
    &self,
    features: &[f64],
) -> Result<(Vec<f32>, Vec<f32>, Vec<f32>)>
Predict with intervals from raw (unbinned) feature values
pub fn predict_proba_raw(&self, features: &[f64]) -> Result<Vec<f32>>
Binary classification probability from raw features
pub fn predict_class_raw(
    &self,
    features: &[f64],
    threshold: f32,
) -> Result<Vec<u32>>
Binary classification class from raw features
pub fn predict_proba_multiclass_raw(
    &self,
    features: &[f64],
) -> Result<Vec<Vec<f32>>>
Multi-class probabilities from raw features
pub fn predict_class_multiclass_raw(&self, features: &[f64]) -> Result<Vec<u32>>
Multi-class class labels from raw features
pub fn predict_raw_multiclass_raw(
    &self,
    features: &[f64],
) -> Result<Vec<Vec<f32>>>
Multi-class raw logits from raw features
pub fn update(
    &mut self,
    dataset: &BinnedDataset,
    loss_fn: &dyn LossFunction,
    additional_rounds: usize,
) -> Result<IncrementalUpdateReport>
Update the model with new training data (incremental learning)
This method continues training from the current model state:
- PureTree: Appends new trees trained on residuals
- LinearThenTree: Updates linear weights (warm_fit) + appends new trees
- RandomForest: Appends new bootstrap trees
§Arguments
dataset - New data to train on (must have the same feature schema)
loss_fn - Loss function (should match original training)
additional_rounds - Number of new boosting rounds (trees) to add
§Example
// Train on January data
let mut model = UniversalModel::train(&jan_data, config, &MseLoss)?;
// Update with February data (10 more trees)
model.update(&feb_data, &MseLoss, 10)?;

pub fn save_trb(&self, path: impl AsRef<Path>, description: &str) -> Result<()>
Save model to TRB (TreeBoost) incremental format
TRB format supports incremental updates without rewriting the entire file. Use this format when you plan to update the model with new data.
§Example
model.save_trb("model.trb", "Initial training on January data")?;
// Later, after updating:
model.save_trb_update("model.trb", 1000, "February update")?;Sourcepub fn save_trb_update(
&self,
path: impl AsRef<Path>,
rows_trained: usize,
description: &str,
) -> Result<()>
pub fn save_trb_update( &self, path: impl AsRef<Path>, rows_trained: usize, description: &str, ) -> Result<()>
Append an update to an existing TRB file
This appends a new segment without rewriting the base model.
The model must be loaded with load_trb() before calling this.
§Arguments
path - Path to an existing .trb file
rows_trained - Number of rows used in this update
description - Description of this update
pub fn is_compatible_for_update(&self, dataset: &BinnedDataset) -> bool
Check if model is compatible with dataset for incremental update
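§Example
A sketch guarding an incremental update (feb_data is assumed; model must be mutable):
if model.is_compatible_for_update(&feb_data) {
    model.update(&feb_data, &MseLoss, 10)?;
}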
pub fn gbdt_model_mut(&mut self) -> Option<&mut GBDTModel>
Get mutable reference to underlying GBDTModel (for advanced manipulation)
pub fn linear_booster_mut(&mut self) -> Option<&mut LinearBooster>
Get mutable reference to linear booster (for advanced manipulation)
Trait Implementations
impl Archive for UniversalModel
where
    UniversalConfig: Archive,
    Option<GBDTModel>: Archive,
    Option<Vec<GBDTModel>>: Archive,
    Option<Vec<f32>>: Archive,
    Option<f32>: Archive,
    Option<LinearBooster>: Archive,
    Vec<Tree>: Archive,
    f32: Archive,
    usize: Archive,
    Skip: ArchiveWith<Option<DatasetAnalysis>> + ArchiveWith<Option<Vec<f32>>> + ArchiveWith<Option<Vec<usize>>> + ArchiveWith<Option<usize>> + ArchiveWith<Option<FeatureExtractor>>,
type Resolver = UniversalModelResolver
fn resolve(&self, resolver: Self::Resolver, out: Place<Self::Archived>)
const COPY_OPTIMIZATION: CopyOptimization<Self> = _

impl Clone for UniversalModel
fn clone(&self) -> UniversalModel
fn clone_from(&mut self, source: &Self)

impl Debug for UniversalModel
impl<'de> Deserialize<'de> for UniversalModel
fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error>
where
    __D: Deserializer<'de>,
impl<__D: Fallible + ?Sized> Deserialize<UniversalModel, __D> for Archived<UniversalModel>
where
    UniversalConfig: Archive,
    <UniversalConfig as Archive>::Archived: Deserialize<UniversalConfig, __D>,
    Option<GBDTModel>: Archive,
    <Option<GBDTModel> as Archive>::Archived: Deserialize<Option<GBDTModel>, __D>,
    Option<Vec<GBDTModel>>: Archive,
    <Option<Vec<GBDTModel>> as Archive>::Archived: Deserialize<Option<Vec<GBDTModel>>, __D>,
    Option<Vec<f32>>: Archive,
    <Option<Vec<f32>> as Archive>::Archived: Deserialize<Option<Vec<f32>>, __D>,
    Option<f32>: Archive,
    <Option<f32> as Archive>::Archived: Deserialize<Option<f32>, __D>,
    Option<LinearBooster>: Archive,
    <Option<LinearBooster> as Archive>::Archived: Deserialize<Option<LinearBooster>, __D>,
    Vec<Tree>: Archive,
    <Vec<Tree> as Archive>::Archived: Deserialize<Vec<Tree>, __D>,
    f32: Archive,
    <f32 as Archive>::Archived: Deserialize<f32, __D>,
    usize: Archive,
    <usize as Archive>::Archived: Deserialize<usize, __D>,
    Skip: ArchiveWith<Option<DatasetAnalysis>> + DeserializeWith<<Skip as ArchiveWith<Option<DatasetAnalysis>>>::Archived, Option<DatasetAnalysis>, __D> + ArchiveWith<Option<Vec<f32>>> + DeserializeWith<<Skip as ArchiveWith<Option<Vec<f32>>>>::Archived, Option<Vec<f32>>, __D> + ArchiveWith<Option<Vec<usize>>> + DeserializeWith<<Skip as ArchiveWith<Option<Vec<usize>>>>::Archived, Option<Vec<usize>>, __D> + ArchiveWith<Option<usize>> + DeserializeWith<<Skip as ArchiveWith<Option<usize>>>::Archived, Option<usize>, __D> + ArchiveWith<Option<FeatureExtractor>> + DeserializeWith<<Skip as ArchiveWith<Option<FeatureExtractor>>>::Archived, Option<FeatureExtractor>, __D>,
fn deserialize(
    &self,
    deserializer: &mut __D,
) -> Result<UniversalModel, <__D as Fallible>::Error>
impl<__S: Fallible + ?Sized> Serialize<__S> for UniversalModel
where
    UniversalConfig: Serialize<__S>,
    Option<GBDTModel>: Serialize<__S>,
    Option<Vec<GBDTModel>>: Serialize<__S>,
    Option<Vec<f32>>: Serialize<__S>,
    Option<f32>: Serialize<__S>,
    Option<LinearBooster>: Serialize<__S>,
    Vec<Tree>: Serialize<__S>,
    f32: Serialize<__S>,
    usize: Serialize<__S>,
    Skip: SerializeWith<Option<DatasetAnalysis>, __S> + SerializeWith<Option<Vec<f32>>, __S> + SerializeWith<Option<Vec<usize>>, __S> + SerializeWith<Option<usize>, __S> + SerializeWith<Option<FeatureExtractor>, __S>,
impl Serialize for UniversalModel

impl TunableModel for UniversalModel
type Config = UniversalConfig
fn train(dataset: &BinnedDataset, config: &Self::Config) -> Result<Self>
fn apply_params(config: &mut Self::Config, params: &HashMap<String, ParamValue>)
fn valid_params() -> &'static [&'static str]
fn default_config() -> Self::Config
fn get_learning_rate(config: &Self::Config) -> f32
fn configure_validation(
    config: &mut Self::Config,
    validation_ratio: f32,
    early_stopping_rounds: usize,
)
fn set_num_rounds(config: &mut Self::Config, num_rounds: usize)
fn train_with_validation(
    train_data: &BinnedDataset,
    val_data: &BinnedDataset,
    val_targets: &[f32],
    config: &Self::Config,
) -> Result<Self>
fn is_gpu_config(config: &Self::Config) -> bool
fn save_rkyv(&self, path: &Path) -> Result<()>
fn save_bincode(&self, path: &Path) -> Result<()>
fn supports_conformal() -> bool
Auto Trait Implementations
impl Freeze for UniversalModel
impl RefUnwindSafe for UniversalModel
impl Send for UniversalModel
impl Sync for UniversalModel
impl Unpin for UniversalModel
impl UnwindSafe for UniversalModel
Blanket Implementations

impl<T> ArchivePointee for T
type ArchivedMetadata = ()
fn pointer_metadata(
    _: &<T as ArchivePointee>::ArchivedMetadata,
) -> <T as Pointee>::Metadata

impl<T> ArchiveUnsized for T
where
    T: Archive,
type Archived = <T as Archive>::Archived
fn archived_metadata(
    &self,
) -> <<T as ArchiveUnsized>::Archived as ArchivePointee>::ArchivedMetadata

impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T

impl<T> CloneToUninit for T
where
    T: Clone,

impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>

impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>

impl<T> Key for T
where
    T: Clone,

impl<T> LayoutRaw for T
fn layout_raw(_: <T as Pointee>::Metadata) -> Result<Layout, LayoutError>

impl<T, N1, N2> Niching<NichedOption<T, N1>> for N2
unsafe fn is_niched(niched: *const NichedOption<T, N1>) -> bool
fn resolve_niched(out: Place<NichedOption<T, N1>>)