Parametric
The parametric crate provides a bridge between complex, hierarchical data structures (like neural networks or simulation models) and optimization algorithms that operate on a flat vector of parameters.
Instead of raw slices and positional access, you define a named, structured type and let the crate handle the mapping.
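For instance (with a hypothetical two-parameter model, not the crate's API), positional access into a flat vector leaves the meaning of each index implicit, while a structured type makes it explicit:

```rust
// A hypothetical two-parameter model, for illustration only.
struct Linear<P> {
    slope: P,
    intercept: P,
}

fn main() {
    let x = 2.0;

    // Positional access into a flat vector: the meaning of each index is implicit.
    let params: Vec<f64> = vec![0.3, -0.1];
    let y_flat = params[0] * x + params[1];

    // Named, structured access: the same two values, but self-describing.
    let model = Linear { slope: 0.3, intercept: -0.1 };
    let y_named = model.slope * x + model.intercept;

    assert_eq!(y_flat, y_named);
}
```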
The Problem
Many optimization and machine learning algorithms (evolution strategies, gradient descent, and so on) are designed to work with a simple, flat `Vec<f64>` of parameters. However, the models we want to optimize often have a more complex, nested structure.
This creates a painful "impedance mismatch":
- Manual Flattening: You have to write tedious and error-prone boilerplate code to flatten your structured model parameters into a vector for the optimizer.
- Manual Injection: Inside the optimization loop, you must write the reverse logic to "inject" the flat vector of parameters back into your model's structure to evaluate its performance. A hand-written version of this pair is sketched after this list.
- Brittleness: Every time you change your model's structure (e.g., add a layer to a neural network), you have to meticulously update both the flattening and injection code.
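To make the boilerplate concrete, here is roughly what a hand-written flatten/inject pair looks like for a small nested model (a hypothetical example, independent of this crate). Any change to the structs must be mirrored in both functions, in exactly the same traversal order:

```rust
// A small nested model, written by hand.
struct Layer {
    weights: Vec<f64>,
    bias: f64,
}
struct Model {
    layers: Vec<Layer>,
}

// Manual flattening: the ordering and length are implicit conventions.
fn flatten(model: &Model) -> Vec<f64> {
    let mut flat = Vec::new();
    for layer in &model.layers {
        flat.extend_from_slice(&layer.weights);
        flat.push(layer.bias);
    }
    flat
}

// Manual injection: must walk the structure in exactly the same order.
fn inject(model: &mut Model, flat: &[f64]) {
    let mut i = 0;
    for layer in &mut model.layers {
        for w in &mut layer.weights {
            *w = flat[i];
            i += 1;
        }
        layer.bias = flat[i];
        i += 1;
    }
}

fn main() {
    let mut model = Model {
        layers: vec![Layer { weights: vec![0.1, 0.2], bias: 0.0 }],
    };
    let flat = flatten(&model);
    inject(&mut model, &flat); // round-trips only as long as both sides stay in sync
}
```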
The Solution
`parametric` solves this with a derive macro that automates the mapping between your structured types and a flat representation. You define your model with a generic parameter (e.g., `struct Model<P>`), and the macro generates the conversion logic in both directions.
The core workflow is:
- Define a generic struct: Create your model structure (e.g., `MLP<P>`) using a generic type `P` for the parameters, and add `#[derive(Parametric)]`.
- Create a specification: Instantiate your model with a type that describes parameter properties, like search ranges (e.g., `MLP<Range>`). This defines the parameter space.
- Extract and Map: Use `parametric::extract_map_defaults` to convert your specification (`MLP<Range>`) into two things:
  - A runnable model instance with concrete value types (`MLP<f64>`).
  - A flat `Vec` of the parameter specifications (`Vec<Range>`) that can be passed directly to an optimizer.
- Inject: Inside your objective function, use `parametric::inject_from_slice` to efficiently update the model instance with the flat parameter slice provided by the optimizer. The sketch after this list walks through these steps.
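A minimal sketch of those steps on a deliberately tiny model. The function and macro names come from this README, but the exact signatures, return shapes, and the `Range` layout are assumptions made for illustration; see the usage example below and examples/mlp.rs for the real code.

```rust
use parametric::{impl_parametric_arg, Parametric};

// A hypothetical spec type describing one parameter's search range.
#[derive(Clone, Copy)]
struct Range { min: f64, max: f64 }
impl_parametric_arg!(Range); // registration macro; exact arguments assumed

// Step 1: one generic struct, derived once.
#[derive(Parametric)]
struct Model<P> {
    slope: P,
    intercept: P,
}

fn main() {
    // Step 2: the specification -- one `Range` per parameter.
    let spec = Model {
        slope: Range { min: -1.0, max: 1.0 },
        intercept: Range { min: -1.0, max: 1.0 },
    };

    // Step 3: a runnable `Model<f64>` plus the flat `Vec<Range>` handed to the
    // optimizer (return shape assumed from the description above).
    let (mut model, ranges): (Model<f64>, Vec<Range>) =
        parametric::extract_map_defaults(&spec);
    assert_eq!(ranges.len(), 2); // one entry per parameter

    // Step 4: inside the objective function, refresh the model from the
    // optimizer's flat candidate slice before evaluating it.
    parametric::inject_from_slice(&mut model, &[0.3, -0.1]);
}
```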
This approach eliminates boilerplate, reduces errors, and cleanly decouples the model's definition from its parameterization.
Usage Example: Training a Neural Network
Here's a minimal example of defining a Multi-Layer Perceptron (MLP), specifying its parameter search space, and training it with a differential evolution algorithm.
See examples/mlp.rs for the complete, runnable code; the outline below is a simplified sketch, so field layouts and macro invocations may differ from the real example.
```rust
use parametric::{impl_parametric_arg, Parametric};

// 1. DEFINE GENERIC, PARAMETRIC STRUCTS
// The same generic struct is used for two purposes:
//   1. As `MLP<Range>` to define the parameter search space (the specification).
//   2. As `MLP<f64>` to create a runnable model instance.
#[derive(Parametric)]
struct Layer<P> {
    weights: Vec<P>,
    biases: Vec<P>,
}

#[derive(Parametric)]
struct MLP<P> {
    layers: Vec<Layer<P>>,
}

// Business logic (the forward pass) is implemented on the concrete
// types `Layer<f64>` and `MLP<f64>`; see examples/mlp.rs.

// Define a custom type to represent a parameter's search range.
#[derive(Clone, Copy)]
struct Range { min: f64, max: f64 }

// Make it compatible with the `parametric` crate.
impl_parametric_arg!(Range);
```
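Continuing that snippet, the training side follows the extract/inject workflow. The return shape of `extract_map_defaults`, the signature of `inject_from_slice`, and the `differential_evolution` stub below are assumptions sketched for illustration; examples/mlp.rs contains the complete, working version.

```rust
// Dummy optimizer function for demonstration: stands in for a real
// differential evolution implementation.
fn differential_evolution(
    ranges: &[Range],
    mut objective: impl FnMut(&[f64]) -> f64,
) -> Vec<f64> {
    // ...population init, mutation, crossover, selection elided...
    let candidate: Vec<f64> = ranges.iter().map(|r| (r.min + r.max) / 2.0).collect();
    let _score = objective(candidate.as_slice());
    candidate
}

fn main() {
    // 2. The specification: an MLP whose parameters are search ranges.
    let spec: MLP<Range> = MLP {
        layers: vec![Layer {
            weights: vec![Range { min: -1.0, max: 1.0 }; 4],
            biases: vec![Range { min: -1.0, max: 1.0 }; 2],
        }],
    };

    // 3. Extract a runnable MLP<f64> and the flat Vec<Range> for the optimizer.
    let (mut model, ranges): (MLP<f64>, Vec<Range>) =
        parametric::extract_map_defaults(&spec);

    // 4. Inject each candidate into the model inside the objective function.
    let best = differential_evolution(&ranges, |flat| {
        parametric::inject_from_slice(&mut model, flat);
        // Evaluate the model (forward pass + loss over a training set); elided here.
        0.0
    });
    println!("best parameters: {best:?}");
}
```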
License
Licensed under either of
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
Contribution
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.