Trait vikos::Cost

pub trait Cost<Truth, Target = f64> {
    fn outer_derivative(&self, prediction: &Target, truth: Truth) -> Target;
    fn cost(&self, prediction: Target, truth: Truth) -> f64;
}

Representing a cost function whose value is supposed to be minimized by the training algorithm.

The cost function is a quantity that describes how deviations of the prediction from the true, observed target values should be penalized during the optimization of the prediction.

Algorithms like stochastic gradient descent use the gradient of the cost function. By the chain rule of calculus, the derivative of the cost with respect to a model coefficient is the outer derivative of the cost function with respect to the prediction, multiplied by the inner derivative of the model with respect to that coefficient. This inner derivative must be supplied as the derivative_of_model argument to Cost::gradient.
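The chain rule above can be sketched in a few lines. This is a minimal, self-contained stand-in: the trait is restated locally and LeastSquares is a hypothetical implementation mirroring the squared-error cost, not an import from the vikos crate itself.

```rust
// Local stand-in for the trait, so the example compiles on its own.
trait Cost<Truth, Target = f64> {
    fn outer_derivative(&self, prediction: &Target, truth: Truth) -> Target;
    fn cost(&self, prediction: Target, truth: Truth) -> f64;
}

struct LeastSquares;

impl Cost<f64> for LeastSquares {
    // d/d(prediction) of (prediction - truth)^2
    fn outer_derivative(&self, prediction: &f64, truth: f64) -> f64 {
        2.0 * (prediction - truth)
    }
    fn cost(&self, prediction: f64, truth: f64) -> f64 {
        (prediction - truth).powi(2)
    }
}

fn main() {
    // Linear model y = c * x; its inner derivative with respect to c is x.
    let (x, truth, c) = (3.0, 7.0, 2.0);
    let prediction = c * x;
    let derivative_of_model = x;
    // Chain rule: d(cost)/d(c) = outer derivative * inner derivative.
    let gradient = LeastSquares.outer_derivative(&prediction, truth) * derivative_of_model;
    println!("gradient = {}", gradient); // 2 * (6 - 7) * 3 = -6
}
```

A gradient-descent step would then update the coefficient against this gradient, e.g. c -= learning_rate * gradient.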

Implementations of this trait can be found in the cost module.

Required methods

fn outer_derivative(&self, prediction: &Target, truth: Truth) -> Target

The outer derivative of the cost function with respect to the prediction.

fn cost(&self, prediction: Target, truth: Truth) -> f64

Value of the cost function.
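A custom implementor only needs to supply these two methods. As a hedged sketch, here is an absolute-deviation cost; the trait is restated locally so the example is runnable on its own, and AbsoluteDeviation is an illustrative name, not a type from the crate.

```rust
// Local restatement of the trait for a self-contained example.
trait Cost<Truth, Target = f64> {
    fn outer_derivative(&self, prediction: &Target, truth: Truth) -> Target;
    fn cost(&self, prediction: Target, truth: Truth) -> f64;
}

// Hypothetical implementor penalizing the absolute error.
struct AbsoluteDeviation;

impl Cost<f64> for AbsoluteDeviation {
    // d/d(prediction) of |prediction - truth| is the sign of the error.
    fn outer_derivative(&self, prediction: &f64, truth: f64) -> f64 {
        (prediction - truth).signum()
    }
    fn cost(&self, prediction: f64, truth: f64) -> f64 {
        (prediction - truth).abs()
    }
}

fn main() {
    println!("{}", AbsoluteDeviation.cost(6.0, 7.0)); // |6 - 7| = 1
    println!("{}", AbsoluteDeviation.outer_derivative(&6.0, 7.0)); // sign(-1) = -1
}
```

Because the outer derivative is a constant sign rather than proportional to the error, such a cost penalizes outliers less aggressively than a squared-error cost.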


Implementors

impl Cost<bool, f64> for MaxLikelihood

impl Cost<f64, f64> for LeastSquares

impl Cost<f64, f64> for LeastAbsoluteDeviation

impl Cost<f64, f64> for MaxLikelihood

impl<V> Cost<usize, V> for MaxLikelihood where V: Vector
