Trait vikos::Cost

pub trait Cost<Truth> {
    fn outer_derivative(&self, prediction: f64, truth: Truth) -> f64;
    fn cost(&self, prediction: f64, truth: Truth) -> f64;

    fn gradient(&self, prediction: f64, truth: Truth, derivative_of_model: f64) -> f64 { ... }
}

A cost function whose value is to be minimized by the training algorithm.

The cost function quantifies how deviations of the prediction from the true, observed target values are penalized during optimization.

Algorithms like stochastic gradient descent use the gradient of the cost function. By the chain rule of calculus, this gradient is the outer derivative of the cost function with respect to the prediction, multiplied by the inner derivative of the model with respect to its coefficients. The inner derivative must be supplied as the derivative_of_model argument to Cost::gradient.
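The chain rule above can be sketched with a minimal squared-error cost. Note that SquaredError here is a hypothetical type for illustration; the crate's actual implementations live in its cost module, and the trait is reproduced locally so the example is self-contained.

```rust
// Sketch of the Cost trait with a hypothetical squared-error implementation.
pub trait Cost<Truth> {
    /// Outer derivative of the cost with respect to the prediction.
    fn outer_derivative(&self, prediction: f64, truth: Truth) -> f64;
    /// Value of the cost function.
    fn cost(&self, prediction: f64, truth: Truth) -> f64;
    /// Chain rule: outer derivative of the cost times the
    /// inner derivative of the model.
    fn gradient(&self, prediction: f64, truth: Truth, derivative_of_model: f64) -> f64 {
        self.outer_derivative(prediction, truth) * derivative_of_model
    }
}

/// Illustrative cost: C(p, t) = (p - t)^2, so dC/dp = 2 (p - t).
struct SquaredError;

impl Cost<f64> for SquaredError {
    fn outer_derivative(&self, prediction: f64, truth: f64) -> f64 {
        2.0 * (prediction - truth)
    }
    fn cost(&self, prediction: f64, truth: f64) -> f64 {
        (prediction - truth).powi(2)
    }
}

fn main() {
    let c = SquaredError;
    // cost(3, 1) = (3 - 1)^2 = 4
    println!("{}", c.cost(3.0, 1.0));
    // gradient = 2 * (3 - 1) * 0.5 = 2
    println!("{}", c.gradient(3.0, 1.0, 0.5));
}
```

Only outer_derivative and cost need to be written by an implementor; gradient falls out of the default method.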

Implementations of this trait can be found in the cost module.

Required Methods

fn outer_derivative(&self, prediction: f64, truth: Truth) -> f64

The outer derivative of the cost function with respect to the prediction.

fn cost(&self, prediction: f64, truth: Truth) -> f64

Value of the cost function.

Provided Methods

fn gradient(&self, prediction: f64, truth: Truth, derivative_of_model: f64) -> f64

Value of the gradient of the cost function, i.e. the derivative of the cost function with respect to the n-th coefficient at x, expressed in terms of the outer derivative Error(x) and the model derivative dY(x)/dx.

This method is called by stochastic gradient descent (SGD)-based training algorithms in order to determine the delta of the coefficients.

Implementors of this trait should implement Cost::outer_derivative and not override this method.
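As a sketch of how an SGD-style trainer might consume gradient, the loop below fits a one-coefficient model y = c * x to a single sample. The squared-error cost, the learning rate, and the update rule are illustrative assumptions, not vikos's actual training code; the trait is repeated so the example compiles on its own.

```rust
// Hypothetical SGD loop driving coefficient updates through Cost::gradient.
pub trait Cost<Truth> {
    fn outer_derivative(&self, prediction: f64, truth: Truth) -> f64;
    fn cost(&self, prediction: f64, truth: Truth) -> f64;
    fn gradient(&self, prediction: f64, truth: Truth, derivative_of_model: f64) -> f64 {
        self.outer_derivative(prediction, truth) * derivative_of_model
    }
}

/// Illustrative squared-error cost (assumption, not the crate's type).
struct SquaredError;
impl Cost<f64> for SquaredError {
    fn outer_derivative(&self, prediction: f64, truth: f64) -> f64 {
        2.0 * (prediction - truth)
    }
    fn cost(&self, prediction: f64, truth: f64) -> f64 {
        (prediction - truth).powi(2)
    }
}

fn main() {
    // Model: y = c * x, so the inner derivative dY/dc is simply x.
    let mut c = 0.0;
    let learning_rate = 0.1;
    let (x, truth) = (2.0, 4.0); // one training sample; optimum is c = 2
    for _ in 0..100 {
        let prediction = c * x;
        // Delta of the coefficient: learning rate times the gradient.
        let delta = learning_rate * SquaredError.gradient(prediction, truth, x);
        c -= delta;
    }
    println!("{:.3}", c); // converges toward 2.000
}
```

Because gradient multiplies the outer derivative by derivative_of_model internally, the trainer never needs to know which cost function is in use.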

Implementors