pub enum Loss {
MSE,
SoftmaxAndCrossEntropy,
BinaryCrossEntropy,
}
Variants§
MSE
Mean Squared Error,
defined as 0.5*(y_true - y_pred)^2,
with the derivative being (y_pred - y_true).
This is the textbook definition, but it differs from Keras and other frameworks,
whose derivative is 2*(y_pred - y_true).
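The MSE variant's math can be sketched directly from the definition above. This is a minimal illustration of the formula and its textbook derivative, not the crate's actual implementation (the function names `mse_loss` and `mse_grad` are hypothetical):

```rust
// 0.5*(y_true - y_pred)^2, as documented above.
fn mse_loss(y_true: f64, y_pred: f64) -> f64 {
    0.5 * (y_true - y_pred).powi(2)
}

// The 0.5 factor cancels the power rule's 2, leaving y_pred - y_true.
fn mse_grad(y_true: f64, y_pred: f64) -> f64 {
    y_pred - y_true
}

fn main() {
    let (y_true, y_pred) = (1.0, 0.6);
    println!("loss = {}", mse_loss(y_true, y_pred)); // ≈ 0.08
    println!("grad = {}", mse_grad(y_true, y_pred)); // ≈ -0.4
}
```

Under Keras' convention the gradient would instead be `2.0 * (y_pred - y_true)`, so learning rates tuned against one convention need halving or doubling against the other.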
SoftmaxAndCrossEntropy
Applies Softmax, then computes multi-class cross-entropy,
with the derivative with respect to the logits being y_pred - y_true.
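The softmax-then-cross-entropy pipeline, and why the gradient collapses to `y_pred - y_true`, can be sketched as follows. This is an assumed standalone illustration (the helper names `softmax` and `cross_entropy` are not from this crate), using the usual max-subtraction trick for numerical stability:

```rust
// Numerically stable softmax: shift by the max logit before exponentiating.
fn softmax(logits: &[f64]) -> Vec<f64> {
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = logits.iter().map(|&z| (z - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

// Multi-class cross-entropy against a one-hot (or soft) target distribution.
fn cross_entropy(y_true: &[f64], y_pred: &[f64]) -> f64 {
    -y_true.iter().zip(y_pred).map(|(&t, &p)| t * p.ln()).sum::<f64>()
}

fn main() {
    let logits = [2.0, 1.0, 0.1];
    let target = [1.0, 0.0, 0.0]; // one-hot: class 0 is correct
    let probs = softmax(&logits);
    let loss = cross_entropy(&target, &probs);
    // The combined derivative wrt the logits simplifies to y_pred - y_true.
    let grad: Vec<f64> = probs.iter().zip(&target).map(|(&p, &t)| p - t).collect();
    println!("loss = {loss:.4}, grad = {grad:?}");
}
```

Fusing the two operations is why the variant exposes them together: differentiating cross-entropy through softmax term by term cancels the softmax Jacobian, leaving the simple `y_pred - y_true` form stated above.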
BinaryCrossEntropy
Applies Sigmoid, then computes binary cross-entropy,
with the derivative with respect to the logits being y_pred - y_true.
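The binary case mirrors the multi-class one. A hedged sketch (the `sigmoid` and `bce` helpers are illustrative names, not this crate's API):

```rust
// Logistic sigmoid maps a logit to a probability in (0, 1).
fn sigmoid(z: f64) -> f64 {
    1.0 / (1.0 + (-z).exp())
}

// Binary cross-entropy for a single label y_true in {0, 1}.
fn bce(y_true: f64, y_pred: f64) -> f64 {
    -(y_true * y_pred.ln() + (1.0 - y_true) * (1.0 - y_pred).ln())
}

fn main() {
    let (logit, y_true) = (0.8, 1.0);
    let y_pred = sigmoid(logit);
    let loss = bce(y_true, y_pred);
    // As with softmax, the derivative wrt the logit reduces to y_pred - y_true.
    let grad = y_pred - y_true;
    println!("loss = {loss:.4}, grad = {grad:.4}");
}
```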
Implementations§
Trait Implementations§
impl Copy for Loss
impl Eq for Loss
impl StructuralPartialEq for Loss
Auto Trait Implementations§
impl Freeze for Loss
impl RefUnwindSafe for Loss
impl Send for Loss
impl Sync for Loss
impl Unpin for Loss
impl UnwindSafe for Loss
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value. Read more