pub enum LossToModelOutputsDerivativesComputationError {
NotInitialized,
NoCommandQueue,
OpenCL(ClError),
SumOutputsPerSmaple(ReduceOutputsPerSampleError),
OutputsAndExpectedOutputsDoNotMatch,
TrainingDataDoesNotHaveExpectedSamplesAmount,
KernelNotFound(KernelNotFoundError),
ProgramNotFound(ProgramNotFoundError),
BufferOperation(BufferOperationError),
}
An enum containing all of the possible errors that can happen when computing the derivatives of a Model's loss with respect to its outputs in order to do gradient descent on it.
Variants
NotInitialized
Happens when the LossFunction trait object was not initialized.
NoCommandQueue
Happens when there is no command queue in the OpenCLState.
OpenCL(ClError)
Happens when something goes wrong with OpenCL.
SumOutputsPerSmaple(ReduceOutputsPerSampleError)
Happens when something goes wrong while trying to sum the outputs for each sample separately (used in the Categorical Cross Entropy loss function).
OutputsAndExpectedOutputsDoNotMatch
Happens when the expected outputs and the actual outputs do not match in size.
TrainingDataDoesNotHaveExpectedSamplesAmount
Happens when the given training data does not have the amount of samples specified inside of it.
KernelNotFound(KernelNotFoundError)
Happens when a required kernel was not found.
ProgramNotFound(ProgramNotFoundError)
Happens when a required program was not found.
BufferOperation(BufferOperationError)
Happens when a buffer operation goes wrong.
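A caller will typically exhaustively `match` on this enum to turn each failure into a diagnostic message. The sketch below is self-contained: the unit structs stand in for the real `ClError`, `ReduceOutputsPerSampleError`, `KernelNotFoundError`, `ProgramNotFoundError`, and `BufferOperationError` types from the crate, and the `describe` helper is a hypothetical function, not part of the library's API. Note that `SumOutputsPerSmaple` is spelled exactly as the crate declares it.

```rust
// Stub stand-ins for the crate's wrapped error types, so the sketch compiles on its own.
#[derive(Debug)]
struct ClError;
#[derive(Debug)]
struct ReduceOutputsPerSampleError;
#[derive(Debug)]
struct KernelNotFoundError;
#[derive(Debug)]
struct ProgramNotFoundError;
#[derive(Debug)]
struct BufferOperationError;

#[derive(Debug)]
pub enum LossToModelOutputsDerivativesComputationError {
    NotInitialized,
    NoCommandQueue,
    OpenCL(ClError),
    SumOutputsPerSmaple(ReduceOutputsPerSampleError),
    OutputsAndExpectedOutputsDoNotMatch,
    TrainingDataDoesNotHaveExpectedSamplesAmount,
    KernelNotFound(KernelNotFoundError),
    ProgramNotFound(ProgramNotFoundError),
    BufferOperation(BufferOperationError),
}

use LossToModelOutputsDerivativesComputationError as DerivErr;

/// Hypothetical helper: map each variant to a short user-facing message.
fn describe(err: &DerivErr) -> String {
    match err {
        DerivErr::NotInitialized => "loss function was not initialized".into(),
        DerivErr::NoCommandQueue => "no command queue in the OpenCLState".into(),
        DerivErr::OpenCL(e) => format!("OpenCL error: {:?}", e),
        DerivErr::SumOutputsPerSmaple(e) => format!("per-sample reduction failed: {:?}", e),
        DerivErr::OutputsAndExpectedOutputsDoNotMatch => {
            "outputs and expected outputs differ in size".into()
        }
        DerivErr::TrainingDataDoesNotHaveExpectedSamplesAmount => {
            "training data does not have the specified amount of samples".into()
        }
        DerivErr::KernelNotFound(e) => format!("required kernel not found: {:?}", e),
        DerivErr::ProgramNotFound(e) => format!("required program not found: {:?}", e),
        DerivErr::BufferOperation(e) => format!("buffer operation failed: {:?}", e),
    }
}

fn main() {
    let err = DerivErr::NotInitialized;
    println!("{}", describe(&err));
}
```

Because the `match` is exhaustive, adding a new variant to the enum in a future crate version will surface as a compile error at every such call site, which is usually preferable to a silent catch-all arm.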