Crate ksnn

ksnn

ksnn, or Kosiorek’s Simple Neural Networks, is a crate that simplifies the creation, training, and validation of neural networks. The crate is heavily inspired by “Neural Networks from Scratch in Python” by Harrison Kinsley & Daniel Kukieła.

Crate TODOs

  • Improving crate efficiency, likely through multithreading or moving calculations from the CPU to the GPU
  • Addition of more network types, such as regression networks
  • Addition of more activation and entropy functions

Examples

How to create a ClassificationNetwork:

// Step 1: Get training data and the answers for that data, formatted as 2d arrays.
let x_train = ndarray::arr2(&[
    [0.7, 0.29, 1.0, 0.55, 0.33, 0.27],
    [0.01, 0.08, 0.893, 0.14, 0.19, 0.98]
]);
 
let y_train = ndarray::arr2(&[
    [0, 0, 1],
    [0, 1, 0]
]);
 
// Step 2: Get testing data and the answers for that data, formatted as 2d arrays.
let x_test = ndarray::arr2(&[
    [0.64, 0.456, 0.68, 0.1, 0.123, 0.32],
    [0.78, 0.56, 0.58, 0.12, 0.37, 0.46]
]);
 
let y_test = ndarray::arr2(&[
    [1, 0, 0],
    [0, 1, 0]
]);
 
// Step 3: Create the network.
let mut neural_network = ksnn::ClassificationNetwork::new(
    // The activation function for each layer, ending with the output layer's softmax/loss.
    vec!["ActivationReLU", "ActivationReLU", "ActivationReLU", "SoftmaxLossCC"],
    // The number of neurons in each layer; the final layer matches the 3 answer classes.
    vec![32, 64, 48, 3],
    // Whether the network should include dropout layers.
    ksnn::enable_dropout_layers(true),
    // The optimizer to train with, here Adam with default settings.
    ksnn::optimizers::optimizer_adam_def(),
    &x_train,
);

// Step 4: Adjust dropout layers, if enabled.
neural_network.dropout_layers[0].rate = 0.8;
neural_network.dropout_layers[1].rate = 0.75;
neural_network.dropout_layers[2].rate = 0.9;
 
// Step 5: Adjust weight regularizers as desired.
neural_network.dense_layers[0].weight_regularizer_l2 = 5e-4;
neural_network.dense_layers[0].bias_regularizer_l2 = 5e-4;
 
neural_network.dense_layers[1].weight_regularizer_l2 = 5e-3;
neural_network.dense_layers[1].bias_regularizer_l2 = 5e-3;
 
neural_network.dense_layers[2].weight_regularizer_l2 = 5e-5;
neural_network.dense_layers[2].bias_regularizer_l2 = 5e-5;
 
// Step 6: Fit, or train, the network on the training data.
neural_network.fit(100, 1, x_train, y_train);
// Step 7: Test your trained network on data it hasn't seen before to see how well it
// deals with new information.
neural_network.validate(x_test, y_test);
// Step 8: Save your network to a file to be loaded and used later.
neural_network.save("my_network.json");

How to load a ClassificationNetwork:

let mut neural_network = ksnn::ClassificationNetwork::load("my_network.json");

Modules

activation_functions

conversion_functions

network_layers

optimizers

Structs

ClassificationNetwork

Stores all information needed for training and validating a classification-focused network. The network has parameters that can be set manually, such as an individual layer’s bias and weight L1/L2 regularizers, an individual layer’s dropout rate (if dropout layers have been enabled), and whether training progress should be displayed as a progress bar or the loss and accuracy printed to the terminal.
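
These per-layer settings are the same fields used in Steps 4 and 5 of the example above; a brief sketch of adjusting them after construction (the progress-display toggle is left out here because its field name is not shown on this page):

// Assumes `neural_network` was created as in the example above, with dropout layers enabled.
neural_network.dropout_layers[0].rate = 0.85;
neural_network.dense_layers[0].weight_regularizer_l2 = 5e-4;
neural_network.dense_layers[0].bias_regularizer_l2 = 5e-4;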

Functions

Takes in a bool value and outputs that same value.
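
This appears to describe enable_dropout_layers, which is used in Step 3 of the example above; a minimal usage sketch:

// Passing `true` asks ClassificationNetwork::new to create dropout layers;
// the bool itself is returned unchanged.
let use_dropout = ksnn::enable_dropout_layers(true);
assert!(use_dropout);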

Takes in a reference to a one-hot encoded 2d array that contains answer data, often marked as some form of y, and returns the number of classes found in that array.
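
In a one-hot encoded answer array each column corresponds to one class, so the class count is simply the number of columns. A rough illustration of what this function computes (not a call to the crate’s API, since the function’s name is not shown on this page):

// `y_train` one-hot encodes 3 classes across its 3 columns.
let y_train = ndarray::arr2(&[
    [0, 0, 1],
    [0, 1, 0]
]);
assert_eq!(y_train.ncols(), 3);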