Crate tinygrad


§tinygrad

tinygrad is a crate for building and training neural networks in Rust. It provides a simple interface for defining tensors, performing forward and backward passes, and implementing basic operations such as dot products and summation.

§Quick Start

Get started with the tinygrad library by following these simple steps:

  1. Install the tinygrad crate by adding it as a dependency in your Cargo.toml file:
[dependencies]
tinygrad = "0.1.0"
  2. Use the Tensor struct and the TensorTrait trait to create and work with tensors:
use ndarray::{array, Array1};
use tinygrad::{Tensor, Context, TensorTrait};

// Create a tensor
let value = array![1.0, 2.0, 3.0];
let tensor = Tensor::new(value);

// Perform forward and backward passes
let mut ctx = Context::new();
let result = tensor.forward(&mut ctx, vec![tensor.get_value()]);
tensor.backward(&mut ctx, array![1.0, 1.0, 1.0].view());
  3. Implement custom operations by defining structs that implement the ForwardBackward trait:
use ndarray::ArrayView1;
use tinygrad::{ForwardBackward, Context, TensorTrait};

// Example operation: Dot product
struct Dot;

impl ForwardBackward for Dot {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        let input = &inputs[0];
        let weight = &inputs[1];
        input.dot(weight)
    }

    fn backward(&self, _ctx: &mut Context, _grad_output: ArrayView1<f64>) {
        // Implement backward pass
        // ...
    }
}
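The Dot example above leaves the backward pass as a stub. As a self-contained sketch of the math it would implement, here is the same operation written against plain slices instead of tinygrad's Context and ndarray types (the function names below are illustrative, not part of the crate's API). For y = input · weight, the partial derivatives are ∂y/∂input_i = weight_i and ∂y/∂weight_i = input_i, so each gradient is simply the other operand scaled by the upstream gradient:

```rust
// Standalone sketch of the dot product's forward and backward math,
// using Vec<f64>/&[f64] rather than tinygrad's Context and ArrayView1 types.

fn dot_forward(input: &[f64], weight: &[f64]) -> f64 {
    // y = sum_i input_i * weight_i
    input.iter().zip(weight).map(|(x, w)| x * w).sum()
}

// For y = input · weight, dy/dinput_i = weight_i and dy/dweight_i = input_i,
// so each gradient is the other operand scaled by grad_output (chain rule).
fn dot_backward(input: &[f64], weight: &[f64], grad_output: f64) -> (Vec<f64>, Vec<f64>) {
    let grad_input = weight.iter().map(|w| w * grad_output).collect();
    let grad_weight = input.iter().map(|x| x * grad_output).collect();
    (grad_input, grad_weight)
}

fn main() {
    let input = vec![1.0, 2.0, 3.0];
    let weight = vec![4.0, 5.0, 6.0];
    let y = dot_forward(&input, &weight); // 1*4 + 2*5 + 3*6 = 32
    let (grad_input, grad_weight) = dot_backward(&input, &weight, 1.0);
    println!("y = {y}, grad_input = {grad_input:?}, grad_weight = {grad_weight:?}");
}
```

In the real crate, the forward pass would also stash input and weight in the Context so that backward can recover them later; that is what the ctx parameter is for.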

§GitHub Repository

You can access the source code for the tinygrad crate on GitHub.

§Contributing

Contributions and feedback are welcome! If you’d like to contribute, report an issue, or suggest an enhancement, please engage with the project on GitHub. Your contributions help improve this crate for the community.

Structs§

  • Context — Represents the computation context, storing tensors for backward pass computations.
  • Tensor — Represents a basic implementation of a tensor.

Traits§

  • ForwardBackward — This trait defines the interface for operations that have both forward and backward passes.
  • TensorTrait — This trait defines the common interface for tensors in a computational graph.
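To picture how these two interfaces fit together, here is a simplified, self-contained mirror of the design using Vec<f64> in place of ndarray views (the type and trait bodies below are hypothetical sketches, not the crate's actual definitions; the real signatures use ArrayView1<f64>). It implements summation, the other basic operation mentioned above, whose gradient is the upstream gradient broadcast over the input since ∂(Σx_i)/∂x_j = 1:

```rust
// Hypothetical, simplified mirror of the Context/ForwardBackward design,
// for illustration only; not the crate's real definitions.

struct Context {
    saved: Vec<Vec<f64>>, // tensors stashed during forward for use in backward
}

trait ForwardBackward {
    fn forward(&self, ctx: &mut Context, inputs: Vec<Vec<f64>>) -> f64;
    fn backward(&self, ctx: &mut Context, grad_output: f64) -> Vec<Vec<f64>>;
}

struct Sum;

impl ForwardBackward for Sum {
    fn forward(&self, ctx: &mut Context, inputs: Vec<Vec<f64>>) -> f64 {
        ctx.saved = inputs.clone(); // remember the inputs for the backward pass
        inputs[0].iter().sum()
    }

    // d(sum(x))/dx_i = 1, so the input gradient is grad_output
    // broadcast to the shape of the saved input.
    fn backward(&self, ctx: &mut Context, grad_output: f64) -> Vec<Vec<f64>> {
        vec![vec![grad_output; ctx.saved[0].len()]]
    }
}

fn main() {
    let mut ctx = Context { saved: Vec::new() };
    let y = Sum.forward(&mut ctx, vec![vec![1.0, 2.0, 3.0]]);
    let grads = Sum.backward(&mut ctx, 1.0);
    println!("y = {y}, grads = {grads:?}"); // y = 6.0, grads = [[1.0, 1.0, 1.0]]
}
```

The key design point this sketch captures: the Context is the bridge between the two passes, so forward saves whatever backward will later need, and backward returns one gradient per input.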