tinygrad 0.1.0

You like pytorch? You like micrograd? You love tinygrad! ❤️

✨️ tinygrad


A Rust crate for building and training neural networks. tinygrad provides a simple interface for defining tensors, performing forward and backward passes, and implementing basic operations such as dot products and summation.

🚀 Quick Start

Get started with the tinygrad library by following these simple steps:

  1. Install the tinygrad crate by adding the following to your Cargo.toml file:
[dependencies]
tinygrad = "0.1.0"
  2. Use the Tensor type and the TensorTrait trait to create tensors and run forward and backward passes:
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait};

// Create a tensor
let value = array![1.0, 2.0, 3.0];
let tensor = Tensor::new(value);

// Perform forward and backward passes
let mut ctx = Context::new();
let result = tensor.forward(&mut ctx, vec![tensor.get_value()]);
tensor.backward(&mut ctx, array![1.0, 1.0, 1.0].view());
println!("Forward result: {:?}", result);
  3. Implement custom operations by defining structs that implement the ForwardBackward trait:
use ndarray::ArrayView1;
use tinygrad::{Context, ForwardBackward};

// Example operation: Dot product
struct Dot;

impl ForwardBackward for Dot {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        let input = &inputs[0];
        let weight = &inputs[1];
        input.dot(weight)
    }

    fn backward(&self, _ctx: &mut Context, _grad_output: ArrayView1<f64>) {
        // Implement backward pass
        // ...
    }
}
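The backward stub above needs to produce a gradient for each input. For y = x · w with an upstream gradient g, calculus gives grad_x = g · w and grad_w = g · x. The math can be sketched independently of the crate on plain slices; `dot_backward` below is an illustrative helper (and g is taken as a scalar for simplicity), not part of tinygrad's API:

```rust
// Gradients of y = x . w (dot product) with respect to x and w.
// With a scalar upstream gradient g: grad_x = g * w, grad_w = g * x.
fn dot_backward(x: &[f64], w: &[f64], g: f64) -> (Vec<f64>, Vec<f64>) {
    let grad_x: Vec<f64> = w.iter().map(|wi| g * wi).collect();
    let grad_w: Vec<f64> = x.iter().map(|xi| g * xi).collect();
    (grad_x, grad_w)
}

fn main() {
    let (gx, gw) = dot_backward(&[1.0, 2.0, 3.0], &[4.0, 5.0, 6.0], 1.0);
    println!("{:?} {:?}", gx, gw); // with g = 1: grad_x equals w, grad_w equals x
}
```

This is the computation a complete backward implementation would perform before accumulating the results into each input tensor's gradient.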

🔧 Usage Example

use ndarray::array;
use tinygrad::{Context, ForwardBackward, Tensor, TensorTrait};

// `Dot` is the custom operation defined in the Quick Start section above.

fn main() {
    let input = array![1.0, 2.0, 3.0];
    let weight = array![4.0, 5.0, 6.0];

    let input_tensor = Box::new(Tensor::new(input));
    let weight_tensor = Box::new(Tensor::new(weight));

    let dot_fn = Dot;
    let mut ctx = Context::new();

    let inputs = vec![
        input_tensor.get_value(),
        weight_tensor.get_value(),
    ];
    let output = dot_fn.forward(&mut ctx, inputs);

    println!("Dot product: {:?}", output);

    let grad_output = array![1.0, 1.0, 1.0];
    dot_fn.backward(&mut ctx, grad_output.view());

    let grad_input = input_tensor.grad.clone();
    let grad_weight = weight_tensor.grad.clone();

    println!("Gradient for input: {:?}", grad_input);
    println!("Gradient for weight: {:?}", grad_weight);
}
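The forward result in this example is easy to verify by hand: 1·4 + 2·5 + 3·6 = 32. A standalone sanity check of that arithmetic in plain Rust, with no dependency on the crate:

```rust
fn main() {
    let input = [1.0, 2.0, 3.0];
    let weight = [4.0, 5.0, 6.0];
    // Dot product: sum of elementwise products.
    let dot: f64 = input.iter().zip(weight.iter()).map(|(a, b)| a * b).sum();
    println!("Dot product: {}", dot); // 32
    assert_eq!(dot, 32.0);
}
```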

🧪 Testing

Run tests for the tinygrad crate using:

cargo test
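Tests discovered by cargo test live alongside the code in a #[cfg(test)] module. A minimal sketch of that shape, using an illustrative stand-in dot helper rather than tinygrad's actual API:

```rust
// A stand-in helper used only to demonstrate the test layout.
fn dot(x: &[f64], w: &[f64]) -> f64 {
    x.iter().zip(w).map(|(a, b)| a * b).sum()
}

#[cfg(test)]
mod tests {
    use super::*;

    // Picked up automatically by `cargo test`.
    #[test]
    fn dot_of_known_vectors() {
        assert_eq!(dot(&[1.0, 2.0, 3.0], &[4.0, 5.0, 6.0]), 32.0);
    }
}
```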

🌐 GitHub Repository

You can access the source code for the tinygrad crate on GitHub.

🤝 Contributing

Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on GitHub. Your contributions help improve this crate for the community.

📘 Documentation

Full documentation for tinygrad is available on docs.rs.

📄 License

This project is licensed under the MIT License.