tinygrad

Crates.io: tinygrad
lib.rs: tinygrad
version: 0.1.0
source: src
created_at: 2023-12-29 20:47:46.442738
updated_at: 2024-02-19 21:59:29.335466
description: You like pytorch? You like micrograd? You love tinygrad! ❤️
homepage:
repository: https://github.com/wiseaidev/tinygrad
max_upload_size:
id: 1083626
size: 17,201
Mahmoud (wiseaidev)

documentation: https://docs.rs/tinygrad

README

โœจ๏ธ tinygrad


A Rust crate for building and training neural networks. tinygrad provides a simple interface for defining tensors, performing forward and backward passes, and implementing basic operations such as dot products and summation.

🚀 Quick Start

Get started with the tinygrad library by following these simple steps:

  1. Install the tinygrad crate by adding the following line to your Cargo.toml file:
[dependencies]
tinygrad = "0.1.0"
  2. Use the Tensor type and the TensorTrait trait to create and work with tensors:
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait};

// Create a tensor
let value = array![1.0, 2.0, 3.0];
let tensor = Tensor::new(value);

// Perform forward and backward passes
let mut ctx = Context::new();
let result = tensor.forward(&mut ctx, vec![tensor.get_value()]);
tensor.backward(&mut ctx, array![1.0, 1.0, 1.0].view());
  3. Implement custom operations by defining structs that implement the ForwardBackward trait:
use ndarray::ArrayView1;
use tinygrad::{ForwardBackward, Context, TensorTrait};

// Example operation: Dot product
struct Dot;

impl ForwardBackward for Dot {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        let input = &inputs[0];
        let weight = &inputs[1];
        input.dot(weight)
    }

    fn backward(&self, _ctx: &mut Context, _grad_output: ArrayView1<f64>) {
        // Implement backward pass
        // ...
    }
}
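The backward pass for Dot is left unimplemented above. As a hedged sketch of the math it would compute, here is a standalone version using plain Vec<f64> instead of the crate's Context and ArrayView1 types; the names `dot` and `dot_backward` are illustrative and not part of tinygrad's API. For y = input · weight (a scalar), dy/d(input) = weight and dy/d(weight) = input, each scaled by the upstream gradient:

```rust
// Dot product and its gradient, in plain Rust with no dependencies.
fn dot(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

// For a scalar output y = a · b with upstream gradient grad_output:
//   dy/da = grad_output * b, dy/db = grad_output * a.
fn dot_backward(a: &[f64], b: &[f64], grad_output: f64) -> (Vec<f64>, Vec<f64>) {
    let grad_a: Vec<f64> = b.iter().map(|x| grad_output * x).collect();
    let grad_b: Vec<f64> = a.iter().map(|x| grad_output * x).collect();
    (grad_a, grad_b)
}

fn main() {
    let input = vec![1.0, 2.0, 3.0];
    let weight = vec![4.0, 5.0, 6.0];
    let y = dot(&input, &weight); // 1*4 + 2*5 + 3*6 = 32
    let (grad_input, grad_weight) = dot_backward(&input, &weight, 1.0);
    println!("y = {y}, grad_input = {grad_input:?}, grad_weight = {grad_weight:?}");
}
```

A real implementation would store the forward inputs in the Context during forward and read them back in backward, accumulating the results into each tensor's grad field.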

🔧 Usage Example

use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait, ForwardBackward};

// `Dot` is the custom ForwardBackward operation defined in the Quick Start above.

fn main() {
    let input = array![1.0, 2.0, 3.0];
    let weight = array![4.0, 5.0, 6.0];

    let input_tensor = Box::new(Tensor::new(input));
    let weight_tensor = Box::new(Tensor::new(weight));

    let dot_fn = Dot;
    let mut ctx = Context::new();

    let inputs = vec![
        input_tensor.get_value(),
        weight_tensor.get_value(),
    ];
    let output = dot_fn.forward(&mut ctx, inputs);

    println!("Dot product: {:?}", output);

    let grad_output = array![1.0, 1.0, 1.0];
    dot_fn.backward(&mut ctx, grad_output.view());

    let grad_input = input_tensor.grad.clone();
    let grad_weight = weight_tensor.grad.clone();

    println!("Gradient for input: {:?}", grad_input);
    println!("Gradient for weight: {:?}", grad_weight);
}
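A quick way to check that a hand-written backward pass like Dot's is correct is a finite-difference test: perturb one input component and compare the observed change in the output against the analytic gradient. The sketch below is independent of tinygrad's API (plain Vec<f64>, a local `dot` helper) and just demonstrates the check for the dot product, whose gradient with respect to input[i] is weight[i]:

```rust
// Finite-difference check of the dot-product gradient.
fn dot(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    let input = vec![1.0, 2.0, 3.0];
    let weight = vec![4.0, 5.0, 6.0];
    let eps = 1e-6;

    for i in 0..input.len() {
        // Numeric gradient: (f(x + eps * e_i) - f(x)) / eps.
        let mut bumped = input.clone();
        bumped[i] += eps;
        let numeric = (dot(&bumped, &weight) - dot(&input, &weight)) / eps;

        // Analytic gradient of a dot product w.r.t. input[i] is weight[i].
        let analytic = weight[i];
        assert!((numeric - analytic).abs() < 1e-3);
        println!("d y / d input[{i}]: numeric {numeric:.4}, analytic {analytic}");
    }
}
```

The same pattern works for any custom ForwardBackward operation: if the numeric and analytic gradients disagree beyond floating-point tolerance, the backward implementation has a bug.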

🧪 Testing

Run tests for the tinygrad crate using:

cargo test

๐ŸŒ GitHub Repository

You can access the source code for the tinygrad crate on GitHub.

๐Ÿค Contributing

Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on GitHub. Your contributions help improve this crate for the community.

📘 Documentation

Full documentation for tinygrad is available on docs.rs.

📄 License

This project is licensed under the MIT License.
