Crates.io | tinygrad |
lib.rs | tinygrad |
version | 0.1.0 |
source | src |
created_at | 2023-12-29 20:47:46.442738 |
updated_at | 2024-02-19 21:59:29.335466 |
description | You like pytorch? You like micrograd? You love tinygrad! ❤️ |
homepage | |
repository | https://github.com/wiseaidev/tinygrad |
max_upload_size | |
id | 1083626 |
size | 17,201 |
A Rust crate for building and training neural networks. tinygrad provides a simple interface for defining tensors, performing forward and backward passes, and implementing basic operations such as dot products and summation.
Get started with the tinygrad library by following these simple steps:

1. Install the `tinygrad` crate by adding the following line to your `Cargo.toml` file:

```toml
[dependencies]
tinygrad = "0.1.0"
```
2. Use the `Tensor` and `ForwardBackward` traits to create and work with tensors:

```rust
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait};

// Create a tensor
let value = array![1.0, 2.0, 3.0];
let tensor = Tensor::new(value);

// Perform forward and backward passes
let mut ctx = Context::new();
let result = tensor.forward(&mut ctx, vec![tensor.get_value()]);
tensor.backward(&mut ctx, array![1.0, 1.0, 1.0].view());
```
3. Implement custom operations with the `ForwardBackward` trait:

```rust
use ndarray::ArrayView1;
use tinygrad::{ForwardBackward, Context, TensorTrait};

// Example operation: dot product
struct Dot;

impl ForwardBackward for Dot {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        let input = &inputs[0];
        let weight = &inputs[1];
        input.dot(weight)
    }

    fn backward(&self, ctx: &mut Context, grad_output: ArrayView1<f64>) {
        // Implement backward pass
        // ...
    }
}
```
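The `backward` stub above leaves the gradient math unstated. As a self-contained sketch of what a dot-product backward pass has to compute (plain Rust with a hypothetical helper, not the crate's actual API): for `y = input · weight`, the partials are `∂y/∂input_i = weight_i` and `∂y/∂weight_i = input_i`, each scaled by the upstream gradient.

```rust
// Hypothetical helper, not part of tinygrad: gradient of y = input · weight.
// For a scalar upstream gradient g, d(y)/d(input_i) = weight_i * g and
// d(y)/d(weight_i) = input_i * g.
fn dot_backward(input: &[f64], weight: &[f64], grad_output: f64) -> (Vec<f64>, Vec<f64>) {
    let grad_input = weight.iter().map(|w| w * grad_output).collect();
    let grad_weight = input.iter().map(|x| x * grad_output).collect();
    (grad_input, grad_weight)
}

fn main() {
    let (gi, gw) = dot_backward(&[1.0, 2.0, 3.0], &[4.0, 5.0, 6.0], 1.0);
    println!("grad_input = {:?}, grad_weight = {:?}", gi, gw);
}
```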
A complete example that wires the pieces together:

```rust
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait, ForwardBackward};

fn main() {
    let input = array![1.0, 2.0, 3.0];
    let weight = array![4.0, 5.0, 6.0];

    let input_tensor = Box::new(Tensor::new(input));
    let weight_tensor = Box::new(Tensor::new(weight));

    let dot_fn = Dot;
    let mut ctx = Context::new();
    let inputs = vec![
        input_tensor.get_value(),
        weight_tensor.get_value(),
    ];

    let output = dot_fn.forward(&mut ctx, inputs);
    println!("Dot product: {:?}", output);

    let grad_output = array![1.0, 1.0, 1.0];
    dot_fn.backward(&mut ctx, grad_output.view());

    let grad_input = input_tensor.grad.clone();
    let grad_weight = weight_tensor.grad.clone();
    println!("Gradient for input: {:?}", grad_input);
    println!("Gradient for weight: {:?}", grad_weight);
}
```
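The description also lists summation among the basic operations. Its gradient math is even simpler than the dot product's: for `y = Σ xᵢ`, `∂y/∂xᵢ = 1`, so the backward pass just broadcasts the upstream gradient across the input. A minimal sketch in plain Rust (hypothetical helpers, not the crate's API):

```rust
// Hypothetical helpers, not part of tinygrad: forward and backward for a
// summation op. Since d(sum)/d(x_i) = 1, backward returns the upstream
// gradient repeated once per input element.
fn sum_forward(input: &[f64]) -> f64 {
    input.iter().sum()
}

fn sum_backward(input_len: usize, grad_output: f64) -> Vec<f64> {
    vec![grad_output; input_len]
}

fn main() {
    let x = [1.0, 2.0, 3.0];
    println!("sum = {}", sum_forward(&x));
    println!("grad = {:?}", sum_backward(x.len(), 1.0));
}
```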
Run tests for the `tinygrad` crate using:

```shell
cargo test
```
You can access the source code for the `tinygrad` crate on GitHub.
Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on GitHub. Your contributions help improve this crate for the community.
Full documentation for `tinygrad` is available on docs.rs.
This project is licensed under the MIT License.