Crates.io | Rust_Simple_DNN |
lib.rs | Rust_Simple_DNN |
version | 0.1.5 |
source | src |
created_at | 2023-12-24 03:15:21.224323 |
updated_at | 2023-12-27 21:55:13.449353 |
description | A crate for making optimized modular neural networks in rust |
homepage | |
repository | https://github.com/TrevorBlythe/Rusty-DNN |
max_upload_size | |
id | 1079479 |
size | 98,340 |
Create Modular Deep Neural Networks in Rust, the Easy Way
If literally anyone stars this project I will add convolutional layers, more activations, and deconv layers. If this project gets 20 stars, I'll add everything.
After running
cargo add Rust_Simple_DNN
add these imports to your Rust code:
use Rust_Simple_DNN::rdnn::layers::*;
use Rust_Simple_DNN::rdnn::*;
Think of layers as building blocks for a neural network. Different layers process data in different ways, so it's important to choose the right ones for your situation (e.g. conv layers for image processing).
FC::new(inputSize, outputSize)
These are best for straight, raw data processing. Combined with activations, they can in principle model anything you want, but their computation grows roughly quadratically with layer width, so they get expensive quickly when scaled up.
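For example, the output size of one FC layer has to match the input size of the next. A minimal sketch (assuming the two use lines above; the weight counts are rough assumptions about the internals, not measured from the crate):

let first = FC::new(3, 16);  // 3 inputs -> 16 outputs (~3 * 16 weights)
let second = FC::new(16, 1); // 16 inputs -> 1 output (~16 * 1 weights)
// Doubling both sizes of an FC layer roughly quadruples its weight count,
// which is why large FC stacks get expensive quickly.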
Tanh::new(inputSize); // hyperbolic tangent
Relu::new(inputSize); // rectified linear unit: passes the activation only if it is > 0
Sig::new(inputSize); // sigmoid
Put these after FC, Conv, Deconv, or any dot-product-style layer to make the network nonlinear; without them, the network will not work for 99% of use cases.
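To see why, note that two FC layers stacked directly compose into a single linear map, so together they are no more expressive than one FC layer; an activation in between breaks that. A small sketch, reusing only the constructors shown above:

// Purely linear: FC(3, 4) followed directly by FC(4, 1) collapses into one
// linear function of the input, so it cannot learn XOR-style patterns.
let linear_net = Net::new(vec![FC::new(3, 4), FC::new(4, 1)], 1, 0.1);

// Nonlinear: the Tanh between the two FC layers makes the whole network nonlinear.
let nonlinear_net = Net::new(vec![FC::new(3, 4), Tanh::new(4), FC::new(4, 1)], 1, 0.1);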
Here is how you make a small network with 3 inputs, two hidden layers of 4 neurons each, and 1 output:
// FC layers are dense (fully connected) layers.
// Sig layers are sigmoid activations.
let mut net = Net::new(
    vec![
        FC::new(3, 4), // input 3, output 4
        Sig::new(4),   // sigmoid, input 4, output 4
        FC::new(4, 4),
        Sig::new(4),   // sigmoid
        FC::new(4, 1), // input 4, output 1
        Sig::new(1),   // sigmoid
    ],
    1,   // batch size
    0.1, // learning rate
);
//"net" is the variable representing your entire network
net.forward_data(&vec![1.0, 0.0, -69.0]); //returns the output vector
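For example, you can capture and print the result (the exact element type of the returned vector is an assumption; {:?} will print it either way):

let output = net.forward_data(&vec![1.0, 0.0, -69.0]);
println!("network output: {:?}", output); // one value per output neuron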
After propagating some data forward, you can backpropagate a target like this:
net.backward_data(&vec![0.0]); //a vector of what you want the nn to output
The network automatically stores and applies the gradients, so to train it, all you need to do is repeatedly forward- and back-propagate your data:
for _ in 0..5000 {
    net.forward_data(&vec![1.0, 0.0, 0.0]);
    net.backward_data(&vec![1.0]);
    net.forward_data(&vec![1.0, 1.0, 0.0]);
    net.backward_data(&vec![0.0]);
    net.forward_data(&vec![0.0, 1.0, 0.0]);
    net.backward_data(&vec![1.0]);
    net.forward_data(&vec![0.0, 0.0, 0.0]);
    net.backward_data(&vec![0.0]);
}
// at this point it's trained (although this dataset is pretty useless lol)
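To sanity-check the trained network, you can run the four training inputs forward again and compare against the targets from the loop (a sketch; exact numbers will vary from run to run):

// Each output should be close to the target used during training.
println!("{:?}", net.forward_data(&vec![1.0, 0.0, 0.0])); // target was 1.0
println!("{:?}", net.forward_data(&vec![1.0, 1.0, 0.0])); // target was 0.0
println!("{:?}", net.forward_data(&vec![0.0, 1.0, 0.0])); // target was 1.0
println!("{:?}", net.forward_data(&vec![0.0, 0.0, 0.0])); // target was 0.0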
This is what PyTorch would look like if it wasn't needlessly complicated, hahaha.