Rust_Simple_DNN

version: 0.1.5
created_at: 2023-12-24 03:15:21
updated_at: 2023-12-27 21:55:13
description: A crate for making optimized modular neural networks in Rust
repository: https://github.com/TrevorBlythe/Rusty-DNN
size: 98,340
owner: Trevor Blythe (TrevorBlythe)

README

Rust DNN

Create modular deep neural networks in Rust, easily.

In progress

If literally anyone stars this project, I will add convolutional layers, more activations, and deconv layers. If this project gets 20 stars, I will add everything.

Installation

First, run

cargo add Rust_Simple_DNN

then add these imports to your Rust code:

use Rust_Simple_DNN::rdnn::layers::*;
use Rust_Simple_DNN::rdnn::*;

Currently Implemented Layers

Think of layers as the building blocks of a neural network. Different layers process data in different ways, so it's important to choose the right ones for your situation (e.g., convolutional layers for image processing).

layers:

  • Fully connected (dense) layers
FC::new(input_size, output_size)

These are best for straightforward processing of raw data. Combined with activations, they can in theory model almost any function you want. Be aware that their cost grows quickly when scaled up, though: a dense layer stores one weight per input/output pair, so computation grows with input_size × output_size.


  • Activations
Tanh::new(input_size); // hyperbolic tangent
Relu::new(input_size); // rectified linear unit: max(0, x)
Sig::new(input_size); // sigmoid

Place one of these after FC, Conv, Deconv, or any other dot-product-style layer to make the network nonlinear; without an activation, stacked linear layers collapse into a single linear transformation, and the network will not work for 99% of use cases. (The sketch after this list shows what each activation computes.)
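
For reference, here is a minimal, self-contained sketch of the math each activation performs element-wise. These are plain free functions for illustration only; they are not the crate's API, which applies the same math as layers (Tanh, Relu, Sig):

// Illustrative element-wise activation functions (not the crate's API).
fn relu(x: f32) -> f32 {
    x.max(0.0) // passes positive values through, zeroes out the rest
}

fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp()) // squashes any input into (0, 1)
}

fn main() {
    for x in [-2.0_f32, 0.0, 2.0] {
        println!("x = {x}: relu = {}, sigmoid = {:.3}, tanh = {:.3}",
            relu(x), sigmoid(x), x.tanh()); // tanh is built into f32
    }
}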

Mini tutorial

This is how you make a neural network like the one pictured in the GitHub README: a small fully connected network with layer sizes 3 → 4 → 4 → 1.

Use this code to make it:

// FC layers are dense layers.
// Sig layers are sigmoid activations.
let mut net = Net::new(
    vec![
        FC::new(3, 4), // input 3, output 4
        Sig::new(4),   // sigmoid: input 4, output 4

        FC::new(4, 4),
        Sig::new(4),   // sigmoid

        FC::new(4, 1), // input 4, output 1
        Sig::new(1),   // sigmoid
    ],
    1,   // batch size
    0.1, // learning rate
);
// "net" is the variable representing your entire network

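One constraint worth calling out: each layer's input size must match the previous layer's output size. As a sketch of how shapes chain (the sizes below are arbitrary; the snippet simply reuses the Net::new constructor and the activation layers listed above):

// Hypothetical 3 -> 16 -> 16 -> 1 network; sizes are arbitrary,
// but every layer's input must equal the previous layer's output.
let mut wide_net = Net::new(
    vec![
        FC::new(3, 16),
        Tanh::new(16),   // hyperbolic tangent activation
        FC::new(16, 16),
        Relu::new(16),   // rectified linear activation
        FC::new(16, 1),
        Sig::new(1),     // sigmoid keeps the final output in (0, 1)
    ],
    1,   // batch size
    0.1, // learning rate
);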

This is how you propagate data through the network:

net.forward_data(&vec![1.0, 0.0, -69.0]); // returns the output vector
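
Because forward_data returns the output vector, you can bind the result and inspect it (the printed number below is illustrative; actual values depend on the random initialization):

let output = net.forward_data(&vec![1.0, 0.0, -69.0]);
println!("{:?}", output); // e.g. [0.53] for an untrained network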

After propagating some data forward, you can backpropagate a target like this:

net.backward_data(&vec![0.0]); // a vector of what you want the network to output

The network automatically stores and applies the gradients, so to train it, all you need to do is repeatedly forward- and backpropagate your data:

for _ in 0..5000 {
    net.forward_data(&vec![1.0, 0.0, 0.0]);
    net.backward_data(&vec![1.0]);

    net.forward_data(&vec![1.0, 1.0, 0.0]);
    net.backward_data(&vec![0.0]);

    net.forward_data(&vec![0.0, 1.0, 0.0]);
    net.backward_data(&vec![1.0]);

    net.forward_data(&vec![0.0, 0.0, 0.0]);
    net.backward_data(&vec![0.0]);
}

// At this point it's trained on this dataset, which is just XOR of the
// first two inputs (pretty useless lol).
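
As a quick sanity check after training, you can re-run the training inputs and compare against the targets (the exact numbers depend on initialization, but each output should approach its target):

let out = net.forward_data(&vec![1.0, 0.0, 0.0]);
println!("prediction for [1, 0, 0]: {:?} (target 1.0)", out);
let out = net.forward_data(&vec![1.0, 1.0, 0.0]);
println!("prediction for [1, 1, 0]: {:?} (target 0.0)", out);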

This is what PyTorch would be like if it weren't needlessly complicated, hahahaha.
