snail_nn

Crates.io: snail_nn
lib.rs: snail_nn
version: 0.1.0
source: src
created_at: 2023-08-01 12:33:31.345347
updated_at: 2023-08-01 12:33:31.345347
description: small neural network library, running on the CPU with parallelized stochastic gradient descent
repository: https://github.com/Lommix/snail_nn
id: 931838
size: 455,454
owner: Lorenz (Lommix)

README

[WIP] Snail NN - smol neural network library

A fully functional neural network library with backpropagation and a parallelized stochastic gradient descent implementation.

Examples

Storing images inside the neural network, upscaling them, and interpolating between them.

cargo run --example imagepol --release

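How this works: the network is trained as a function from pixel coordinates to color, so sampling it on a denser coordinate grid than the source image upscales it. A minimal sketch of that setup, using only the Model/TrainingBatch API from the example code below (the grayscale input, layer sizes and hyperparameters here are illustrative assumptions, not the example's actual code):

use snail_nn::prelude::*;

// Sketch: fit a grayscale image as f(x, y) -> brightness.
// `pixels` holds row-major values in 0.0..1.0 (image loading not shown).
fn fit_image(pixels: &[f64], width: usize, height: usize) -> Model {
    let mut nn = Model::new(&[2, 16, 16, 1]);
    nn.set_activation(Activation::Sigmoid);

    // one training pair per pixel: normalized (x, y) -> brightness
    let mut batch = TrainingBatch::empty(2, 1);
    for y in 0..height {
        for x in 0..width {
            let input = [x as f64 / width as f64, y as f64 / height as f64];
            batch.add(&input, &[pixels[y * width + x]]);
        }
    }

    // minibatch stochastic gradient descent, as in the example below
    for _ in 0..10_000 {
        let (w_gradient, b_gradient) = nn.gradient(&batch.random_chunk(32));
        nn.learn(w_gradient, b_gradient, 1.0);
    }
    nn
}

Upscaling is then just nn.forward(&[x, y]) evaluated on a finer coordinate grid than the source image.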


The mandatory XOR example

cargo run --example xor --release



Example Code:

use snail_nn::prelude::*;

fn main(){
    let mut nn = Model::new(&[2, 3, 1]); // 2 inputs, 3 hidden neurons, 1 output
    nn.set_activation(Activation::Sigmoid);

    let mut batch = TrainingBatch::empty(2, 1);
    let rate = 1.0;

    // AND - training data
    batch.add(&[0.0, 0.0], &[0.0]);
    batch.add(&[1.0, 0.0], &[0.0]);
    batch.add(&[0.0, 1.0], &[0.0]);
    batch.add(&[1.0, 1.0], &[1.0]);

    for _ in 0..10000 {
        let (w_gradient, b_gradient) = nn.gradient(&batch.random_chunk(2));
        nn.learn(w_gradient, b_gradient, rate);
    }

    println!("ouput {:?} expected: 0.0", nn.forward(&[0.0, 0.0]));
    println!("ouput {:?} expected: 0.0", nn.forward(&[1.0, 0.0]));
    println!("ouput {:?} expected: 0.0", nn.forward(&[0.0, 1.0]));
    println!("ouput {:?} expected: 1.0", nn.forward(&[1.0, 1.0]));
}
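
Note that random_chunk(2) draws a random two-sample minibatch from the training data on every iteration, which is what makes the loop stochastic gradient descent; a larger chunk size moves training toward full-batch gradient descent.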

Features

  • Sigmoid, Tanh & Relu activation functions (see the sketch after this list)

  • Parallelized stochastic gradient descent

  • It works on my machine ¯\_(ツ)_/¯

  • Will gobble up most of your CPU
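
A minimal sketch of switching activation functions, assuming the same set_activation API as the example above (Tanh and Relu are the variant names from the feature list):

use snail_nn::prelude::*;

fn main() {
    // same constructor as the AND example; only the activation changes
    let mut nn = Model::new(&[2, 3, 1]);
    nn.set_activation(Activation::Tanh); // or Activation::Relu / Activation::Sigmoid
}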

Todo

  • more examples
  • better documentation
  • compute shaders with wgpu