Crates.io | renplex |
lib.rs | renplex |
version | 0.1.1 |
source | src |
created_at | 2023-09-24 19:22:05.586754 |
updated_at | 2024-06-30 16:22:31.918824 |
description | A library for modeling complex-valued neural networks, built with Rust. |
homepage | https://github.com/Pxdr0-A/renplex.git |
repository | https://github.com/Pxdr0-A/renplex.git |
max_upload_size | |
id | 982141 |
size | 201,735 |
A library built with Rust for modeling complex-valued neural networks.
It is still in the early stages of development but can already be used to build and train architectures based on Multi-Layer Perceptrons and Convolutional Neural Networks.
Add the library to the Cargo.toml of your Rust project:
[dependencies]
renplex = "0.1.1"
Initiate a trainable layer by indicating the precision of the calculations, the number of input and output features, the initialization method, and the activation function.
use renplex::math::Complex;
use renplex::math::cfloat::Cf32;
use renplex::input::IOShape;
use renplex::init::InitMethod;
use renplex::act::ComplexActFunc;
use renplex::cvnn::layer::dense::DenseCLayer;
// seed for the weight initialization
let seed: &mut u128 = &mut 63478262957;
// define a complex number type with 64 bits of precision
// (32 bits each for the real and imaginary parts)
type Precision = Cf32;
// number of scalar input features
let ni = 64;
// number of scalar output features (number of units)
let no = 16;
// input features are scalars (a vector of values)
// for a 2D convolution, input features are matrices (a vector of matrices)
let input_shape = IOShape::Scalar(ni);
// initialization method
let init_method = InitMethod::XavierGlorotU(ni + no);
// complex activation function
let act_func = ComplexActFunc::RITSigmoid;
let dense_layer: DenseCLayer<Precision> = DenseCLayer::init(
input_shape,
no,
act_func,
init_method,
seed
).unwrap();
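The `InitMethod::XavierGlorotU(ni + no)` call above is given the sum of input and output features. In the standard Glorot/Xavier uniform scheme, this fan sum sets the sampling range of the initial weights. A minimal standalone sketch of that bound (an illustration of the general formula, not renplex's internal code):

```rust
// Glorot/Xavier uniform bound: weights are drawn uniformly from
// [-limit, +limit] with limit = sqrt(6 / (fan_in + fan_out)).
// Illustration only; renplex's exact parameterization may differ.

fn glorot_uniform_bound(fan_sum: usize) -> f32 {
    (6.0_f32 / fan_sum as f32).sqrt()
}

fn main() {
    let (ni, no) = (64, 16);
    let limit = glorot_uniform_bound(ni + no);
    println!("uniform init range: [{:.4}, {:.4}]", -limit, limit);
}
```

For `ni = 64` and `no = 16` this gives a range of roughly ±0.274.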
Add layers to a feed-forward network struct by defining an input layer and subsequent hidden layers.
use renplex::math::Complex;
use renplex::math::cfloat::Cf32;
use renplex::opt::ComplexLossFunc;
use renplex::cvnn::layer::CLayer;
use renplex::cvnn::network::CNetwork;
let mut network: CNetwork<Cf32> = CNetwork::new();
// layers are wrapped into the common CLayer<T> interface
// (dense_input_layer is a DenseCLayer initialized as shown above)
network.add_input(dense_input_layer.wrap()).unwrap();
network.add(dense_layer.wrap()).unwrap();
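Each dense layer added above computes a complex weighted sum plus a bias, followed by a complex activation. As a standalone illustration of that computation (not renplex's `DenseCLayer`, and assuming a "split" reading of a real-imaginary-type sigmoid, where the sigmoid acts on the real and imaginary parts independently):

```rust
// Standalone sketch of a single complex dense unit with a split
// real-imaginary sigmoid activation. Illustration only.

#[derive(Clone, Copy, Debug)]
struct C32 { re: f32, im: f32 }

impl C32 {
    fn new(re: f32, im: f32) -> Self { C32 { re, im } }
    // complex multiplication
    fn mul(self, o: C32) -> C32 {
        C32::new(self.re * o.re - self.im * o.im,
                 self.re * o.im + self.im * o.re)
    }
    fn add(self, o: C32) -> C32 {
        C32::new(self.re + o.re, self.im + o.im)
    }
}

fn sigmoid(x: f32) -> f32 { 1.0 / (1.0 + (-x).exp()) }

// split sigmoid: sigmoid applied to real and imaginary parts separately
fn split_sigmoid(z: C32) -> C32 {
    C32::new(sigmoid(z.re), sigmoid(z.im))
}

// one unit: activation(sum_i w_i * x_i + b)
fn dense_unit(w: &[C32], b: C32, x: &[C32]) -> C32 {
    let s = w.iter().zip(x)
        .fold(b, |acc, (wi, xi)| acc.add(wi.mul(*xi)));
    split_sigmoid(s)
}

fn main() {
    let w = [C32::new(0.3, -0.1), C32::new(0.2, 0.4)];
    let b = C32::new(0.1, 0.0);
    let x = [C32::new(1.0, 0.0), C32::new(0.0, 1.0)];
    println!("unit output: {:?}", dense_unit(&w, b, &x));
}
```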
Renplex provides a simple dataset interface for building a batch of data with independent and dependent variables.
use renplex::math::Complex;
use renplex::math::cfloat::Cf32;
use renplex::dataset::Dataset;
// independent variable type
type XFeatures = Cf32;
// dependent variable type
type YFeatures = Cf32;
// initialize a batch of data
let mut data_batch: Dataset<XFeatures, YFeatures> = Dataset::new();
// collect a unique batch of data points
// (in any order: sequential, randomized, ...)
for _ in 0..batch_size {
// collect data points from a file
let x = ...;
let y = ...;
let data_point = (x, y);
// add point to the dataset
data_batch.add_point(data_point);
}
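The loop above can be mirrored with a hypothetical minimal `Dataset` type to make the pattern concrete. This is a standalone sketch with synthetic data, not renplex's actual `Dataset` implementation:

```rust
// Hypothetical minimal batch container mirroring the new()/add_point()
// pattern above. Illustration only.

struct Dataset<X, Y> {
    points: Vec<(X, Y)>,
}

impl<X, Y> Dataset<X, Y> {
    fn new() -> Self { Dataset { points: Vec::new() } }
    fn add_point(&mut self, point: (X, Y)) {
        self.points.push(point);
    }
    fn len(&self) -> usize { self.points.len() }
}

fn main() {
    let batch_size = 4;
    let mut batch: Dataset<f32, f32> = Dataset::new();
    for i in 0..batch_size {
        // synthetic data in place of points read from a file
        let x = i as f32;
        let y = 2.0 * x;
        batch.add_point((x, y));
    }
    assert_eq!(batch.len(), batch_size);
}
```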
Calculate performance metrics and train a CVNN with the fully complex back-propagation algorithm.
use renplex::math::Complex;
use renplex::math::cfloat::Cf32;
use renplex::opt::ComplexLossFunction;
// define loss function
let loss_func = ComplexLossFunction::Conventional;
// history of loss function values
let mut loss_vals = Vec::new();
// define a learning rate
let learning_rate = Cf32::new(1.0, 0.0);
// calculate the initial loss for the batch of data
let loss = network.loss(
data_batch,
&loss_func
).unwrap();
// add loss value to history
// (for optimization algorithms for instance)
loss_vals.push(loss);
// train 1 batch of data
network.gradient_opt(
data_batch,
&loss_func,
learning_rate
).unwrap();
// this pipeline can be repeated to complete an epoch,
// and then repeated again for as many epochs as chosen
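To make the evaluate-loss-then-update cycle concrete, here is a toy standalone example of gradient descent on a single complex parameter, using the Wirtinger gradient of a squared-error loss. The complex type, `train` function, and problem are all illustrative; this is not renplex's back-propagation code:

```rust
// For L(w) = |w*x - y|^2, the Wirtinger gradient with respect to
// conj(w) is (w*x - y) * conj(x); gradient descent steps w against it.
// Toy illustration with a hand-rolled complex type.

#[derive(Clone, Copy)]
struct C32 { re: f32, im: f32 }

impl C32 {
    fn new(re: f32, im: f32) -> Self { C32 { re, im } }
    fn mul(self, o: C32) -> C32 {
        C32::new(self.re * o.re - self.im * o.im,
                 self.re * o.im + self.im * o.re)
    }
    fn sub(self, o: C32) -> C32 { C32::new(self.re - o.re, self.im - o.im) }
    fn conj(self) -> C32 { C32::new(self.re, -self.im) }
    fn scale(self, s: f32) -> C32 { C32::new(self.re * s, self.im * s) }
    fn abs2(self) -> f32 { self.re * self.re + self.im * self.im }
}

fn train(x: C32, y: C32, lr: f32, steps: usize) -> (C32, f32) {
    let mut w = C32::new(0.0, 0.0);
    let mut loss = 0.0;
    for _ in 0..steps {
        let err = w.mul(x).sub(y);     // w*x - y
        loss = err.abs2();             // |w*x - y|^2
        let grad = err.mul(x.conj());  // Wirtinger gradient
        w = w.sub(grad.scale(lr));     // descent step
    }
    (w, loss)
}

fn main() {
    // target parameter: w = 1 + 2i, since y = (1 + 2i) * x
    let x = C32::new(0.5, -0.3);
    let y = C32::new(1.0, 2.0).mul(x);
    let (w, loss) = train(x, y, 0.5, 200);
    println!("w = {} + {}i, loss = {}", w.re, w.im, loss);
}
```

The iteration shrinks the error by a factor of (1 - lr * |x|^2) per step, so with this learning rate the parameter converges to 1 + 2i.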
Forward signals through a CVNN and inspect intermediate features.
use renplex::input::IOType;
use renplex::math::cfloat::Cf32;
// input features must match the network's precision type
let input_point = IOType::Scalar(vec![
  Cf32::new(0.22, 0.0),
  Cf32::new(0.17, 0.0),
  Cf32::new(0.13, 0.0)
]);
// output of the networks
let prediction = network
  .forward(input_point)
  .unwrap();
// output features of the second layer of the network
let features = network
.intercept(input_point, 2)
.unwrap();
The repository contains an examples folder with classification.rs and regression.rs files that run the respective pipelines. Note that classification.rs requires the MNIST dataset at the root of the project. To run an example after cloning, use the following command from the root of the project:
cargo run --example <example>
One can also download each file individually from GitHub and run it inside a project with Renplex as a dependency.