logosq-optimizer

version: 0.1.0
created_at: 2026-01-20 20:21:25.781974+00
updated_at: 2026-01-20 20:21:25.781974+00
description: Classical optimizers for variational quantum algorithms
repository: https://github.com/WeaveITMeta/LogosQ-Optimizer
id: 2057484
size: 81,034
owner: Weave (WeaveITMeta)

documentation: https://docs.rs/logosq-optimizer

README

LogosQ Optimizer

Classical optimization algorithms for variational quantum algorithms, providing stable and fast parameter optimization.

Features

  • Adam: Adaptive moment estimation with momentum
  • L-BFGS: Quasi-Newton method for smooth objectives
  • SPSA: Gradient-free stochastic approximation (see the sketch after this list)
  • Natural Gradient: Fisher information-aware optimization
  • GPU acceleration: Optional CUDA support
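
Of these, SPSA is the only gradient-free method: it perturbs every parameter simultaneously along a random ±1 direction and estimates the whole gradient from just two objective evaluations. The sketch below illustrates that estimator in plain Rust; it does not use this crate's API, and the objective, perturbation size, and PRNG are illustrative placeholders.

// Minimal SPSA gradient estimator (illustrative sketch, not this crate's API).
fn spsa_gradient<F: Fn(&[f64]) -> f64>(f: F, params: &[f64], c: f64, seed: &mut u64) -> Vec<f64> {
    // Rademacher (+1/-1) perturbation for every parameter, from a tiny xorshift PRNG.
    let deltas: Vec<f64> = params
        .iter()
        .map(|_| {
            *seed ^= *seed << 13;
            *seed ^= *seed >> 7;
            *seed ^= *seed << 17;
            if *seed & 1 == 0 { 1.0 } else { -1.0 }
        })
        .collect();

    // Perturb all parameters at once: only two objective evaluations are needed,
    // independent of the parameter count.
    let plus: Vec<f64> = params.iter().zip(&deltas).map(|(p, d)| p + c * d).collect();
    let minus: Vec<f64> = params.iter().zip(&deltas).map(|(p, d)| p - c * d).collect();
    let diff = f(plus.as_slice()) - f(minus.as_slice());

    // g_i ≈ (f(θ + cΔ) - f(θ - cΔ)) / (2cΔ_i)
    deltas.iter().map(|d| diff / (2.0 * c * d)).collect()
}

fn main() {
    // Toy quadratic objective standing in for a measured circuit expectation value.
    fn objective(p: &[f64]) -> f64 { p.iter().map(|x| x * x).sum() }

    let params = vec![0.1; 16];
    let mut seed = 0x9E37_79B9_7F4A_7C15_u64;
    let grad = spsa_gradient(objective, &params, 0.01, &mut seed);
    println!("SPSA gradient estimate (first 4): {:?}", &grad[..4]);
}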

Quick Start

use logosq_optimizer::{Adam, Optimizer};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Adam keeps per-parameter moment estimates, so the optimizer must be mutable.
    let mut optimizer = Adam::new()
        .with_learning_rate(0.01)
        .with_beta1(0.9);

    // 16 circuit parameters and their gradients (a constant placeholder here).
    let mut params = vec![0.1; 16];
    let gradients = vec![0.01; 16];

    // One optimization step at iteration 0: parameters are updated in place.
    optimizer.step(&mut params, &gradients, 0)?;
    Ok(())
}
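
A realistic run applies step inside a loop, recomputing gradients each iteration. Below is a minimal sketch of that pattern using only the Adam and Optimizer items from the snippet above; the cost function, the finite-difference gradients, the learning rate, and the exact step signature are illustrative assumptions.

use logosq_optimizer::{Adam, Optimizer};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Toy quadratic cost standing in for a measured circuit expectation value.
    fn cost(p: &[f64]) -> f64 {
        p.iter().map(|x| (x - 0.5).powi(2)).sum()
    }

    let mut optimizer = Adam::new().with_learning_rate(0.05);
    let mut params = vec![0.1; 16];

    for iter in 0..100 {
        // Central finite-difference gradients of the toy cost (a placeholder for
        // parameter-shift or hardware-estimated gradients).
        let eps = 1e-4;
        let gradients: Vec<f64> = (0..params.len())
            .map(|i| {
                let (mut up, mut down) = (params.clone(), params.clone());
                up[i] += eps;
                down[i] -= eps;
                (cost(&up) - cost(&down)) / (2.0 * eps)
            })
            .collect();

        // Update all parameters in place, as in the Quick Start snippet.
        optimizer.step(&mut params, &gradients, iter)?;
    }

    println!("final cost: {:.6}", cost(&params));
    Ok(())
}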

Installation

[dependencies]
logosq-optimizer = "0.1"

License

MIT OR Apache-2.0
