| Field | Value |
|---|---|
| Crates.io | scirs2-optimize |
| lib.rs | scirs2-optimize |
| version | 0.1.0-beta.2 |
| created_at | 2025-04-12 13:32:23.358157+00 |
| updated_at | 2025-09-20 08:39:10.487+00 |
| description | Optimization module for SciRS2 (scirs2-optimize) |
| homepage | |
| repository | https://github.com/cool-japan/scirs |
| max_upload_size | |
| id | 1630922 |
| size | 4,114,083 |
scirs2-optimize is a production-ready optimization library providing comprehensive algorithms for unconstrained and constrained optimization, least-squares problems, root finding, and global optimization. It is a high-performance Rust implementation of SciPy's optimization functionality with an ergonomic API and advanced features.
- Unconstrained Optimization
- Constrained Optimization
- Least Squares Optimization
- Root Finding (a hedged sketch follows the examples below)
- Metaheuristic Algorithms
- Bayesian Optimization
- Multi-objective Optimization
- High Performance Computing
- Automatic Differentiation
- Stochastic Optimization
- Specialized Methods
Add the following to your Cargo.toml:
```toml
[dependencies]
scirs2-optimize = "0.1.0-beta.2"
```
For advanced features, enable optional feature flags:
```toml
[dependencies]
scirs2-optimize = { version = "0.1.0-beta.2", features = ["async"] }
```
```rust
use scirs2_optimize::prelude::*;

// Minimize the Rosenbrock function
fn rosenbrock(x: &[f64]) -> f64 {
    let (a, b) = (1.0, 100.0);
    (a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2)
}

fn main() -> Result<(), OptimizeError> {
    let result = minimize(rosenbrock, &[0.0, 0.0], UnconstrainedMethod::BFGS, None)?;
    println!("Minimum at: {:?} with value: {:.6}", result.x, result.fun);
    Ok(())
}
```
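If gradients are noisy or unavailable, a derivative-free method can be swapped in through the same `minimize` entry point. A minimal sketch, assuming the method enum also exposes a `NelderMead` variant (only `BFGS` is shown elsewhere on this page, so treat the variant name as an assumption):

```rust
use scirs2_optimize::prelude::*;

// Same objective as the quick-start example above.
fn rosenbrock(x: &[f64]) -> f64 {
    let (a, b) = (1.0, 100.0);
    (a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2)
}

fn main() -> Result<(), OptimizeError> {
    // `UnconstrainedMethod::NelderMead` is assumed by analogy with the BFGS
    // variant shown above; check the crate docs for the exact name.
    let result = minimize(rosenbrock, &[0.0, 0.0], UnconstrainedMethod::NelderMead, None)?;
    println!("Derivative-free minimum at: {:?}", result.x);
    Ok(())
}
```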
```rust
use scirs2_optimize::prelude::*;

// Rosenbrock function from the quick-start example
fn rosenbrock(x: &[f64]) -> f64 {
    let (a, b) = (1.0, 100.0);
    (a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2)
}

fn main() -> Result<(), OptimizeError> {
    // Find the global minimum using Differential Evolution
    let bounds = vec![(-5.0, 5.0), (-5.0, 5.0)];
    let result = differential_evolution(rosenbrock, &bounds, None)?;
    println!("Global minimum: {:?}", result.x);
    Ok(())
}
```
```rust
use scirs2_optimize::prelude::*;
use ndarray::Array1;

// Linear regression residuals: y - (b0 + b1 * x)
fn residual(params: &[f64], data: &[f64]) -> Array1<f64> {
    let n = data.len() / 2;
    let (x_vals, y_vals) = data.split_at(n);
    Array1::from_iter((0..n).map(|i| y_vals[i] - (params[0] + params[1] * x_vals[i])))
}

fn main() -> Result<(), OptimizeError> {
    let data = vec![0., 1., 2., 3., 4., 0.1, 0.9, 2.1, 2.9, 10.0]; // last y value is an outlier
    let result = robust_least_squares(
        residual, &[0.0, 0.0], HuberLoss::new(1.0), None, &data, None,
    )?;
    println!("Robust fit: intercept={:.3}, slope={:.3}", result.x[0], result.x[1]);
    Ok(())
}
```
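The method-selection table below also lists `BisquareLoss` for stronger outlier suppression. A minimal sketch of swapping the loss, assuming `BisquareLoss` exposes a `new(tuning_constant)` constructor like `HuberLoss` (only the type name appears on this page, so the constructor is an assumption):

```rust
use scirs2_optimize::prelude::*;
use ndarray::Array1;

// Same residual and data as the Huber example above.
fn residual(params: &[f64], data: &[f64]) -> Array1<f64> {
    let n = data.len() / 2;
    let (x_vals, y_vals) = data.split_at(n);
    Array1::from_iter((0..n).map(|i| y_vals[i] - (params[0] + params[1] * x_vals[i])))
}

fn main() -> Result<(), OptimizeError> {
    let data = vec![0., 1., 2., 3., 4., 0.1, 0.9, 2.1, 2.9, 10.0];
    // `BisquareLoss::new(4.685)` assumes a HuberLoss-style constructor taking
    // a tuning constant; 4.685 is the conventional cutoff for bisquare weights.
    let result = robust_least_squares(
        residual, &[0.0, 0.0], BisquareLoss::new(4.685), None, &data, None,
    )?;
    println!("Bisquare fit: intercept={:.3}, slope={:.3}", result.x[0], result.x[1]);
    Ok(())
}
```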
```rust
use scirs2_optimize::prelude::*;

// Rosenbrock function from the quick-start example
fn rosenbrock(x: &[f64]) -> f64 {
    let (a, b) = (1.0, 100.0);
    (a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2)
}

fn main() -> Result<(), OptimizeError> {
    let space = Space::new(vec![
        Parameter::Real { name: "x".to_string(), low: -5.0, high: 5.0 },
        Parameter::Real { name: "y".to_string(), low: -5.0, high: 5.0 },
    ]);
    let result = bayesian_optimization(rosenbrock, &space, None)?;
    println!("Bayesian optimum: {:?}", result.x);
    Ok(())
}
```
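Root finding appears in the feature list above but has no example on this page. The following is a sketch of how such a call might look, patterned on SciPy's `scipy.optimize.root(method="hybr")`; the `root` function and `RootMethod::Hybr` names used here are hypothetical, not confirmed scirs2-optimize API, so check the crate docs for the real entry point:

```rust
use scirs2_optimize::prelude::*;
use ndarray::Array1;

// System of two nonlinear equations: x^2 + y^2 = 1 and x = y.
fn system(v: &[f64]) -> Array1<f64> {
    Array1::from_vec(vec![v[0].powi(2) + v[1].powi(2) - 1.0, v[0] - v[1]])
}

fn main() -> Result<(), OptimizeError> {
    // `root` and `RootMethod::Hybr` are assumed names for illustration only.
    let result = root(system, &[1.0, 0.0], RootMethod::Hybr, None)?;
    println!("Root at: {:?}", result.x);
    Ok(())
}
```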
- Production Ready
- High Performance
- Intelligent Defaults
- Comprehensive Toolkit
- Scientific Computing Focus
| Problem Type | Recommended Method | Use Case |
|---|---|---|
| Smooth unconstrained | BFGS, L-BFGS | Fast convergence with gradients |
| Noisy/non-smooth | Nelder-Mead, Powell | Derivative-free robust optimization |
| Large-scale | L-BFGS, CG | Memory-efficient for high dimensions |
| Global minimum | DifferentialEvolution, BayesianOptimization | Avoid local minima |
| With constraints | SLSQP, TrustConstr | Handle complex constraint sets |
| Least squares | LevenbergMarquardt | Nonlinear curve fitting |
| With outliers | HuberLoss, BisquareLoss | Robust regression |
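As a concrete instance of the "Large-scale" row, a high-dimensional objective can be minimized through the same `minimize` entry point shown in the quick start. The `UnconstrainedMethod::LBFGS` variant name is assumed by analogy with the `BFGS` variant; the table above only names the method as "L-BFGS":

```rust
use scirs2_optimize::prelude::*;

// Separable test objective in 100 dimensions: sum of x_i^2, minimum at the origin.
fn sphere(x: &[f64]) -> f64 {
    x.iter().map(|v| v * v).sum()
}

fn main() -> Result<(), OptimizeError> {
    let x0 = vec![1.0; 100];
    // `UnconstrainedMethod::LBFGS` is an assumed variant name; verify against
    // the crate docs before use.
    let result = minimize(sphere, &x0, UnconstrainedMethod::LBFGS, None)?;
    println!("Objective at solution: {:.3e}", result.fun);
    Ok(())
}
```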
This project is dual-licensed. You can choose to use either license; see the LICENSE file for details.