| Crates.io | wolfe_bfgs |
| lib.rs | wolfe_bfgs |
| version | 0.1.4 |
| created_at | 2025-07-02 20:14:11.575845+00 |
| updated_at | 2025-07-14 02:50:22.332313+00 |
| description | A dense BFGS optimization algorithm in Rust, with a Strong Wolfe line search and initial Hessian scaling. |
| homepage | |
| repository | https://github.com/SauersML/bfgs |
| max_upload_size | |
| id | 1735577 |
| size | 55,387 |
A pure Rust implementation of the dense BFGS optimization algorithm for unconstrained nonlinear problems. This library provides a powerful solver built upon the principles and best practices outlined in Nocedal & Wright's Numerical Optimization.
This work is a rewrite of the original bfgs crate by Paul Kernfeld.
- The Strong Wolfe line search guarantees that the curvature condition s_k^T y_k > 0 holds, making the algorithm reliable even for challenging functions.
- The solver maintains a dense n x n inverse Hessian approximation, suitable for small- to medium-scale optimization problems.

First, add this to your Cargo.toml:
```toml
[dependencies]
wolfe_bfgs = "0.1.4"
```
Here is an example of minimizing the 2D Rosenbrock function, a classic benchmark for optimization algorithms.
```rust
use wolfe_bfgs::{Bfgs, BfgsSolution};
use ndarray::{array, Array1};

// 1. Define the objective function and its gradient.
// The function must return a tuple: (value, gradient).
let rosenbrock = |x: &Array1<f64>| -> (f64, Array1<f64>) {
    let a = 1.0;
    let b = 100.0;
    let f = (a - x[0]).powi(2) + b * (x[1] - x[0].powi(2)).powi(2);
    let g = array![
        -2.0 * (a - x[0]) - 4.0 * b * (x[1] - x[0].powi(2)) * x[0],
        2.0 * b * (x[1] - x[0].powi(2)),
    ];
    (f, g)
};

// 2. Set the initial guess.
let x0 = array![-1.2, 1.0];

// 3. Configure and run the solver.
let solution = Bfgs::new(x0, rosenbrock)
    .with_tolerance(1e-6)
    .with_max_iterations(100)
    .run()
    .expect("BFGS failed to solve");

println!(
    "Found minimum f({:.3}) = {:.4} in {} iterations.",
    solution.final_point, solution.final_value, solution.iterations
);

// The known minimum is at [1.0, 1.0].
assert!((solution.final_point[0] - 1.0).abs() < 1e-5);
assert!((solution.final_point[1] - 1.0).abs() < 1e-5);
```
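The gradient returned by the closure above follows directly from differentiating the Rosenbrock function (writing x_0, x_1 for the two components, as in the code):

```latex
f(x) = (a - x_0)^2 + b\,(x_1 - x_0^2)^2, \qquad a = 1,\; b = 100,
\qquad
\nabla f(x) =
\begin{pmatrix}
-2(a - x_0) - 4 b x_0 (x_1 - x_0^2) \\
2 b (x_1 - x_0^2)
\end{pmatrix}.
```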
This crate implements the standard, dense BFGS algorithm. It is not a limited-memory (L-BFGS) implementation. The implementation closely follows the methods described in Numerical Optimization (2nd ed.) by Nocedal and Wright:
- The inverse Hessian approximation H_k is updated using the standard formula (Eq. 6.17).
- The Strong Wolfe line search (Algorithm 3.5) includes a zoom phase (Algorithm 3.6) that uses cubic interpolation for efficient refinement.

This library is tested against a suite of standard optimization benchmarks. The results are validated against scipy.optimize.minimize(method='BFGS') via a Python test harness.
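To make the Eq. (6.17) update concrete, here is a minimal, self-contained sketch for the 2x2 case, using plain arrays rather than ndarray. This illustrates the formula, not the crate's actual internals:

```rust
// Sketch of the dense BFGS inverse-Hessian update (Nocedal & Wright, Eq. 6.17):
//   H_{k+1} = (I - rho s y^T) H_k (I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s)

type Vec2 = [f64; 2];
type Mat2 = [[f64; 2]; 2];

fn matvec(m: &Mat2, v: &Vec2) -> Vec2 {
    [
        m[0][0] * v[0] + m[0][1] * v[1],
        m[1][0] * v[0] + m[1][1] * v[1],
    ]
}

fn bfgs_update(h: &Mat2, s: &Vec2, y: &Vec2) -> Mat2 {
    let rho = 1.0 / (y[0] * s[0] + y[1] * s[1]); // requires s^T y > 0
    let mut a = [[0.0; 2]; 2]; // a = I - rho * s y^T
    let mut b = [[0.0; 2]; 2]; // b = I - rho * y s^T
    for i in 0..2 {
        for j in 0..2 {
            let id = if i == j { 1.0 } else { 0.0 };
            a[i][j] = id - rho * s[i] * y[j];
            b[i][j] = id - rho * y[i] * s[j];
        }
    }
    // h_next = a * h * b + rho * s s^T
    let mut ah = [[0.0; 2]; 2];
    for i in 0..2 {
        for j in 0..2 {
            for k in 0..2 {
                ah[i][j] += a[i][k] * h[k][j];
            }
        }
    }
    let mut h_next = [[0.0; 2]; 2];
    for i in 0..2 {
        for j in 0..2 {
            for k in 0..2 {
                h_next[i][j] += ah[i][k] * b[k][j];
            }
            h_next[i][j] += rho * s[i] * s[j];
        }
    }
    h_next
}

fn main() {
    let h = [[1.0, 0.0], [0.0, 1.0]]; // current inverse-Hessian approximation
    let s = [0.5, -0.25];             // step: x_{k+1} - x_k
    let y = [1.0, 0.5];               // gradient change; y^T s = 0.375 > 0
    let h_next = bfgs_update(&h, &s, &y);
    // The update enforces the secant equation H_{k+1} y_k = s_k exactly.
    let hy = matvec(&h_next, &y);
    assert!((hy[0] - s[0]).abs() < 1e-12);
    assert!((hy[1] - s[1]).abs() < 1e-12);
    println!("secant check passed: H_next * y = {:?}, s = {:?}", hy, s);
}
```

The assertion at the end checks the secant equation H_{k+1} y_k = s_k, which the update satisfies exactly whenever the curvature condition y^T s > 0 holds.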
To run the full test suite:
```sh
cargo test --release
```
To run the benchmarks:
```sh
cargo bench
```
This crate is a fork and rewrite of the original bfgs crate by Paul Kernfeld. The new version changes the API and replaces the original grid-search-based line search with an implementation based on the Strong Wolfe conditions.
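For reference, the Strong Wolfe conditions that this style of line search enforces are (Nocedal & Wright, Eqs. 3.7a and 3.7b): a sufficient-decrease condition and a curvature bound on the step length alpha,

```latex
f(x_k + \alpha p_k) \le f(x_k) + c_1 \alpha \nabla f_k^{\top} p_k,
\qquad
\lvert \nabla f(x_k + \alpha p_k)^{\top} p_k \rvert \le c_2 \lvert \nabla f_k^{\top} p_k \rvert,
\qquad 0 < c_1 < c_2 < 1.
```

Commonly used constants are c_1 = 10^{-4} and c_2 = 0.9; the second condition is what guarantees s_k^T y_k > 0 and hence a well-defined BFGS update.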
Licensed under either of:
at your option.