| Crates.io | globalsearch |
| lib.rs | globalsearch |
| version | 0.2.0 |
| created_at | 2025-02-12 18:21:47.039575+00 |
| updated_at | 2025-02-22 15:15:07.206351+00 |
| description | Global optimization with scatter search and local NLP solvers written in Rust using argmin |
| homepage | |
| repository | https://github.com/GermanHeim/globalsearch-rs |
| max_upload_size | |
| id | 1553254 |
| size | 164,540 |
Global optimization with scatter search and local NLP solvers written in Rust
globalsearch-rs: Rust implementation of the OQNLP (OptQuest/NLP) algorithm with the core ideas from "Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization" by Ugray et al. (2007). Combines scatter search metaheuristics with local minimization for global optimization of nonlinear problems.
Similar to MATLAB's GlobalSearch [2], built using argmin, rayon and ndarray.
🎯 Multistart heuristic framework for global optimization
📦 Local optimization using the argmin crate [3]
🚀 Parallel execution of initial stage using Rayon
Add this to your Cargo.toml:
[dependencies]
globalsearch = "0.2.0"
Or use cargo add globalsearch in your project directory.
Install Rust toolchain using rustup.
Clone repository:
git clone https://github.com/GermanHeim/globalsearch-rs.git
cd globalsearch-rs
Build the project:
cargo build --release
Define a problem by implementing the Problem trait.
use ndarray::{array, Array1, Array2};
use globalsearch::problem::Problem;
use globalsearch::types::EvaluationError;
pub struct MinimizeProblem;
impl Problem for MinimizeProblem {
fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
Ok(
..., // Your objective function here
)
}
fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError> {
Ok(array![
..., // Optional: Gradient of your objective function here
])
}
fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError> {
Ok(array![
..., // Optional: Hessian of your objective function here
])
}
fn variable_bounds(&self) -> Array2<f64> {
array![[..., ...], [..., ...]] // Lower and upper bounds for each variable
}
}
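As a concrete illustration of what objective and gradient should compute, here is the two-dimensional Rosenbrock function and its analytic gradient written against plain slices, with no crate dependencies (inside an actual Problem impl you would take Array1 inputs and wrap the results in Ok):

```rust
/// Rosenbrock function: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2.
/// Global minimum f = 0 at (1, 1).
fn rosenbrock(x: &[f64]) -> f64 {
    (1.0 - x[0]).powi(2) + 100.0 * (x[1] - x[0].powi(2)).powi(2)
}

/// Analytic gradient of the Rosenbrock function.
fn rosenbrock_grad(x: &[f64]) -> [f64; 2] {
    [
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0].powi(2)),
        200.0 * (x[1] - x[0].powi(2)),
    ]
}

fn main() {
    let x = [1.0, 1.0];
    println!("{}", rosenbrock(&x)); // 0 at the global minimum
    println!("{:?}", rosenbrock_grad(&x)); // [0.0, 0.0]
}
```

At the minimizer (1, 1) both the objective and every gradient component vanish, which is a quick sanity check worth keeping when you hand-derive gradients.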
Where the Problem trait is defined as:
pub trait Problem {
fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError>;
fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError>;
fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError>;
fn variable_bounds(&self) -> Array2<f64>;
}
Depending on your choice of local solver, you might need to implement the gradient and hessian methods. Learn more about the local solver configuration in the argmin docs or the LocalSolverType.
🔴 Note: Variable bounds are only used in the scatter search phase of the algorithm. The local solver is unconstrained (see argmin issue #137) and can therefore return solutions out of bounds.
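Since the local phase can step outside the bounds, one pragmatic workaround (an assumption on my part, not something the crate does for you) is to clamp a returned solution back into the box described by variable_bounds:

```rust
/// Clamp a point into a box given as (lower, upper) pairs per variable.
/// Sketch of a post-processing step for out-of-bounds local solutions.
fn clamp_to_bounds(x: &mut [f64], bounds: &[(f64, f64)]) {
    for (xi, &(lo, hi)) in x.iter_mut().zip(bounds) {
        *xi = xi.clamp(lo, hi);
    }
}

fn main() {
    let mut x = [3.5, -2.0];
    clamp_to_bounds(&mut x, &[(-1.0, 1.0), (-1.0, 1.0)]);
    println!("{:?}", x); // [1.0, -1.0]
}
```

Clamping changes the objective value, so re-evaluate the clamped point before reporting it; rejecting out-of-bounds solutions outright is an equally valid policy.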
Set OQNLP parameters
use globalsearch::types::{LocalSolverType, OQNLPParams};
use globalsearch::local_solver::builders::SteepestDescentBuilder;
let params: OQNLPParams = OQNLPParams {
iterations: 125,
wait_cycle: 10,
threshold_factor: 0.2,
distance_factor: 0.75,
population_size: 250,
local_solver_type: LocalSolverType::SteepestDescent,
local_solver_config: SteepestDescentBuilder::default().build(),
seed: 0,
};
Where OQNLPParams is defined as:
pub struct OQNLPParams {
pub iterations: usize,
pub wait_cycle: usize,
pub threshold_factor: f64,
pub distance_factor: f64,
pub population_size: usize,
pub local_solver_type: LocalSolverType,
pub local_solver_config: LocalSolverConfig,
pub seed: u64,
}
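To give intuition for threshold_factor, distance_factor and wait_cycle: in the OQNLP scheme of Ugray et al. [1], a merit filter only passes candidates whose objective beats a threshold (which is relaxed after a number of stagnant iterations), and a distance filter rejects candidates too close to already-found local solutions. A rough, self-contained sketch of the two filters (my simplification, not the crate's exact logic):

```rust
/// Merit filter: pass only candidates at least as good as the threshold.
fn merit_filter(f_candidate: f64, threshold: f64) -> bool {
    f_candidate <= threshold
}

/// Relax the threshold after `wait_cycle` iterations without a pass,
/// in the spirit of OQNLP's threshold update.
fn relax_threshold(threshold: f64, threshold_factor: f64) -> f64 {
    threshold + threshold_factor * (1.0 + threshold.abs())
}

/// Distance filter: reject candidates closer than `distance_factor`
/// (Euclidean) to any previously found local solution.
fn distance_filter(candidate: &[f64], known: &[Vec<f64>], distance_factor: f64) -> bool {
    known.iter().all(|s| {
        let d2: f64 = s.iter().zip(candidate).map(|(a, b)| (a - b).powi(2)).sum();
        d2.sqrt() >= distance_factor
    })
}

fn main() {
    println!("{}", merit_filter(0.5, 1.0)); // true: better than threshold
    println!("{}", relax_threshold(1.0, 0.2)); // 1.4: loosened threshold
    // false: within 0.75 of a known solution
    println!("{}", distance_filter(&[0.0, 0.0], &[vec![0.1, 0.0]], 0.75));
}
```

Under this reading, a larger distance_factor spreads local searches further apart, and a larger threshold_factor loosens the merit criterion faster when progress stalls.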
And LocalSolverType is defined as:
pub enum LocalSolverType {
LBFGS,
NelderMead,
SteepestDescent,
TrustRegion,
}
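Gradient-based solver types (LBFGS, SteepestDescent, and typically TrustRegion, which also wants a Hessian) require the gradient method. If deriving it by hand is tedious, a central finite-difference approximation is one possible stand-in (a generic numerical sketch, not a crate feature; note it costs two objective evaluations per dimension):

```rust
/// Central finite-difference gradient of f at x, with step h per coordinate.
fn numerical_gradient<F: Fn(&[f64]) -> f64>(f: F, x: &[f64], h: f64) -> Vec<f64> {
    (0..x.len())
        .map(|i| {
            let mut xp = x.to_vec();
            let mut xm = x.to_vec();
            xp[i] += h;
            xm[i] -= h;
            (f(&xp) - f(&xm)) / (2.0 * h)
        })
        .collect()
}

fn main() {
    // f(x, y) = x^2 + 3y; exact gradient at (2, 5) is (4, 3).
    let f = |x: &[f64]| x[0] * x[0] + 3.0 * x[1];
    println!("{:?}", numerical_gradient(f, &[2.0, 5.0], 1e-6));
}
```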
You can also modify the local solver configuration for each type of local solver. See builders.rs for more details.
Run the optimizer
use globalsearch::local_solver::builders::SteepestDescentBuilder;
use globalsearch::oqnlp::OQNLP;
use globalsearch::types::{LocalSolverType, OQNLPParams, SolutionSet};
fn main() -> Result<(), Box<dyn std::error::Error>> {
let problem = MinimizeProblem;
let params: OQNLPParams = OQNLPParams {
iterations: 125,
wait_cycle: 10,
threshold_factor: 0.2,
distance_factor: 0.75,
population_size: 250,
local_solver_type: LocalSolverType::SteepestDescent,
local_solver_config: SteepestDescentBuilder::default().build(),
seed: 0,
};
let mut optimizer: OQNLP<MinimizeProblem> = OQNLP::new(problem, params)?;
// OQNLP returns a solution set with the best solutions found
let solution_set: SolutionSet = optimizer.run()?;
println!("{}", solution_set);
Ok(())
}
src/
├── lib.rs # Module declarations
├── oqnlp.rs # Core OQNLP algorithm implementation
├── scatter_search.rs # Scatter search component
├── local_solver/
│   ├── builders.rs # Local solver configuration builders
│   └── runner.rs # Local solver runner
├── filters.rs # Merit and distance filtering logic
├── problem.rs # Problem trait
└── types.rs # Data structures and parameters
Distributed under the MIT License. See LICENSE.txt for more information.
[1] Zsolt Ugray, Leon Lasdon, John Plummer, Fred Glover, James Kelly, Rafael Martí (2007) Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization. INFORMS Journal on Computing 19(3):328-340. http://dx.doi.org/10.1287/ijoc.1060.0175
[2] GlobalSearch. The MathWorks, Inc. Available at: https://www.mathworks.com/help/gads/globalsearch.html (Accessed: 27 January 2025)
[3] Kroboth, S. argmin{}. Available at: https://argmin-rs.org/ (Accessed: 25 January 2025)