Crates.io | ratel_bandit |
lib.rs | ratel_bandit |
version | 0.2.3 |
source | src |
created_at | 2020-07-25 16:01:44.805966 |
updated_at | 2024-09-10 20:51:06.60327 |
description | Rust Implementation of a Multi-armed Bandit Simulator |
homepage | |
repository | https://github.com/DanielMorton/ratel |
max_upload_size | |
id | 269516 |
size | 60,994 |
Rust Implementation of a Multi-armed Bandit Simulator
The simulator consists of two components: a Bandit, consisting of multiple arms, each of which dispenses rewards according to a probability distribution, and an Agent, who pulls the arms in an attempt to learn which arm has the highest average reward.
A multi-armed bandit consists of a set of arms, each of which, when pulled, gives a reward according to some probability distribution. Any number of arms is allowed. Five distributions are currently available: Binomial, Gaussian, Exponential, Gamma, and LogNormal. Within those confines, all choices of distribution parameters are valid.
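For illustration, here is a minimal sketch of a bandit whose arms draw rewards from Gaussian distributions. The type and constructor names are hypothetical, not the crate's actual API; the sketch uses the `rand` and `rand_distr` crates for sampling.

```rust
// Hypothetical sketch, not the ratel_bandit API: a bandit whose arms
// each sample rewards from a Gaussian distribution.
use rand::prelude::*;
use rand_distr::{Distribution, Normal};

struct GaussianBandit {
    arms: Vec<Normal<f64>>, // one reward distribution per arm
}

impl GaussianBandit {
    // Each arm is defined by a (mean, standard deviation) pair.
    fn new(params: &[(f64, f64)]) -> Self {
        let arms = params
            .iter()
            .map(|&(mean, std_dev)| Normal::new(mean, std_dev).unwrap())
            .collect();
        GaussianBandit { arms }
    }

    // Pulling an arm samples one reward from its distribution.
    fn pull(&self, arm: usize, rng: &mut impl Rng) -> f64 {
        self.arms[arm].sample(rng)
    }
}
```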
The agent must determine, by some procedure, which bandit arm produces the highest average reward. Three strategies are currently implemented. The greedy algorithm always chooses the arm with the highest estimated average reward. The epsilon-greedy algorithm follows the greedy algorithm most of the time, but chooses a random arm with some small probability. The optimistic algorithm chooses the arm whose estimate has the highest upper bound in some confidence range.
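As a sketch of the epsilon-greedy rule (again with hypothetical names, not the crate's API): with probability epsilon the agent explores a random arm, and otherwise it exploits the arm with the highest running mean estimate.

```rust
// Hypothetical sketch of an epsilon-greedy agent.
use rand::prelude::*;

struct EpsilonGreedyAgent {
    epsilon: f64,        // exploration probability
    estimates: Vec<f64>, // running mean reward per arm
    counts: Vec<u64>,    // number of pulls per arm
}

impl EpsilonGreedyAgent {
    fn choose_arm(&self, rng: &mut impl Rng) -> usize {
        if rng.gen::<f64>() < self.epsilon {
            // Explore: pick a uniformly random arm.
            rng.gen_range(0..self.estimates.len())
        } else {
            // Exploit: pick the arm with the highest current estimate.
            (0..self.estimates.len())
                .max_by(|&a, &b| {
                    self.estimates[a].partial_cmp(&self.estimates[b]).unwrap()
                })
                .unwrap()
        }
    }

    // Incremental update of the running mean for the pulled arm.
    fn update(&mut self, arm: usize, reward: f64) {
        self.counts[arm] += 1;
        self.estimates[arm] += (reward - self.estimates[arm]) / self.counts[arm] as f64;
    }
}
```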
The Game module manages interactions between the Bandit and the Agent. The Agent pulls the Bandit's arms a certain number of times. The Game module records the wins and the rewards for each iteration.
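A game loop in this spirit might look like the following sketch, which reuses the hypothetical `GaussianBandit` and `EpsilonGreedyAgent` types from above and records one reward per iteration.

```rust
// Hypothetical sketch of a game loop: the agent pulls arms for a fixed
// number of iterations while the reward of each pull is recorded.
use rand::thread_rng;

fn run_game(bandit: &GaussianBandit, agent: &mut EpsilonGreedyAgent, iterations: u64) -> Vec<f64> {
    let mut rng = thread_rng();
    let mut rewards = Vec::with_capacity(iterations as usize);
    for _ in 0..iterations {
        let arm = agent.choose_arm(&mut rng);
        let reward = bandit.pull(arm, &mut rng);
        agent.update(arm, reward);
        rewards.push(reward); // one reward recorded per iteration
    }
    rewards
}
```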
To build the simulator, simply run

```
cargo build --release
```
To run the simulator, write a main module with the desired simulation code.
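For example, a hypothetical `main` wiring the sketch types above together might look like this; the real types exported by ratel_bandit may differ.

```rust
// Hypothetical main combining the sketch types above.
fn main() {
    // Three Gaussian arms with different means; the third is best.
    let bandit = GaussianBandit::new(&[(1.0, 1.0), (2.0, 1.0), (3.0, 1.0)]);
    let mut agent = EpsilonGreedyAgent {
        epsilon: 0.1,
        estimates: vec![0.0; 3],
        counts: vec![0; 3],
    };
    let rewards = run_game(&bandit, &mut agent, 10_000);
    let total: f64 = rewards.iter().sum();
    println!("average reward: {}", total / rewards.len() as f64);
}
```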
Then run

```
cargo run --release -- ${PARAMETERS}
```
The simulator is designed for maximum flexibility. For inspiration, or to see how I constructed experiments, see Ratel-Experiments.
This code is compatible with all versions of Rust from 1.32 to 1.75.