mcts
An incredibly easy-to-use library for Monte Carlo Tree Search.
All you need to do is implement the traits mcts::GameState and mcts::Player, and mark the corresponding types in your game with the marker traits mcts::EndStatus and mcts::Action.
Usage
Add the dependency to your Cargo.toml
cargo add mcts
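Or declare the dependency in Cargo.toml by hand; the version below is only a placeholder, so check crates.io for the current release.
[dependencies]
# Placeholder version; replace with the latest published release.
mcts = "0.1"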
To use the library, implement the traits mcts::GameState and mcts::Player, and mark the corresponding types in your game with the marker traits mcts::EndStatus and mcts::Action:
pub trait EndStatus {}

pub trait Action: Eq + Clone {}

pub trait Player<E: EndStatus> {
    fn reward_when_outcome_is(&self, outcome: &E) -> f32;
}

pub trait GameState<P, E, A>
where
    P: Player<E>,
    E: EndStatus,
    A: Action,
{
    fn player(&self) -> P;
    fn end_status(&self) -> Option<E>;
    fn possible_actions(&self) -> Vec<A>;
    fn act(&self, action: &A) -> Self;
}
Here is an example for tic-tac-toe. The implementation of the game module is hidden; see here for the full example.
use game::{Action, EndStatus, Player, TictactoeGame};

impl mcts::EndStatus for EndStatus {}
impl mcts::Action for Action {}

impl mcts::Player<EndStatus> for Player {
    // Reward from this player's point of view: 1 for a win,
    // 0 for a loss, and 0.5 for a tie.
    fn reward_when_outcome_is(&self, outcome: &EndStatus) -> f32 {
        match outcome {
            EndStatus::Win(winner) => {
                if self == winner {
                    1.
                } else {
                    0.
                }
            }
            EndStatus::Tie => 0.5,
        }
    }
}

impl mcts::GameState<Player, EndStatus, Action> for TictactoeGame {
    fn end_status(&self) -> Option<EndStatus> {
        self.end_status
    }

    fn player(&self) -> Player {
        self.player
    }

    // Every empty cell is a legal move.
    fn possible_actions(&self) -> Vec<Action> {
        let mut possible_actions = Vec::new();
        for row in 0..3 {
            for col in 0..3 {
                if !self.occupied(Action(row, col)) {
                    possible_actions.push(Action(row, col));
                }
            }
        }
        possible_actions
    }

    // Applying an action never fails here, because `possible_actions`
    // only yields empty cells.
    fn act(&self, selection: &Action) -> Self {
        self.place(selection).unwrap()
    }
}
Then you can build an mcts::SearchTree and start searching. To reuse the search records for the next move, call mcts::SearchTree::renew(&mut self, selected: &A) to step the tree forward.
use std::rc::Rc;

fn main() {
    let mut game = Rc::new(TictactoeGame::new());
    let mut search_tree = mcts::SearchTree::new(game.clone());

    // Keep searching and stepping the tree forward until the game ends.
    while game.end_status.is_none() {
        let selected = search_tree.search(1000).unwrap();
        search_tree.renew(&selected).unwrap();
        game = search_tree.get_game_state();
        game.draw_board();
    }
}
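The example above lets the engine play both sides. If you want to play against the search yourself, the same renew call can be used to feed your move into the tree so the records gathered so far are still reused. The sketch below is only an assumption about how that would look: it presumes renew accepts any legal action (not just the one returned by search), and read_human_action is a hypothetical helper, not part of the library.

use std::rc::Rc;

fn play_against_engine() {
    let mut game = Rc::new(TictactoeGame::new());
    let mut search_tree = mcts::SearchTree::new(game.clone());

    while game.end_status.is_none() {
        // Engine move: search, then step the tree forward with the result.
        let engine_move = search_tree.search(1000).unwrap();
        search_tree.renew(&engine_move).unwrap();
        game = search_tree.get_game_state();
        game.draw_board();
        if game.end_status.is_some() {
            break;
        }

        // Human move: `read_human_action` is a hypothetical helper that
        // returns one of `game.possible_actions()`.
        let human_move = read_human_action(&game);
        search_tree.renew(&human_move).unwrap();
        game = search_tree.get_game_state();
        game.draw_board();
    }
}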