Progressive Strategies For Monte-Carlo Tree Search
Abstract: Monte-Carlo Tree Search (MCTS) is a new best-first search method guided by the results of Monte-Carlo simulations. In this article, we introduce two progressive strategies for MCTS, called progressive bias and progressive unpruning. They enable the use of relatively time-expensive heuristic knowledge without reducing the speed of the search. Progressive bias directs the search according to heuristic knowledge. Progressive unpruning first reduces the branching factor, and then increases it gradually again. Experiments show that the two progressive strategies significantly improve the playing strength of our Go program Mango. Moreover, the combination of both strategies performs even better on larger board sizes.
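The two strategies described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's exact formulas: the progressive-bias term (a heuristic score divided by the child's visit count, so it dominates early and fades as simulations accumulate) and the unpruning schedule (a threshold sequence after which additional children are considered) are plausible assumptions, and all function names, constants, and the doubling threshold schedule are hypothetical.

```python
import math

def uct_with_progressive_bias(child_wins, child_visits, parent_visits,
                              heuristic, C=1.0, W=1.0):
    """UCT selection value extended with a progressive-bias term.

    The bias term W * heuristic / (child_visits + 1) guides selection
    while the child has few simulations, then fades so that the
    Monte-Carlo statistics take over. The exact form of the bias term
    is an illustrative assumption.
    """
    if child_visits == 0:
        return float("inf")  # visit unvisited children first
    exploitation = child_wins / child_visits
    exploration = C * math.sqrt(math.log(parent_visits) / child_visits)
    bias = W * heuristic / (child_visits + 1)  # progressive bias
    return exploitation + exploration + bias

def select_child(children, parent_visits):
    """Pick the index of the child maximizing the biased UCT value.

    `children` is a list of (wins, visits, heuristic) tuples.
    """
    return max(range(len(children)),
               key=lambda i: uct_with_progressive_bias(
                   children[i][0], children[i][1],
                   parent_visits, children[i][2]))

def num_unpruned(node_visits, k_init=3, threshold=50):
    """Progressive unpruning: search only the k_init highest-rated
    children at first, and unprune one more child each time the node's
    visit count passes the next threshold (here a doubling schedule,
    which is an illustrative choice)."""
    k = k_init
    while node_visits >= threshold:
        k += 1
        threshold *= 2
    return k
```

In this sketch the heuristic knowledge is computed once per node and reused, which is what makes a relatively time-expensive heuristic affordable: its cost is amortized over the many simulations that pass through the node.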
Bibliographic Info: Article provided by World Scientific Publishing Co. Pte. Ltd. in its journal New Mathematics and Natural Computation, Volume 04 (2008), Issue 03.