[Computer-go] Monte-Carlo Simulation Balancing in Practice

David Fotland fotland at smart-games.com
Thu Sep 30 16:54:41 PDT 2010

Very interesting paper.  On 9x9, life and death is critical, since white can
usually win by living in two places.  On 19x19, wall-making moves are more
important.  This might explain why the patterns trained on 9x9 are so
tactical and eye-shape focused, and why they don't work so well on 19x19.

I didn't notice at first that the 19x19 trials were against GNU Go 3.8 level
0.  Why not use level 10, as in the 9x9 testing?  Many Faces wins about 50%
against GNU Go 3.7 level 10, and I would expect the same of Erica.  Is GNU Go
3.8 so much stronger than 3.7?


-----Original Message-----
From: computer-go-bounces at dvandva.org
[mailto:computer-go-bounces at dvandva.org] On Behalf Of Rémi Coulom
Sent: Thursday, September 30, 2010 6:05 AM
To: computer-go at dvandva.org
Subject: [Computer-go] Monte-Carlo Simulation Balancing in Practice


This is the CG'2010 paper Aja wrote with me.

Abstract: Simulation balancing is a new technique to tune parameters of a
playout policy for a Monte-Carlo game-playing program. So far, this
algorithm had only been tested in a very artificial setting: it was limited
to 5x5 and 6x6 Go, and required a stronger external program that served as a
supervisor. In this paper, the effectiveness of simulation balancing is
demonstrated in a more realistic setting. A state-of-the-art program, Erica,
learned an improved playout policy on the 9x9 board, without requiring any
external expert to provide position evaluations. Evaluations were collected
by letting the program analyze positions by itself. The previous version of
Erica learned pattern weights with the minorization-maximization algorithm.
Thanks to simulation balancing, its playing strength was improved from a
winning rate of 69% to 78% against Fuego 0.4.
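For readers unfamiliar with the algorithm: simulation balancing (Silver and Tesauro, 2009) tunes playout-policy parameters so that the *expected* playout outcome from a position matches a target evaluation, rather than making the playout policy itself play strongly.  Below is a minimal toy sketch of the idea, not the paper's implementation: a "playout" is a single softmax move choice, each move has a fixed outcome, and the policy-gradient update pushes the expected outcome toward the target V*.  All names, the toy outcomes, and the constants are illustrative assumptions.

```python
import math, random

random.seed(0)

# Toy setup (hypothetical): picking move i ends the "playout" with outcome Z[i].
Z = [1.0, 0.0, 0.0]      # playout outcomes per move
V_STAR = 0.5             # target evaluation of the position
theta = [0.0, 0.0, 0.0]  # per-move policy weights being tuned

def softmax(w):
    m = max(w)
    e = [math.exp(x - m) for x in w]
    s = sum(e)
    return [x / s for x in e]

def playout():
    """Sample one move from the softmax policy.

    Returns the playout outcome z and the gradient of log pi(move)
    with respect to theta (the standard softmax score function).
    """
    p = softmax(theta)
    r, acc, i = random.random(), 0.0, 0
    for i, pi in enumerate(p):
        acc += pi
        if r < acc:
            break
    grad = [-pj for pj in p]
    grad[i] += 1.0
    return Z[i], grad

ALPHA, M = 0.1, 100
for step in range(500):
    # Two independent batches of playouts, as in simulation balancing:
    # one estimates the value b = E[z], the other estimates the
    # policy gradient g = E[z * grad log pi].
    b = sum(playout()[0] for _ in range(M)) / M
    g = [0.0] * len(theta)
    for _ in range(M):
        z, grad = playout()
        for j in range(len(g)):
            g[j] += z * grad[j] / M
    # Gradient step on the squared error (V* - E[z])^2.
    for j in range(len(theta)):
        theta[j] += ALPHA * (V_STAR - b) * g[j]

# The expected playout outcome should now sit near the target V*,
# even though "always pick move 0" would score higher.
p = softmax(theta)
expected = sum(pi * zi for pi, zi in zip(p, Z))
print(round(expected, 2))
```

The key contrast with strength-maximizing tuning is visible in the update: when the playouts already overshoot the target (b > V*), the policy is pushed *away* from high-outcome moves, which is what keeps the Monte-Carlo evaluations unbiased.  In the paper's setting the targets come from Erica's own deep searches instead of an external expert.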

You can download it from there:
