[Computer-go] Kas Cup - results and prizes

David Fotland fotland at smart-games.com
Sat Aug 11 00:46:19 PDT 2012


I'm happy with MFGO's scaling.  I'm running a scaling test now: 4 threads vs
8 threads, fixed 32K total playouts per move, 19x19, no pondering.  Ideally
the win rate should be 50%, since the total playouts are the same, so any
deviation from 50% measures what the extra parallelism itself costs.  Has
anyone tried this kind of scaling experiment and been willing to share
results?

David
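
For concreteness, here is a minimal sketch of the fixed-budget setup described
above: the same per-move budget is divided evenly over the worker threads, so
the 4-thread and 8-thread runs spend identical total search effort, and any
win-rate shift away from 50% reflects the cost of splitting the work.  The
names (worker, run_one_playout, search_move) are illustrative placeholders,
not MFGO code.

#include <pthread.h>

#define TOTAL_PLAYOUTS 32768        /* 32K playouts per move, as in the test */

struct worker_arg {
    int playouts;                   /* this thread's share of the budget */
    /* ... engine state, RNG seed, per-thread tree, ... */
};

static void *worker(void *p)
{
    struct worker_arg *arg = p;
    for (int i = 0; i < arg->playouts; i++) {
        /* run_one_playout(arg);  one random game plus tree update */
    }
    return NULL;
}

static void search_move(int nthreads)
{
    pthread_t tid[nthreads];
    struct worker_arg args[nthreads];

    for (int t = 0; t < nthreads; t++) {
        args[t].playouts = TOTAL_PLAYOUTS / nthreads;  /* 8K at 4 threads, 4K at 8 */
        pthread_create(&tid[t], NULL, worker, &args[t]);
    }
    for (int t = 0; t < nthreads; t++)
        pthread_join(tid[t], NULL);

    /* merge per-thread statistics and pick the move here */
}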

> -----Original Message-----
> From: computer-go-bounces at dvandva.org [mailto:computer-go-
> bounces at dvandva.org] On Behalf Of Petr Baudis
> Sent: Friday, August 10, 2012 12:47 PM
> To: computer-go at dvandva.org
> Subject: Re: [Computer-go] Kas Cup - results and prizes
> 
> On Fri, Aug 10, 2012 at 09:26:31AM -0700, David Fotland wrote:
> > Because my current approach seems to work just as well (or maybe
> > better), and I haven't had time to code up a shared tree and tune it
> > to validate that assumption.  Chaslot's paper perhaps indicates that
> > not having a shared tree is stronger.  My guess is that they are about
> > the same, so it's not worth the effort to change.
> 
> In Pachi, having a shared tree makes all the difference when scaling up
> to more threads. See the graph (really awful one, sorry, it's old!) at
> 
> 	http://pachi.or.cz/root-vs-shared.png
> 
> If you have some information sharing near the root, I imagine it might
> be similar to Pachi's distributed engine performance (or just slightly
> better). But in our experience that is still far behind the shared tree
> in scaling.
> 
> P.S.: There are two important things: virtual loss (not necessarily 1
> simulation, possibly more) and, mainly, lockless updates. The latter
> also means that sane code should be really easy to modify to use a
> single shared tree instead of multiple trees.
> 
> 				Petr "Pasky" Baudis
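
For reference, a minimal sketch of the virtual-loss and lockless-update idea
Petr describes above, using C11 atomics.  The node layout, the VIRTUAL_LOSS
value, and the function names are assumptions for illustration, not Pachi's
actual code.

#include <stdatomic.h>
#include <stdbool.h>

#define VIRTUAL_LOSS 3              /* may be more than 1 simulation */

struct node {
    atomic_int playouts;            /* simulations through this node */
    atomic_int wins;                /* wins for the player to move */
    /* ... children, move, prior, ... */
};

/* Called while descending the shared tree: count the playout as a loss
 * up front so other threads are steered toward different moves. */
static void add_virtual_loss(struct node *n)
{
    atomic_fetch_add_explicit(&n->playouts, VIRTUAL_LOSS,
                              memory_order_relaxed);
}

/* Called once the playout result is known: take the virtual loss back
 * and record the real outcome.  No lock is taken; the counters may be
 * momentarily inconsistent, which MCTS tolerates. */
static void update_node(struct node *n, bool won)
{
    atomic_fetch_add_explicit(&n->playouts, 1 - VIRTUAL_LOSS,
                              memory_order_relaxed);
    if (won)
        atomic_fetch_add_explicit(&n->wins, 1, memory_order_relaxed);
}

Because the updates never take a lock, the same descend/update path works
whether each thread owns a private tree or all threads walk one shared tree,
which is why code written this way is easy to switch to a single shared tree.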



