[Computer-go] CLOP: Confident Local Optimization for Noisy Black-Box Parameter Tuning

Brian Sheppard sheppardco at aol.com
Sat Sep 10 09:47:36 PDT 2011


Yes, that makes sense. You don't want a Gaussian there.

-----Original Message-----
From: computer-go-bounces at dvandva.org
[mailto:computer-go-bounces at dvandva.org] On Behalf Of Rémi Coulom
Sent: Saturday, September 10, 2011 11:36 AM
To: computer-go at dvandva.org
Subject: Re: [Computer-go] CLOP: Confident Local Optimization for Noisy Black-Box Parameter Tuning


On 10 Sept. 2011, at 17:20, Brian Sheppard wrote:

> I am going through the paper, and there is a point where I do not 
> understand.
> 
> When the weights are recalculated in Algorithm 1, the expression for
> w_k is exp((q_k(x) - m_k) / H * s_k).
> 
> Should the formula have a square? That is,
> exp((q_k(x) - m_k) * (q_k(x) - m_k) / H * s_k)?
> 
> Thanks,
> Brian

No. The idea is that the weight of a sample should be low when it is far
below the mean, not when it is far from the mean. That is to say, samples
whose value is very low according to the regression get a low weight. But
samples whose strength is estimated to be above average keep a full weight
of 1 (because of the "min", the weight can never get above 1).
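
For concreteness, a minimal Python sketch of that weighting rule as I read the thread (the names q_k, m_k, s_k, and H follow Brian's notation from above; grouping the denominator as H * s_k is my assumption, not something confirmed from the paper):

    import math

    def sample_weight(q_k, m_k, s_k, H):
        # Linear, not squared, exponent: only samples whose regression
        # value q_k falls far BELOW the mean m_k get an exponentially
        # small weight. The min(1, ...) cap keeps samples estimated at
        # or above the mean at full weight 1.
        return min(1.0, math.exp((q_k - m_k) / (H * s_k)))

A Gaussian-style squared exponent would penalize samples far above the mean just as much as those far below it, which is exactly what the "min" construction avoids.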

Note, by the way, that since my previous message I have updated the CLOP web site with some data, screenshots, and a link to the computer-chess forum with more discussion of the algorithm:
http://remi.coulom.free.fr/CLOP/

Rémi
_______________________________________________
Computer-go mailing list
Computer-go at dvandva.org
http://dvandva.org/cgi-bin/mailman/listinfo/computer-go
