[Computer-go] New version of CLOP

Rémi Coulom Remi.Coulom at free.fr
Mon Nov 7 06:37:01 PST 2011


On 7 nov. 2011, at 14:07, Brian Sheppard wrote:

> I have also seen runs where many localization iterations are necessary, but
> I did not connect that to overfitting.
> 
> I do see Clop runs that "wander" for an excessive number of trials. That has
> troubled me, because I would like to use Clop for high dimensional tuning,
> but I cannot afford to run 1e7 trials.
> 
> Apart from hardware availability, there are two objections to really long
> runs:
> 
> 	- I am sure to change my program before it finishes the run.
> 	- It might be worse than doing 1e3 runs of 1e4 trials each, using
> smaller parameter families.
> 
> A lot depends on the correlations between parameters. For example, if
> parameters are independent then the potential benefit of tuning all
> concurrently is minimized. If the parameters are intricately related, then
> there could be great benefit in concurrent tuning, since tuning one
> parameter at a time might not see the big picture.
> 
> I am curious: can Clop exploit a covariance matrix to "factor" the samples
> into minimally interdependent sets? For example, in the case of fully
> independent parameters, then Clop could choose x0 by ignoring all
> coordinates but x0 in its sample.
> 
> The challenge is to create a scalable form of dimensionality reduction.
> 
> Brian

My plan is to investigate sparse methods in the future. This paper seems like a good starting point:
http://en.scientificcommons.org/43266410
But I don't expect it to work miracles.

Allowing the user of CLOP to manually indicate groups of parameters that are expected to be correlated might be another approach.
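One way to picture that idea: the user's groups would determine which cross terms the regression keeps, making the Hessian of the fitted model block-diagonal. A hypothetical sketch (this is not an existing CLOP feature; the function name and group format are my invention):

```python
def grouped_quadratic_features(x, groups):
    """Quadratic feature map that keeps a cross term x_i * x_j only when
    i and j belong to the same user-declared group, so the Hessian of the
    fitted quadratic is block-diagonal."""
    # Constant, linear terms, and squares are always included.
    feats = [1.0] + list(x) + [xi * xi for xi in x]
    for g in groups:
        for a in range(len(g)):
            for b in range(a + 1, len(g)):
                feats.append(x[g[a]] * x[g[b]])  # within-group cross term
    return feats

# Four parameters declared as two independent pairs:
# 1 constant + 4 linear + 4 squares + 2 cross terms = 11 features,
# instead of the full model's 1 + 4 + 4 + 6 = 15.
print(len(grouped_quadratic_features([0.1, 0.2, 0.3, 0.4], [[0, 1], [2, 3]])))  # prints 11
```

The saving grows with dimension: cross terms drop from d(d-1)/2 to the sum over groups of g(g-1)/2.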

I tried CLOP on the completely independent case (the Log5 problem) with both independent quadratic regression (i.e., all off-diagonal terms of the Hessian set to zero) and full quadratic regression. The results were presented in this CCC message:
http://www.talkchess.com/forum/viewtopic.php?topic_view=threads&p=423257&t=40237
So, independent regression is not a big win.
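For reference, the two models differ only in which second-order terms they fit. A sketch in Python (my notation, not CLOP's source):

```python
def full_quadratic_features(x):
    """Full quadratic model: constant, linear terms, and all
    d(d+1)/2 second-order terms (squares and cross products)."""
    d = len(x)
    feats = [1.0] + list(x)
    for i in range(d):
        for j in range(i, d):
            feats.append(x[i] * x[j])
    return feats

def independent_quadratic_features(x):
    """Independent quadratic regression: all off-diagonal Hessian terms
    are forced to zero, so only the squares survive among the
    second-order terms."""
    return [1.0] + list(x) + [xi * xi for xi in x]

# With d = 3 parameters: 10 coefficients for the full model, 7 for the
# independent one. The gap widens quadratically with d.
print(len(full_quadratic_features([1.0, 2.0, 3.0])),
      len(independent_quadratic_features([1.0, 2.0, 3.0])))  # prints: 10 7
```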

Right now, one major obstacle to using CLOP in high dimensions is computational cost, because my implementation of quadratic logistic regression is not efficient: it is based on Newton's method. I will try conjugate gradient in the future, as it seems to be a much more efficient approach to logistic regression in high dimensions.
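To illustrate why Newton's method gets expensive, here is a toy sketch in pure Python (my own code, not CLOP's implementation): each Newton step builds and solves a dense linear system over all model coefficients. A full quadratic model in d parameters has 1 + d + d(d+1)/2 coefficients, so the per-step solve grows roughly like O(d^6), whereas a conjugate-gradient step needs only gradient evaluations.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial
    pivoting. In d dimensions this system has O(d^2) unknowns."""
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def newton_fit(xs, ys, iters=15):
    """Fit p(win | x) = sigmoid(w0 + w1*x + w2*x^2) by Newton's method
    (IRLS). Each iteration accumulates the gradient and the 3x3 Fisher
    information matrix, then solves the Newton system."""
    w = [0.0, 0.0, 0.0]
    for _ in range(iters):
        g = [0.0] * 3
        H = [[0.0] * 3 for _ in range(3)]
        for x, y in zip(xs, ys):
            phi = (1.0, x, x * x)
            p = sigmoid(sum(wi * fi for wi, fi in zip(w, phi)))
            for i in range(3):
                g[i] += (y - p) * phi[i]          # gradient of log-likelihood
                for j in range(3):
                    H[i][j] += p * (1 - p) * phi[i] * phi[j]
        step = solve3(H, g)
        w = [wi + si for wi, si in zip(w, step)]
    return w

if __name__ == "__main__":
    random.seed(1)
    # Synthetic tuning data: true win probability sigmoid(0.55 + 3x - 5x^2),
    # a concave quadratic with its maximum at x = 0.3.
    xs = [random.random() for _ in range(4000)]
    ys = [1 if random.random() < sigmoid(0.55 + 3 * x - 5 * x * x) else 0
          for x in xs]
    w = newton_fit(xs, ys)
    print("estimated optimum:", -w[1] / (2 * w[2]))  # should be near 0.3
```

With one parameter the 3x3 solve is trivial; the point is that the matrix side of the system scales as O(d^2), which is exactly what a gradient-only method avoids.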

Rémi
