[Computer-go] Scaling, randomness and long thinking times Part II
3-Hirn-Verlag at gmx.de
Tue Jun 21 10:26:08 PDT 2011
> >> So Valkyria has a hash table which only stores the best move for the
> >> position, with a 64-bit hash (32 bits are not enough!), the log2 of
> >> the number of visits (measuring the quality of the move), and the
> >> log2 of the search depth. Entries in the table are overwritten if the
> >> quality is bad and the depth is deep. There are 4 million entries in
> >> the table, and written to a file its size is 50 MB.
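(The scheme described above can be sketched roughly as follows. This is a minimal, hypothetical reconstruction, not Valkyria's actual code: the class and method names are invented, and the exact replacement rule is a guess at what "overwritten if quality is bad and depth is deep" means.)

```python
import math

TABLE_SIZE = 4_000_000  # about 4 million entries, as stated in the post


class Entry:
    """One slot: the full 64-bit hash, the best move, and coarse
    log2-compressed visit and depth counts."""
    __slots__ = ("key", "best_move", "log2_visits", "log2_depth")

    def __init__(self, key, best_move, visits, depth):
        self.key = key  # store the full 64-bit hash (32 bits collide too often)
        self.best_move = best_move
        self.log2_visits = int(math.log2(max(visits, 1)))
        self.log2_depth = int(math.log2(max(depth, 1)))


class MoveTable:
    """Fixed-size table indexed by hash modulo size; stores only the
    best move per position, not the whole search tree."""

    def __init__(self, size=TABLE_SIZE):
        self.slots = [None] * size

    def store(self, key, best_move, visits, depth):
        new = Entry(key, best_move, visits, depth)
        slot = key % len(self.slots)
        old = self.slots[slot]
        # Hypothetical replacement rule: prefer the entry backed by more
        # visits (higher quality); among equal visits, prefer the
        # shallower (earlier, more reusable) position.
        if (old is None or old.key == key
                or new.log2_visits > old.log2_visits
                or (new.log2_visits == old.log2_visits
                    and new.log2_depth < old.log2_depth)):
            self.slots[slot] = new

    def probe(self, key):
        entry = self.slots[key % len(self.slots)]
        if entry is not None and entry.key == key:
            return entry.best_move
        return None
```

Note that the quoted sizes are self-consistent: 50 MB over 4 million entries is about 12.5 bytes per entry, which fits an 8-byte hash plus a move and two small log2 fields.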
> > Is there a reason to have such a small limit on the hash table size?
> > In computer chess, hash tables are typically in the Gigabyte range.
> Because I use the rest of the memory for the search tree. And it is
> sufficiently large, because an MCTS will not produce as many
> evaluations as a chess program. I do not store all moves, because most
> leaf nodes in the tree will never be visited again.
> Also, I assume that a gigabyte chess table is not stored to a file. Or
> are they? I have never used a chess program, so I do not know.
I do not know either. I only know that performance-oriented users need
PCs with large memory.
> If I play a game on CGOS with an empty table, I think all the nodes
> generated for the entire game will fit in the hash table, if I remember
> correctly.
With my question I had in mind mainly the slow games on LittleGolem,
where the bot gets 12 hours to think about a position.
> In short: I do not think I would get more performance if the tables
> were larger.
Did you check that at 12-hour searches?