github lightvector/KataGo v1.3.3
New Nets, Friendlier Configuration, Faster Model Loading, KGS Support

If you're a new user, don't forget to check out this section for getting started and basic usage!

More Neural Nets!

After an unfortunately-long pause for a large chunk of February, during which KataGo was unable to continue training due to hardware and logistical issues, KataGo's run has resumed!

  • g170-b20c256x2-s2107843328-d468617949 ("g170 20 block s2.11G") - This is the final 20-block net that was used in self-play for KataGo's current run, prior to switching to larger nets. It might be very slightly stronger than the 20 block net in the prior release, "g170 20 block s1.91G".

  • g170-b30c320x2-s1287828224-d525929064 ("g170 30 block s1.29G") - A bigger 30-block neural net! This is one of the larger sizes that KataGo is now attempting to train. Per playout, this net should be noticeably stronger than prior nets, perhaps as much as 140 Elo stronger than "s1.91G". However, at least in the low thousands of playouts, it is not yet as strong per equal compute time. But the run is still ongoing - we'll see how things develop in the coming weeks and months!

  • g170-b40c256x2-s1349368064-d524332537 ("g170 40 block s1.35G") - A 40-block neural net, but with fewer channels than the 30-block net! This is the other of the larger sizes that KataGo is now attempting to train. The same applies here: it should be stronger at equal playouts, but weaker at equal compute for modest amounts of compute.

  • g170e-b20c256x2-s2430231552-d525879064 ("g170e 20 block s2.43G") - We're continuing extended training of the 20-block net on the games generated by the larger nets, even though it is no longer used for self-play. By some rough tests, this net might be around 70 Elo stronger than "s1.91G".

Per playout, and in terms of raw judgment, either the 30-block or 40-block net should be the strongest KataGo net so far, but per compute time, the 20-block extended-training "s2.43G" is likely the strongest net. Extensive testing and comparison has not been done yet though.

The latter three nets are attached below. For the first net, or any other currently-released g170 nets, take a look here: https://d3dndmfyhecmj0.cloudfront.net/g170/neuralnets/index.html

New Model Format

Starting with this release, KataGo is moving to a new model format that is a bit smaller on disk and faster to load, indicated by a new file extension ".bin.gz" instead of ".txt.gz". The new format will NOT work with earlier KataGo versions. However, version 1.3.3 in this release will still be able to load all older models.

If you are using some of the older/smaller nets from this run (for example, the much faster 10 or 15-block extended-training nets) and would like to get ".bin.gz" versions of prior nets, they are also available at: https://d3dndmfyhecmj0.cloudfront.net/g170/neuralnets/index.html

Other Changes this Release

Configuration and user-friendliness

  • There is a new top-level subcommand that can be used to automatically tune and generate a GTP config, editing the rules, thread settings, and memory usage settings within the config for you, based on your preferences: ./katago genconfig -model <NEURALNET>.gz -output <NAME_OF_NEW_GTP_CONFIG>.cfg. Hopefully this helps newer users, or people trying to set up things on behalf of newer users!

  • All the rules-related options in the GTP config can now be replaced with a single line such as rules=chinese, rules=japanese, or rules=tromp-taylor, as demonstrated in gtp_example.cfg. See the documentation for kata-set-rules here for the other possible values, and see here for a formal description of KataGo's full ruleset.

  • katago gtp now has a new argument -override-config KEY=VALUE,KEY=VALUE,... that can be used to specify or override arbitrary values in the GTP config on the command line.

  • The OpenCL version will now detect CPU-based OpenCL devices, so it might run on some CPU-only machines with no GPU.
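
For instance, where a GTP config previously needed several coordinated rule options, a single rules line now suffices. The older option names in this fragment are illustrative, drawn from memory of gtp_example.cfg, so treat them as a sketch rather than an exact diff:

```
# Old style: several individual rule options, e.g.
#   koRule = SIMPLE
#   scoringRule = TERRITORY
#   multiStoneSuicideLegal = false
# New style: one line covers them all
rules = japanese
```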
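
The comma-separated KEY=VALUE syntax of -override-config layers on top of whatever the config file specifies. This is a tiny Python sketch of that parse-and-merge semantics - not KataGo's actual C++ code, and the option names are just examples:

```python
def parse_overrides(arg):
    """Parse an -override-config style string like
    'rules=japanese,numSearchThreads=8' into a dict of strings."""
    overrides = {}
    for pair in arg.split(","):
        # Split each pair on the first '=' only, so values may contain '='.
        key, _, value = pair.partition("=")
        overrides[key.strip()] = value.strip()
    return overrides

# Command-line overrides take precedence over the base config values.
base = {"rules": "tromp-taylor", "numSearchThreads": "4"}
merged = {**base, **parse_overrides("rules=japanese,numSearchThreads=8")}
print(merged["rules"])  # -> japanese
```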

GTP extensions

  • KataGo now supports KGS's GTP extension commands kgs-rules and kgs-time_settings. They can be used to set KataGo's rules to match the rule sets under which KGS games can be played, as well as to configure traditional Japanese-style byo-yomi, which is popular on many online servers. See here for documentation on KataGo's implementation of these commands.

  • Added a kata-raw-nn GTP extension that dumps raw evaluations of KataGo's neural net; documentation is in the usual place.
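
As a rough sketch of how these extensions look on the wire (the argument conventions here are from memory of the KGS GTP extensions and KataGo's docs, and the output is elided, so treat the details as illustrative), a controller might send:

```
kgs-rules japanese
=

kgs-time_settings byoyomi 600 30 5
=

kata-raw-nn 0
= (raw net outputs: win/loss estimates, score, policy, ownership, ...)
```

Here "600 30 5" would mean 600 seconds of main time plus 5 periods of 30-second byo-yomi, and the argument to kata-raw-nn selects a board symmetry to evaluate.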

Misc

  • Added a mild hack to fix instability in some neural nets involving passing near the very end of the game, which could cause the reported value to erroneously fluctuate by a percent or two.

  • For those who run self-play training, a new first argument is required for shuffle_and_export_loop.sh and/or export_model_for_selfplay.sh: a globally unique prefix that distinguishes the models of any given run from those of any other run, ideally including other users' runs. This prefix is displayed in logs, so that if you share your models, users can tell which model came from where.

  • Various other minor changes and cleanups.

Edit (2020-02-28) - fixed a bug where if for some reason you tried to ungzip the .bin.gz model files instead of loading them directly, the raw .bin could not be loaded. Bumped the release tag and updated the executables.
