github LostRuins/koboldcpp v1.14
koboldcpp-1.14


  • Added backwards compatibility for an older version of NeoX with different quantizations
  • Fixed a few scenarios where users may encounter OOM crashes
  • Pulled upstream updates

To use, download and run koboldcpp.exe, a one-file PyInstaller binary.
Alternatively, drag and drop a compatible GGML model onto the .exe, or run it and select the model manually in the popup dialog.

Once the model is loaded, you can connect at the address below (or use the full KoboldAI client):
http://localhost:5001
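Besides the browser UI, the server can be queried programmatically. The sketch below is an assumption-laden example, not official usage: it presumes the KoboldAI-compatible `/api/v1/generate` endpoint with `prompt` and `max_length` fields, which may differ in this early version.

```python
# Hypothetical sketch of calling a running koboldcpp server over HTTP.
# The endpoint path and payload fields are assumptions based on the
# KoboldAI API convention, not confirmed for v1.14.
import json
import urllib.request


def build_request(prompt, max_length=80):
    """Build the JSON payload for the assumed /api/v1/generate endpoint."""
    return {"prompt": prompt, "max_length": max_length}


def generate(prompt, host="http://localhost:5001"):
    """POST a prompt to the server and return the generated text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        host + "/api/v1/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"][0]["text"]
```

Calling `generate("Once upon a time")` would then return the model's continuation, assuming the server is running locally on port 5001.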

For more information, be sure to run the program with the --help flag.

Alternative Options:
A non-AVX2 version is now included in the same .exe; enable it with the --noavx2 flag
Big context too slow? Try the --smartcontext flag to reduce prompt-processing frequency
Run on your GPU using CLBlast with the --useclblast flag for a speedup
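A possible invocation combining these flags might look like the following (the model filename and the device arguments to --useclblast are placeholders; check --help for the exact syntax in this version):

```
koboldcpp.exe --noavx2 --smartcontext --useclblast 0 0 model.bin
```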

Disclaimer: This version includes Cloudflare Insights in the Kobold Lite UI; it was removed in v1.17.
