github LostRuins/koboldcpp v1.25.1
koboldcpp-1.25.1

18 months ago

KoboldCpp Changes:

  • Added a new Failsafe mode, triggered by running with --noavx2 --noblas --nommap, which disables all CPU intrinsics. This allows even ancient devices with no AVX or SSE support to run KoboldCpp, though they will be extremely slow.
  • Fixed a bug in the GUI that selected noavx2 mode incorrectly.
  • Pulled new changes for other non-llama architectures. In particular, the GPT Tokenizer has been improved.
  • Added support for setting the sampler_seed via the /generate API. Please refer to the KoboldAI API documentation for details.
  • Pulled upstream fixes and enhancements, and compile fixes for other architectures.
  • Added more console logging in --debugmode which can now display the context token contents.
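The new sampler_seed option can be exercised with a plain HTTP POST. The sketch below uses only the Python standard library and assumes the default port 5001 and a KoboldAI-style /api/v1/generate endpoint with these field names — consult the KoboldAI API documentation for the authoritative schema.

```python
import json
from urllib import request

# Hypothetical payload: "sampler_seed" pins the sampler RNG so repeated calls
# with the same prompt and settings produce the same output. Field names
# follow the KoboldAI generate schema (verify against the API documentation).
payload = {
    "prompt": "Once upon a time",
    "max_length": 32,
    "sampler_seed": 1234,
}

req = request.Request(
    "http://localhost:5001/api/v1/generate",  # default KoboldCpp port
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# resp = request.urlopen(req)   # uncomment against a running KoboldCpp instance
# print(json.load(resp))
```

Sending the same request twice with an identical sampler_seed should yield identical generations, which is useful for regression-testing prompts.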

Edit: v1.25.1

  • Changed the Python version used for PyInstaller from 3.9 to 3.8. Combined with a change in failsafe mode that avoids PrefetchVirtualMemory, failsafe mode should now work on Windows 7! To use it, run with --noavx2 --noblas --nommap and failsafe mode will trigger.
  • Upgraded CLBlast to 1.6.
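A failsafe-mode launch looks like the following; the model filename is a placeholder — substitute your own ggml file.

```shell
# All three flags must be passed together to trigger failsafe mode.
# "yourmodel.ggml" is a placeholder path, not a bundled file.
koboldcpp.exe --noavx2 --noblas --nommap yourmodel.ggml
```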

Kobold Lite UI Changes:

  • Kobold Lite UI now supports variable streaming lengths (default: 8 tokens). You can set the value by adding ?streamamount=[value] to the URL after launching with --stream (e.g. http://localhost:5001/?streamamount=4).
  • Removed newlines from automatically being inserted into the very start of chat scenarios. The chat regex has been slightly adjusted.
  • The above change was later reverted as it was buggy.
  • Removed the default Alpaca instruction prompt, as it was less useful on newer instruct models. You can still use it by adding it to Memory.
  • Fixed an autosave bug which happened sometimes when disconnecting while using Lite.
  • Greatly improved markdown support.
  • Added drag-and-drop file loading functionality.

To use, download and run koboldcpp.exe, which is a one-file PyInstaller build.
Alternatively, drag and drop a compatible ggml model onto the .exe, or run it and select the model manually in the popup dialog.

Once loaded, you can connect like this (or use the full KoboldAI client):
http://localhost:5001

For more information, be sure to run the program with the --help flag.
This release also includes a zip file containing the libraries and the koboldcpp.py script, for those who prefer not to use the one-file PyInstaller build.
