github LostRuins/koboldcpp v1.0.4
llamacpp-for-kobold-1.0.4



  • Added a script to build standalone PyInstaller .exes, which will be used for all future releases. The llamacpp.dll and llama-for-kobold.py files remain available by cloning the repo and will continue to be updated there.
  • Added token caching for prompts, allowing fast-forwarding through partially duplicated prompts. This makes edits near the end of the previous prompt much faster.
  • Merged improvements from the parent repo.
  • Weights not included.
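Token caching works by reusing the evaluation of whatever leading portion of the new prompt matches the previous one, so only the changed suffix must be reprocessed. A minimal sketch of the idea in Python (illustrative only, not koboldcpp's actual implementation):

```python
# Illustrative prompt token cache: keep the token sequence already evaluated,
# find the longest common prefix with the new prompt, and only re-evaluate
# the tokens past that point.

def common_prefix_len(old_tokens, new_tokens):
    """Length of the shared leading run of two token sequences."""
    n = 0
    for a, b in zip(old_tokens, new_tokens):
        if a != b:
            break
        n += 1
    return n

class PromptCache:
    def __init__(self):
        self.cached_tokens = []

    def tokens_to_evaluate(self, new_tokens):
        """Return only the suffix of new_tokens that needs evaluation."""
        keep = common_prefix_len(self.cached_tokens, new_tokens)
        self.cached_tokens = list(new_tokens)
        return new_tokens[keep:]

cache = PromptCache()
cache.tokens_to_evaluate([1, 2, 3, 4, 5])       # first prompt: all 5 tokens
cache.tokens_to_evaluate([1, 2, 3, 9, 10])      # only the changed tail [9, 10]
```

This is why edits near the end of the previous prompt are cheap: the unchanged prefix is served from the cache.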

To use, download and run llamacpp_for_kobold.exe.
Alternatively, drag and drop a compatible llamacpp quantized model onto the .exe, or run it and select the model manually in the popup dialog.

Once the model is loaded, you can connect at the address below (or use the full KoboldAI client):
http://localhost:5001
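Besides the browser UI, you can also talk to the local server over HTTP. The sketch below builds a generation request using the KoboldAI-style endpoint (`/api/v1/generate`) with `prompt` and `max_length` fields; these are assumptions based on the KoboldAI API convention, so check the repo for the exact fields your version supports.

```python
# Minimal sketch of a request to the local koboldcpp server.
# Assumes the KoboldAI-style /api/v1/generate endpoint; verify against the repo.
import json
import urllib.request

def build_generate_request(prompt, max_length=80,
                           base_url="http://localhost:5001"):
    """Build (but do not send) a POST request for text generation."""
    payload = {"prompt": prompt, "max_length": max_length}
    return urllib.request.Request(
        base_url + "/api/v1/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("Once upon a time,")
# urllib.request.urlopen(req) would send it; run this only with the server up.
```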
