github LostRuins/koboldcpp v1.0.1
llamacpp-for-kobold-1.0.1



  • Bugfixes for OSX; KV caching now allows continuing a previous generation without reprocessing the whole prompt
  • Weights not included.

To use, download, extract, and run (the default port is 5001):
llama_for_kobold.py [ggml_quant_model.bin] [port]

and then you can connect like this (or use the full koboldai client):
https://lite.koboldai.net/?local=1&port=5001
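Putting the steps above together, a typical invocation might look like this (the model filename here is only a placeholder; substitute whatever quantized GGML weights file you have downloaded separately):

```shell
# Run the bundled launcher script with a quantized GGML model.
# "ggml-model-q4_0.bin" is a hypothetical filename; weights are not included.
python llama_for_kobold.py ggml-model-q4_0.bin 5001

# Then open the lite client in a browser, pointing it at the same port:
#   https://lite.koboldai.net/?local=1&port=5001
```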
