github twinnydotdev/twinny v3.0.0

12 months ago
  • Added support for hosted llama.cpp servers
  • Added configuration options for separate FIM and chat completion server endpoints, since a llama.cpp server can only host one model at a time and FIM and chat don't work interchangeably with the same model
  • Some settings have been renamed, but their defaults remain the same
  • Removed support for Deepseek models, as they were causing code smell inside the prompt templates (model support needs improvement)
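
Because each llama.cpp server instance serves a single model, one way to use the separate FIM and chat endpoints is to run two server instances on different ports. A minimal sketch, assuming llama.cpp's example server binary and hypothetical model filenames:

```shell
# Start a llama.cpp server for FIM completions
# (model filename and port are hypothetical examples)
./server -m models/codellama-7b.Q4_K_M.gguf --port 8080 &

# Start a second instance for chat completions
# (again, filename and port are just examples)
./server -m models/mistral-7b-instruct.Q4_K_M.gguf --port 8081 &

# Then point twinny's FIM endpoint at port 8080 and its chat
# endpoint at port 8081 in the extension settings.
```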
