oobabooga/text-generation-webui v1.5

What's Changed

  • Add a detailed extension example and update the extension docs. The example can be found here: example/script.py.
  • Introduce a new chat_input_modifier extension function and deprecate the old input_hijack (see the sketch after this list).
  • Change rms_norm_eps to 5e-6 for llama-2-70b ggml and all other Llama-2 models -- this value reduces the perplexity of the models (see the loading sketch after this list).
  • Remove FlexGen support. It has been made obsolete by its lack of Llama support and by the emergence of llama.cpp and 4-bit quantization. I can add it back if it ever gets updated.
  • Use the dark theme by default.
  • Set the correct instruction template for the model when switching from default/notebook modes to chat mode.
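
The new hook receives both the prompt-side and UI-side versions of the user's chat message. Below is a minimal sketch of an extension script.py using it; it assumes the (text, visible_text, state) signature described in the updated extension docs, while the extension name and the transformation shown are purely illustrative, not the bundled example itself.

```python
# extensions/my_extension/script.py -- illustrative sketch, not the bundled example.
# Uses the new chat_input_modifier hook instead of the deprecated input_hijack.

def chat_input_modifier(text, visible_text, state):
    """
    Called on each chat message before generation.
    `text` is the string inserted into the prompt, `visible_text` is what
    appears in the chat UI, and `state` carries the current generation
    parameters. Return the (possibly modified) pair.
    """
    # Illustrative transformation: tag the prompt-side text without
    # changing what the user sees in the chat window.
    text = f"{text} (routed through my_extension)"
    return text, visible_text
```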

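For reference, the same epsilon can be passed explicitly when loading a 70B GGML model outside the UI. The sketch below is an assumption-laden illustration: it presumes a llama-cpp-python build from this era that still accepted rms_norm_eps and n_gqa as constructor arguments, and the model path is hypothetical.

```python
from llama_cpp import Llama

# Hypothetical model path; adjust to your local GGML file.
llm = Llama(
    model_path="models/llama-2-70b.ggmlv3.q4_K_M.bin",
    n_gqa=8,            # grouped-query attention factor needed for the 70B model
    rms_norm_eps=5e-6,  # the value this release switches to
)
```
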
Bug fixes

  • [extensions/openai] Fixes for: embeddings, tokens, better errors. +Docs update, +Images, +logit_bias/logprobs, +more. by @matatonic in #3122
  • Fix typo in README.md by @eltociear in #3286
  • README updates and improvements by @netrunnereve in #3198
  • Ignore values in training.py which are not string by @Foxtr0t1337 in #3287
