oobabooga/text-generation-webui v1.16


Backend updates

  • Transformers: bump to 4.46.
  • Accelerate: bump to 1.0.
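
For reference, a minimal sketch (assuming a standard pip-based install of the web UI's requirements) to confirm the bumped backends are in place:

```python
# Minimal sketch: confirm the backend versions bumped in this release are installed.
# Assumes the web UI's Python environment is active.
import accelerate
import transformers

print("transformers:", transformers.__version__)  # expected to start with 4.46
print("accelerate:", accelerate.__version__)      # expected to start with 1.0
```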

Changes

  • Add whisper turbo (#6423); a brief usage sketch follows this list. Thanks @SeanScripts.
  • Add RWKV-World instruction template (#6456). Thanks @MollySophia.
  • Minor documentation update: query the CUDA compute capability for the Docker .env (#6469). Thanks @practical-dreamer.
  • Remove lm_eval and optimum from requirements (they don't seem to be necessary anymore).
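
Relatedly, for the Whisper turbo addition: a hedged sketch of loading what is presumably the target checkpoint (openai/whisper-large-v3-turbo) directly with transformers, outside the web UI. The model id and the audio file name are assumptions for illustration only, not necessarily what the web UI's whisper integration uses internally:

```python
# Hedged sketch: transcribe audio with the Whisper turbo checkpoint via transformers.
# "openai/whisper-large-v3-turbo" and "sample.wav" are illustrative assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-large-v3-turbo",
)
result = asr("sample.wav")
print(result["text"])
```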

Bug fixes

  • Fix the llama.cpp loader not being random. Thanks @reydeljuego12345.
  • Fix temperature_last when temperature is not in the sampler priority list (#6439). Thanks @ThisIsPIRI.
  • Make token bans work again on HF loaders (#6488). Thanks @ThisIsPIRI.
  • Fix for systems that have bash in a non-standard directory (#6428). Thanks @LuNeder.
  • Fix the Intel bug described in #6253 (#6433). Thanks @schorschie.
  • Fix locally compiled llama-cpp-python failing to import.
