oobabooga/text-generation-webui v1.1

Changes

  • Bump the bitsandbytes Windows wheel by @jllllll in #3097; --load-in-4bit is now a lot faster (usage sketch after this list)
  • Add support for low VRAM mode in the llama.cpp module by @gabriel-pena in #3076
  • Add links/references to the new multimodal instructblip-pipeline in the multimodal README by @kjerk in #2947
  • Add token authorization for downloading models by @fahadh4ilyas in #3067 (download sketch after this list)
  • Add default environment variable values to docker compose file by @Josh-XT in #3102
  • models/config.yaml: +platypus/gplatty, +longchat, +vicuna-33b, +Redmond-Hermes-Coder, +wizardcoder, +more by @matatonic in #2928
  • Add context_instruct to API. Load default model instruction template … by @atriantafy in #2688 (API example after this list)
  • Chat history download creates more detailed file names by @UnskilledWolf in #3051
  • Disable wandb remote HTTP requests
  • Add Feature to Log Sample of Training Dataset for Inspection by @practicaldreamer in #1711
  • Add ability to load all text files from a subdirectory for training by @kizinfo in #1997 (loading sketch after this list)
  • Add TensorBoard / Weights & Biases integration for training by @kabachuha in #2624
  • Fix the tokenization of raw datasets and improve its efficiency by @Nan-Do in #3035
  • Make training more robust and less error-prone by @FartyPants in #3058
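
The --load-in-4bit flag mentioned above maps onto bitsandbytes 4-bit quantization in the transformers loader. Below is a minimal sketch of that library-level call, not the webui's internal code; the model name and generation settings are placeholders.

```python
# Minimal sketch: load a model in 4-bit via bitsandbytes through transformers.
# The model name below is a placeholder, not taken from the release notes.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "facebook/opt-1.3b"  # placeholder model

quant_config = BitsAndBytesConfig(load_in_4bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=quant_config,
    device_map="auto",  # requires the accelerate package
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```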
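
The token-authorized download corresponds to passing a Hugging Face access token when fetching gated or private repositories. A minimal sketch using huggingface_hub directly, assuming the token is read from an HF_TOKEN environment variable; the repo id and target directory are placeholders, and the exact flag exposed by the webui's download script may differ.

```python
# Minimal sketch: download a gated/private model with an access token.
# HF_TOKEN, the repo id, and local_dir below are assumptions/placeholders.
import os
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="meta-llama/Llama-2-7b-hf",  # placeholder gated repo
    token=os.environ.get("HF_TOKEN"),    # access token from the environment
    local_dir="models/Llama-2-7b-hf",    # target folder inside the webui's models/ dir
)
print(f"Model files downloaded to {local_path}")
```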
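
context_instruct lets an API caller supply the instruction context with the request instead of relying on the server-side template. A hedged sketch against the blocking API of that era, assuming it listens on port 5000; the endpoint path, field names, and response shape are assumptions and may not match your build exactly.

```python
# Hedged sketch: pass context_instruct through the blocking chat API.
# The URL, payload fields, and response shape are assumptions.
import requests

URL = "http://127.0.0.1:5000/api/v1/chat"  # assumed legacy API endpoint

payload = {
    "user_input": "Summarize the v1.1 changes in one sentence.",
    "history": {"internal": [], "visible": []},
    "mode": "instruct",
    "context_instruct": "You are a terse release-notes assistant.",  # instruction context
    "max_new_tokens": 120,
}

response = requests.post(URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json())  # exact response structure varies between builds
```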
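
Loading every text file from a subdirectory for training amounts to globbing a dataset folder and concatenating the results. A minimal sketch, assuming the webui's usual training/datasets location; the subdirectory name is a placeholder.

```python
# Minimal sketch: collect all .txt files under a dataset subdirectory.
# The training/datasets/my_corpus path is an assumed placeholder.
from pathlib import Path

dataset_dir = Path("training/datasets/my_corpus")

texts = [p.read_text(encoding="utf-8") for p in sorted(dataset_dir.glob("**/*.txt"))]
print(f"Loaded {len(texts)} text files, {sum(len(t) for t in texts)} characters total")
```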

Bug fixes

Extensions
