oobabooga/text-generation-webui v1.13

Backend updates

  • llama-cpp-python: bump to 0.2.85 (adds Llama 3.1 support).
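A quick way to confirm the bump landed in your environment (a minimal sketch; it assumes the installed llama_cpp package exposes __version__, as recent llama-cpp-python wheels do):

```python
# Print the installed llama-cpp-python version; 0.2.85 is expected after
# updating to this release (0.2.85 adds Llama 3.1 support).
import llama_cpp

print(llama_cpp.__version__)
```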

UI updates

  • Make compress_pos_emb a float (#6276). Thanks, @hocjordan.
  • Turn n_ctx, max_seq_len, and truncation_length into number inputs rather than sliders, so the context length can be typed manually.
  • Improve the style of headings in chat messages.
  • LaTeX rendering (see the example after this list):
    • Add back single $ for inline equations.
    • Fix rendering for equations enclosed between \[ and \].
    • Fix rendering for multiline equations.
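For reference, chat messages written like the following should now render as math (inline, display, and multiline forms; the equations themselves are just placeholder examples):

```latex
Inline with single dollar signs: the Gaussian integral is
$\int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}$.

Display form enclosed between \[ and \]:
\[ e^{i\pi} + 1 = 0 \]

Multiline equation:
\[
\begin{aligned}
(a + b)^2 &= a^2 + 2ab + b^2 \\
(a - b)^2 &= a^2 - 2ab + b^2
\end{aligned}
\]
```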

Bug fixes

  • Fix saving characters through the UI.
  • Fix instruct mode displaying "quotes" as ""double quotes"".
  • Fix chat sometimes not scrolling down after sending a message.
  • Fix the chat "stop" event.
  • Make --idle-timeout also apply to API requests (see the sketch after this list).
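A minimal sketch of an API call that the fix affects (assumptions: the server was started with --api --idle-timeout 30, and the OpenAI-compatible endpoint is listening on the default port 5000):

```python
# Send a chat completion request to the local OpenAI-compatible API.
# With this fix, --idle-timeout also applies to API usage like this request,
# not just to interactions made through the UI.
import requests

response = requests.post(
    "http://127.0.0.1:5000/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 32,
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])
```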

Other changes

  • Model downloader: improve the progress bar by adding the filename, size, and download speed for each downloaded file.
  • Handle the Llama 3.1 Jinja2 template better by not including its optional "tools" headers (illustrated below).
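To see what rendering the template without tool definitions looks like, here is a hedged illustration using the Hugging Face tokenizer rather than the web UI's own code (assumptions: transformers is installed and you have access to the gated meta-llama/Meta-Llama-3.1-8B-Instruct repository):

```python
# Render the Llama 3.1 chat template without passing any tool definitions.
# When no tools are supplied, the optional tool-calling headers that the
# template can emit are left out of the prompt.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
messages = [{"role": "user", "content": "Hello!"}]

prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```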
