Vali-98/ChatterUI v0.6


v0.6

New feature: Added llama.rn backend to run models natively.

  • Simply download a model to your phone and import it. (Note that importing creates a duplicate of the model file inside the app, so it may be wise to delete the original download to save space.)
  • Load the model to allocate it into memory and start chatting (see the sketch after this list).
  • Prompt processing is very slow; expect a new chat to take a few minutes before the first response appears.
  • llama.rn does not yet have GPU support, so all inference runs on the CPU.
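
The flow above maps onto llama.rn's JavaScript API. Below is a minimal sketch, assuming a model file already copied into app storage; the path, prompt template, and sampling values are illustrative, not the options ChatterUI actually passes.

```ts
import { initLlama } from 'llama.rn'

async function runLocalChat() {
  // Load the imported GGUF model into memory (path and context size are hypothetical).
  const context = await initLlama({
    model: 'file:///path/to/imported/model.gguf',
    n_ctx: 2048,
    use_mlock: true,
  })

  // Run a completion; the callback receives tokens as they stream in.
  const { text } = await context.completion(
    {
      prompt: '### Instruction:\nHello!\n\n### Response:\n',
      n_predict: 256,
      temperature: 0.7,
      stop: ['### Instruction:'],
    },
    (data) => console.log(data.token),
  )

  return text
}
```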

v0.6.1

  • Initial implementation of OpenRouter and Chat Completions; these do not yet respect Instruct formatting (see the sketch after this list).
  • Added System TTS.
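
Chat Completions endpoints take role-tagged messages rather than a single pre-formatted prompt, which is why Instruct templates are bypassed for now. Below is a minimal sketch of such a request against OpenRouter's OpenAI-compatible endpoint; the model id and response handling are assumptions for illustration, not ChatterUI's actual code.

```ts
async function chatCompletion(apiKey: string, userMessage: string): Promise<string> {
  // OpenRouter exposes an OpenAI-style Chat Completions endpoint.
  const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'mistralai/mistral-7b-instruct', // placeholder model id
      messages: [
        { role: 'system', content: 'You are a helpful character.' },
        { role: 'user', content: userMessage },
      ],
    }),
  })

  const json = await res.json()
  return json.choices[0].message.content
}
```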

v0.6.2

  • Implemented alternate greetings.
  • Fixed issues with macro replacement.

v0.6.2a

  • Fixed macro replacement, which was still broken after the v0.6.2 fix.

v0.6.3

  • Optimized context building.
  • Added an ellipsis animation while waiting for a response.
  • Improved startup routine.

v0.6.3a

  • Fixed issues with local inferencing.

v0.6.4

  • Added DynaTemp (dynamic temperature) for the KAI endpoint (see the sketch after this list).
  • Allowed newlines in Instruct presets.
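
DynaTemp adjusts the sampling temperature on the fly rather than keeping it fixed. A hedged sketch of a KoboldAI-style generate request carrying such settings follows; the dynatemp field names follow KoboldCpp's API as commonly documented and are an assumption, not ChatterUI's confirmed payload.

```ts
async function kaiGenerate(baseUrl: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/v1/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      prompt,
      max_length: 256,
      temperature: 1.0,
      dynatemp_range: 0.5, // assumed field: temperature varies within this range around the base value
      dynatemp_exponent: 1.0, // assumed field: shapes how strongly the temperature adapts
    }),
  })

  const json = await res.json()
  return json.results[0].text // KoboldAI-style response shape
}
```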

v0.6.4a

  • Fixed a crash when entering the CharInfo menu.

v0.6.5

  • Bumped llama.rn to version 0.3.0-rc.14.
  • Added a seed value to local inferencing.
