VOLlama v0.2.2


Change log

  • Correctly map num_predict to max_tokens.
  • Important: Set num_predict to a positive value, such as 1024.
  • Parameters (except num_ctx) can be left empty to use the engine’s default values.
  • The system message can be left empty to use the model’s default.
  • You can now specify the num_gpu parameter to set the number of layers loaded onto the GPU for Ollama (see the sketch after this list).
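
For context, here is a minimal sketch of how these options map onto an Ollama request, assuming the official ollama Python client. The model name and the specific values are placeholders, and VOLlama itself may wire these parameters differently internally.

```python
# Illustrative sketch only: option names are Ollama's generation options;
# the model name and values are placeholders, not VOLlama's actual code.
import ollama

response = ollama.chat(
    model="llama3",  # placeholder model name
    messages=[
        # No system message here, so the model's default system prompt applies.
        {"role": "user", "content": "Summarize this paragraph."},
    ],
    options={
        "num_predict": 1024,  # maximum tokens to generate (maps to max_tokens); keep it positive
        "num_ctx": 4096,      # context window size
        "num_gpu": 33,        # number of layers offloaded to the GPU
    },
)
print(response["message"]["content"])
```

Options left out of the options dict fall back to the engine's defaults, which mirrors the behavior described above for empty parameter fields.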

Download
