zilliztech/GPTCache v0.1.20


🎉 Introduction to new functions of GPTCache

  1. Support the temperature parameter, as in the OpenAI API

A non-negative sampling temperature, defaulting to 0.
A higher temperature makes the output more random.
A lower temperature makes the output more deterministic and confident.
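
The parameter is most visible through GPTCache's OpenAI-style adapter. Below is a minimal sketch, assuming the cache has already been initialized and an OpenAI API key is available in the environment; the model name and prompt are placeholders.

from gptcache import cache
from gptcache.adapter import openai

cache.init()              # set up the default cache
cache.set_openai_key()    # reads OPENAI_API_KEY from the environment

answer = openai.ChatCompletion.create(
    model='gpt-3.5-turbo',
    messages=[{'role': 'user', 'content': 'what is github'}],
    temperature=0.7,  # higher temperature -> more random output
)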

  2. Add llama adapter
from gptcache.adapter.llama_cpp import Llama

# Load a local llama.cpp model through the cached adapter and ask a question.
llm = Llama('./models/7B/ggml-model.bin')
question = 'what is github'
answer = llm(prompt=question)

Full Changelog: 0.1.19...0.1.20
