mostlygeek/llama-swap v0.0.4


This release adds support for configuring a custom endpoint to check when the upstream server is ready. The llama.cpp server's /health endpoint is no longer hardcoded as a dependency, so llama-swap should now work with anything that provides an OpenAI-compatible API.
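
As a rough sketch of what this enables, the configuration below points the readiness check at a different path than /health. The model name, launch command, and the `checkEndpoint` field name are illustrative assumptions, not confirmed settings from this release:

```yaml
# config.yaml (illustrative sketch; field names are assumptions)
models:
  "mymodel":
    # command used to launch the upstream OpenAI-compatible server
    cmd: my-server --port 8999
    proxy: http://127.0.0.1:8999
    # custom readiness endpoint instead of a hardcoded /health check
    checkEndpoint: /v1/models
```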

Changelog

  • 6cf0962 Add custom check endpoint
  • 8eb5b7b Add custom check endpoint
  • 5a57688 add .vscode to .gitignore
  • b79b7ef add goreleaser config to limit GOOS and GOARCH builds
