XInTheDark/raycast-g4f v2.5

We now support local inference with Ollama! Ollama lets you run a variety of open-source LLMs locally, exposing them through a local API. To use local inference, you need to have Ollama installed.

If you're interested, learn how to get started here: https://github.com/XInTheDark/raycast-g4f/wiki/Help-page:-Configure-Local-APIs#getting-started-ollama
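
For context, Ollama serves its models over a plain HTTP API on localhost (port 11434 by default), which is what makes providers like this possible. As a rough illustration only (not the extension's actual code), here is a minimal TypeScript sketch of chatting with a locally pulled model; the model name "llama3" is just an example, and assumes you've already run `ollama pull llama3`:

```typescript
// Minimal sketch: query a locally running Ollama server.
// Assumes Ollama's default address (http://localhost:11434)
// and an already-pulled model, e.g. via `ollama pull llama3`.
async function askOllama(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // any model you have pulled locally
      messages: [{ role: "user", content: prompt }],
      stream: false, // return one complete JSON response instead of a stream
    }),
  });
  const data = await response.json();
  return data.message.content; // the assistant's reply text
}

askOllama("Why is the sky blue?").then(console.log);
```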

In this update:

  • Added the "Ollama Local API" provider.
  • The "Configure GPT4Free Local API" command has been renamed to "Configure Local APIs", and it now manages the settings for both G4F and Ollama APIs.

What's Changed

Full Changelog: v2.4...v2.5
