Features
- Ollama Support :)
- Please keep in mind that local LLMs may produce unexpected results. For this version, a smaller context window works best (see the sketch after this list).
- Known issue: the initial response may raise a whitespace error; continue the conversation and it should run. (This will be fixed in the next version.)
- `/stop` or `/s` command: stop an ongoing response.
Other Notes
- Refactored aliases to support all models.
- Anthropic support is limited, as API access is restricted.
- LocalAI support is limited, since Ollama is favored for its simplicity.