🚀 Key Features
📦 LangBot Models Service
You focus on building; we'll handle the models. The LangBot Models service is now live: when you log in to your instance with your LangBot Space account, the available models are populated automatically, so you can get started without any configuration. We have also refactored the model configuration; the new design is more logical and intuitive, and it has been moved into its own dialog to simplify the UI.
Available models: https://space.langbot.app/models

You are free to configure and use models from other sources at any time, and no functionality will be affected. That said, we still encourage you to entrust this service to us and help support the development of the open-source project :)
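For readers wondering what "models from other sources" looks like in practice, here is a minimal, generic sketch of calling a self-hosted or third-party OpenAI-compatible endpoint with the `openai` Python SDK. This is not LangBot's own configuration or code; the base URL, API key, and model name are placeholders you would replace with your provider's values.

```python
# Generic illustration only: any OpenAI-compatible endpoint can serve as
# an alternative model source. Not LangBot's internal API.
from openai import OpenAI  # assumes the `openai` package (>= 1.0) is installed

client = OpenAI(
    base_url="https://your-provider.example.com/v1",  # hypothetical endpoint
    api_key="sk-...",                                  # your provider's key
)

resp = client.chat.completions.create(
    model="your-model-name",  # hypothetical model id from your provider
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```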
What's Changed
- feat: support configurable WeCom API base URL for reverse proxy deployment by @Copilot in #1890
- perf: replace copy button toast notifications with checkmark feedback by @Copilot in #1898
- refactor: model config dialog and introduce LangBot Models service integration by @RockChinQ in #1894
- fix: split Wecom messages exceeding 2048-byte limit by @njzjz in #1901 (see the sketch after this list)
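As a side note on #1901, the sketch below illustrates the general technique behind that fix: splitting a message into chunks that each stay within a byte limit when UTF-8 encoded, without cutting a multi-byte character in half. It is a simplified illustration, not the actual adapter code; the function name and the 2048-byte default are assumptions drawn from the PR title.

```python
def split_by_byte_limit(text: str, limit: int = 2048) -> list[str]:
    """Split text into chunks whose UTF-8 encoding is at most `limit` bytes,
    without breaking a multi-byte character across chunks."""
    chunks: list[str] = []
    current = ""
    for ch in text:
        # If adding this character would exceed the byte limit, start a new chunk.
        if len((current + ch).encode("utf-8")) > limit:
            chunks.append(current)
            current = ch
        else:
            current += ch
    if current:
        chunks.append(current)
    return chunks

# Example: a long CJK string is split into pieces that fit the limit.
parts = split_by_byte_limit("你好" * 2000)
assert all(len(p.encode("utf-8")) <= 2048 for p in parts)
```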
New Contributors
Full Changelog: v4.6.5...v4.7.0