🚀 DeepChat 0.4.5 Officially Released | Redefine Your AI Conversation Experience!
More than a simple chatbot: your natural-language Agent tool 🌟
- Added a request trace/debug feature for LLM messages in developer mode: you can now inspect the actual request parameters sent to providers via a Trace button (bug icon) in the message toolbar (DEV builds only).
- The request preview reconstructs the endpoint, headers, and body, with sensitive information (API keys, tokens, passwords) redacted for safety; a sketch of the idea follows this list.
- UI/UX: TraceDialog supports one-click JSON copy and multiple states (loading/error/not-implemented/success); the trace tooltip and dialog labels are localized.
- Provider support: all OpenAI-compatible providers (and the 23+ providers derived from them) support this debug trace; Anthropic, Gemini, AWS Bedrock, Ollama, and GitHub Copilot do not implement it yet.
- Refactored and standardized the image block data structure; a hypothetical shape is sketched below.
- Updated the release artifact URL handling logic in the release workflow.
- Added a toggle event for the trace debug config and the related backend logic.
- General bug fixes: model config refresh toggle, tool call handling, and more.
- Documentation: added a developer overview of the Trace Request Parameters feature and its usage.
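For readers curious what the redacted request preview looks like, here is a minimal TypeScript sketch of the idea. The function and key names (`redact`, `SENSITIVE_KEYS`) are illustrative assumptions, not DeepChat's actual implementation.

```typescript
// Illustrative sketch only: masking sensitive header/body fields before a
// request preview is displayed. Names are hypothetical, not DeepChat's code.
const SENSITIVE_KEYS = ['authorization', 'api-key', 'x-api-key', 'token', 'password'];

function redact(value: unknown, key?: string): unknown {
  // Mask any value whose key looks sensitive.
  if (key && SENSITIVE_KEYS.includes(key.toLowerCase())) {
    return '***REDACTED***';
  }
  // Recurse into arrays and plain objects so nested secrets are caught too.
  if (Array.isArray(value)) {
    return value.map((item) => redact(item));
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(
        ([k, v]) => [k, redact(v, k)] as [string, unknown]
      )
    );
  }
  return value;
}

// Example preview of an OpenAI-compatible chat completion request.
const preview = {
  endpoint: 'https://api.openai.com/v1/chat/completions',
  headers: { Authorization: 'Bearer sk-...', 'Content-Type': 'application/json' },
  body: { model: 'gpt-4o', messages: [{ role: 'user', content: 'Hello' }] }
};
console.log(JSON.stringify(redact(preview), null, 2));
```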
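As a rough illustration of what "standardized image block data structure" could mean, here is a hypothetical TypeScript shape. The field names are assumptions for illustration only; the real schema introduced in #1082 may differ.

```typescript
// Hypothetical shape of a standardized image block; the real schema from
// PR #1082 may differ. Shown only to illustrate one canonical structure
// replacing ad-hoc per-provider formats.
interface ImageBlock {
  type: 'image';
  mimeType: string;   // e.g. 'image/png'
  // Exactly one of the two sources is expected to be present.
  data?: string;      // base64-encoded payload
  url?: string;       // remote URL or local file reference
  width?: number;
  height?: number;
}

const block: ImageBlock = {
  type: 'image',
  mimeType: 'image/png',
  url: 'https://example.com/screenshot.png'
};
```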
🔥 Why Choose DeepChat?
✅ Business-friendly: open-sourced under the unmodified Apache License 2.0, with no additional restrictions beyond the license.
✅ Works out of the box: minimal configuration to start your intelligent conversations right away.
✅ Extremely flexible: switch models freely and define custom model sources to cover your diverse conversation and exploration needs.
✅ Great experience: LaTeX formula rendering, code highlighting, and Markdown support; model conversations have never been this smooth.
✅ Constantly evolving: we listen to user feedback and iterate continuously to bring you an even better AI conversation experience.
📥 Experience the Future Now
💬 Feedback is rewarded: submit your valuable suggestions, join the VIP user community, and help shape the future of DeepChat with us!
What's Changed
- feat(backup): breaking change: redesign data backup and import by @zerob13 in #1052
- fix(settings): await stored models retrieval from provider by @hllshiro in #1056
- feat: add nowledge mem mcp by @zerob13 in #1054
- chore: upgrade vue-renderer-markdown by @Simon-He95 in #1058
- feat: add empty system prompt option by @zerob13 in #1057
- chore: clean up and remove unused deps by @Simon-He95 in #1059
- feat: refactor download functionality and improve artifact handling by @Simon-He95 in #1060
- refactor(thread): extract conversation lifecycle by @zerob13 in #1061
- fix: restore MCP permission approval flow by @zerob13 in #1064
- fix: custom prompt sync after setting by @zerob13 in #1065
- feat: support MCP sampling requests by @zerob13 in #1062
- chore: add oxfmt as prettier plugin by @zerob13 in #1066
- feat: enhance category index management in MentionList component by @Simon-He95 in #1068
- chore: upgrade vue-renderer-markdown to fix mathematical formula matrix rendering by @Simon-He95 in #1069
- feat: add loading status for mcp toggle in chatinput by @zerob13 in #1070
- perf: remove unneeded renderMarkdown wrapper by @Simon-He95 in #1071
- refactor: clean up and remove unused code by @Simon-He95 in #1075
- fix: #1067 shortcut failed by @zerob13 in #1074
- fix: custom provider add refresh-model by @zerob13 in #1079
- fix: add tool call context for better conv by @zerob13 in #1081
- fix: update `tag_name` for release artifact urls by @chenrui333 in #1084
- refactor: standardize image block data structure by @Dw9 in #1082
- feat: add request trace for llm by @zerob13 in #1085
- fix: toggle model config refresh by @zerob13 in #1086
New Contributors
- @chenrui333 made their first contribution in #1084
Full Changelog: v0.4.3...v0.4.5