github InternLM/lmdeploy v0.0.10
LMDeploy Release V0.0.10


What's Changed

🐞 Bug fixes

  • Fix a side effect introduced by CodeLlama support: `sequence_start` was always true when calling `model.get_prompt` by @lvhan028 in #466
  • Fix the missing meta instruction of the internlm-chat model by @lvhan028 in #470
  • Fix a race condition by @akhoroshev in #460
  • Fix compatibility issues with Pydantic 2 by @aisensiy in #465
  • Fix benchmark serving being unable to use the Qwen tokenizer by @AllentDan in #443
  • Fix a memory leak by @lvhan028 in #488
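To illustrate the `sequence_start` and meta-instruction fixes (#466, #470): in a multi-turn chat template, the meta (system) instruction should be emitted only on the first turn of a session, so a `sequence_start` flag that is always true would repeat it on every turn. The sketch below is a hypothetical `ChatTemplate` class illustrating that pattern under stated assumptions; it is not lmdeploy's actual implementation.

```python
# Hypothetical sketch of the sequence_start pattern, not lmdeploy's real code.
class ChatTemplate:
    def __init__(self, meta_instruction="You are a helpful assistant."):
        self.meta_instruction = meta_instruction

    def get_prompt(self, prompt, sequence_start=True):
        # Prepend the meta instruction only when a new session begins;
        # later turns in the same session must not repeat it.
        if sequence_start:
            return (f"<system>{self.meta_instruction}</system>"
                    f"<user>{prompt}</user>")
        return f"<user>{prompt}</user>"

template = ChatTemplate()
first_turn = template.get_prompt("hi", sequence_start=True)
next_turn = template.get_prompt("and then?", sequence_start=False)
```

Passing `sequence_start=False` on follow-up turns keeps the system prompt out of the continuation, which is the behavior the fix in #466 restores.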

Full Changelog: v0.0.9...v0.0.10
