vllm v0.2.0 (PyPI)


Major changes

  • Up to 60% performance improvement from optimized de-tokenization and sampling
  • Initial support for AWQ (performance not optimized)
  • Support for RoPE scaling and LongChat
  • Support for Mistral-7B
  • Many bug fixes
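RoPE scaling, which enables long-context models such as LongChat, extends a model's context window by interpolating position indices so that longer sequences map into the position range the model was trained on. A minimal sketch of linear position interpolation, independent of vLLM's internals (the function name and dimensions here are illustrative, not vLLM API):

```python
import math

def rope_angles(position, dim=8, base=10000.0, scale=1.0):
    # Rotary-embedding angles for a single position. A `scale` > 1
    # compresses positions, so a longer sequence stays within the
    # position range seen during training (linear RoPE scaling).
    pos = position / scale
    return [pos / (base ** (2 * i / dim)) for i in range(dim // 2)]

# With scale=4, position 4000 produces the same angles the model
# originally saw at position 1000 -- inside the trained range.
assert rope_angles(4000, scale=4.0) == rope_angles(1000, scale=1.0)
```

The same idea applies per attention head inside the model; vLLM exposes it through the model's RoPE-scaling configuration rather than a standalone function like this.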

What's Changed

New Contributors

Full Changelog: v0.1.7...v0.2.0
