Highlights
- We fixed a configuration incompatibility between vLLM (which was tested against a pre-release version of the checkpoint) and the published Meta Llama 3.1 weights (#6693); see the sketch below.
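
To illustrate the kind of mismatch involved: the published Llama 3.1 config.json ships a `rope_scaling` block with more fields than the legacy `{"type", "factor"}` layout, and it names the key `rope_type`. The minimal sketch below shows how such a config could be normalized; the `normalize_rope_scaling` helper is hypothetical and is not vLLM's actual fix, though the Llama 3.1 field names match the published config.

```python
from typing import Optional


def normalize_rope_scaling(rope_scaling: Optional[dict]) -> Optional[dict]:
    """Hypothetical helper: accept both the legacy {"type", "factor"}
    rope_scaling layout and the Llama 3.1 layout, which adds extra
    fields and uses the key "rope_type" instead of "type"."""
    if rope_scaling is None:
        return None
    scaling = dict(rope_scaling)  # avoid mutating the caller's config
    if "rope_type" not in scaling and "type" in scaling:
        # Older checkpoints used "type"; map it to the newer key.
        scaling["rope_type"] = scaling.pop("type")
    return scaling


# Shape of the rope_scaling block in the published Llama 3.1 config.json:
llama31_rope_scaling = {
    "factor": 8.0,
    "low_freq_factor": 1.0,
    "high_freq_factor": 4.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3",
}
print(normalize_rope_scaling(llama31_rope_scaling)["rope_type"])  # -> "llama3"
```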
What's Changed
- [Docs] Announce llama3.1 support by @WoosukKwon in #6688
- [doc][distributed] fix doc argument order by @youkaichao in #6691
- [Bugfix] Fix a log error in chunked prefill by @WoosukKwon in #6694
- [BugFix] Fix RoPE error in Llama 3.1 by @WoosukKwon in #6693
- Bump version to 0.5.3.post1 by @simon-mo in #6696
Full Changelog: v0.5.3...v0.5.3.post1