Detailed Changelog: v0.12.1...v0.13.0
Overview
BentoML 0.13.0 is here! This release is packed with new features and important bug fixes. We encourage all users to upgrade.
❤️ Contributors
Thanks to @aarnphm @andrewsi-z @larme @gregd33 @bojiang @ssheng @henrywu2019 @yubozhao @jack1902 @illy @sencenan @parano @soeque1 @elia-secchi @Shumpei-Kikuta @StevenReitsma @dsherry @AnvithaGadagi @joaquincabezas for the contributions!
📢 Breaking Changes
- Configuration revamp
  - The `bentoml config` CLI command has been fully deprecated in this release
  - A new config system was introduced for configuring the BentoML API server, Yatai, tracing, and more (#1543, #1595, #1615, #1667)
  - Documentation: https://docs.bentoml.org/en/latest/guides/configuration.html
  - Add `--do-not-track` CLI option and environment variable (#1534)
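For illustration, a minimal configuration sketch is shown below; the key names and the `BENTOML_CONFIG` environment variable are assumptions to be checked against the configuration guide linked above, not something specified in these release notes.

```yaml
# bentoml_configuration.yml: a hedged sketch of the new YAML-based config.
# Key names are assumptions; see the configuration guide for the full schema.
api_server:
  port: 5000    # port the BentoML API server listens on
  workers: 2    # number of API server worker processes
```

Assuming the mechanism described in the guide, the file would be picked up by pointing the `BENTOML_CONFIG` environment variable at it before starting the API server.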
- Deprecated the --enable-microbatch flag
  - Use the `@api(batch=True|False)` option to choose between a micro-batch enabled API and a non-batch API (see the sketch below)
  - For an API defined in batch mode that needs to serve online traffic without batching behavior, use `--mb-max-batch-size=1` instead
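A minimal sketch of the per-API batch option that replaces the server-level flag; the service and artifact names (IrisClassifier, "model") are illustrative, not from this release.

```python
# Minimal sketch: micro-batching is now chosen per API via batch=True|False
# instead of the removed --enable-microbatch server flag.
# IrisClassifier and the "model" artifact are illustrative names.
import bentoml
from bentoml.adapters import DataframeInput, JsonInput
from bentoml.frameworks.sklearn import SklearnModelArtifact


@bentoml.env(infer_pip_packages=True)
@bentoml.artifacts([SklearnModelArtifact("model")])
class IrisClassifier(bentoml.BentoService):

    # batch=True: the adaptive micro-batching layer groups concurrent requests,
    # so `df` contains a batch of rows and a list-like result is returned
    @bentoml.api(input=DataframeInput(), batch=True)
    def predict(self, df):
        return self.artifacts.model.predict(df)

    # batch=False: the handler is invoked once per request
    @bentoml.api(input=JsonInput(), batch=False)
    def predict_one(self, parsed_json):
        return self.artifacts.model.predict([parsed_json["features"]]).tolist()
```

To serve a batch-mode API without batching behavior, start the server with the flag noted above, for example `bentoml serve IrisClassifier:latest --mb-max-batch-size=1`.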
🎉 New Features
- GPU Support
  - GPU serving guide: https://docs.bentoml.org/en/latest/guides/gpu_serving.html
  - Added docker base image optimized for GPU serving (#1653); a hedged sketch follows below
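The sketch below shows one way a service might opt into a GPU-enabled base image; the image tag, the PyTorch artifact, and the service name are assumptions, so check the GPU serving guide above for the supported setup.

```python
# Hedged sketch: selecting a GPU-enabled docker base image for the built Bento.
# The image tag and service/artifact names are assumptions, not from the release notes.
import bentoml
import torch
from bentoml.adapters import JsonInput
from bentoml.frameworks.pytorch import PytorchModelArtifact


@bentoml.env(
    pip_packages=["torch"],
    docker_base_image="bentoml/model-server:0.13.0-py37-gpu",  # assumed tag
)
@bentoml.artifacts([PytorchModelArtifact("net")])
class GpuClassifier(bentoml.BentoService):

    @bentoml.api(input=JsonInput(), batch=False)
    def predict(self, parsed_json):
        # Run inference on the GPU when one is available inside the container
        device = "cuda" if torch.cuda.is_available() else "cpu"
        model = self.artifacts.net.to(device)
        inputs = torch.tensor(parsed_json["instances"], dtype=torch.float32, device=device)
        with torch.no_grad():
            return model(inputs).cpu().tolist()
```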
- Add support for EvalML (#1603)
- Add support for ONNX-MLIR models (#1545)
- Add full CORS support for the Bento API server (#1576)
- Monitoring with Prometheus guide (see the scrape-config sketch after this list)
- Optimize BentoML import delay (#1608)
- Support upload/download for Yatai backed by local file system storage (#1586)
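Referenced from the monitoring item above, here is a minimal Prometheus scrape-config sketch for the API server's /metrics endpoint; the job name and the target port are assumptions, and the monitoring guide covers the full setup.

```yaml
# prometheus.yml: minimal scrape job for a locally running BentoML API server.
# The target port (5000) is the assumed default; adjust to your deployment.
scrape_configs:
  - job_name: bentoml-api-server
    metrics_path: /metrics
    static_configs:
      - targets: ["localhost:5000"]
```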
🐞 Bug Fixes and Other Changes
- Fix StringInput with batch=True API (#1581)
- Fix docs.json link in the API server UI (#1633)
- Fix uploading to remote paths (#1601)
- Fix missing labels after uploading a Bento to remote Yatai (#1598)
- Fix /metrics endpoints with serve-gunicorn (#1666)
- Upgrade conda to 4.9.2 in the default docker base image (#1525)
- Internal: