New Features
Detailed Changelog: v0.10.1...v0.11.0
Interactively start and stop Model API Server during development
A new API was introduced in 0.11.0 for users to start and test an API server while developing their BentoService class:
```python
import requests

service = MyPredictionService()
service.pack("model", model)

# Start an API model server in the background
service.start_dev_server(port=5000)

# Send a test request to the server, or open the URL in a browser
requests.post("http://localhost:5000/predict", data=review, headers=headers)

# Stop the dev server
service.stop_dev_server()

# Modify code and repeat ♻️
```
Here's an example notebook showcasing this new feature.
More PyTorch eco-system Integrations
Logging is fully customizable now!
Users can now use a single YAML file to customize all logging behavior in BentoML, including the prediction logs and feedback logs:
https://docs.bentoml.org/en/latest/guides/logging.html
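As a rough illustration, a custom logging YAML in Python's standard dictConfig style might look like this (the formatter, handler, and logger names below are illustrative, not BentoML's actual defaults; see the guide above for the exact schema):

```yaml
version: 1
formatters:
  console:
    format: "[%(asctime)s] %(levelname)s - %(message)s"
handlers:
  console:
    class: logging.StreamHandler
    formatter: console
    level: INFO
loggers:
  bentoml:
    handlers: [console]
    level: INFO
    propagate: false
```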
Two new configs are also introduced for quickly turning on/off console logging and file logging:
https://github.com/bentoml/BentoML/blob/v0.11.0/bentoml/configuration/default_bentoml.cfg#L29
```ini
[logging]
console_logging_enabled = true
file_logging_enabled = true
```
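These toggles live in a standard INI-format config file, so as a quick sanity check you can parse the section above with Python's stdlib configparser (a minimal sketch; the inline string stands in for the real default_bentoml.cfg):

```python
import configparser

# Parse the [logging] section shown above and read the two boolean toggles.
cfg = configparser.ConfigParser()
cfg.read_string("""
[logging]
console_logging_enabled = true
file_logging_enabled = true
""")

print(cfg.getboolean("logging", "console_logging_enabled"))  # True
print(cfg.getboolean("logging", "file_logging_enabled"))     # True
```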
If you are not sure how this config works, there is a new guide explaining BentoML's configuration system: https://docs.bentoml.org/en/latest/guides/configuration.html
More model management APIs
All model management CLI commands and the Yatai client Python API now support the yatai_url parameter, making it easy to interact with a remote YataiService to centrally manage all your BentoML-packaged ML models.
Support bundling zipimport modules #1261
Bundling zipimport modules with BentoML is now possible with this newly added API:
```python
import bentoml

@bentoml.env(zipimport_archives=['nested_zipmodule.zip'])
@bentoml.artifacts([SklearnModelArtifact('model')])
class IrisClassifier(bentoml.BentoService):
    ...
```
BentoML also manages sys.path when loading a saved BentoService with zipimport archives, making sure the zip modules can be imported in user code.
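The sys.path bookkeeping relies on Python's built-in zipimport support: adding a zip archive to sys.path makes the modules inside it importable. A minimal stdlib-only sketch of that mechanism (the archive and module names are illustrative, mirroring the nested_zipmodule.zip example above):

```python
import os
import sys
import tempfile
import zipfile

# Create a zip archive containing a tiny module, standing in for a
# zipimport archive bundled with a BentoService.
tmpdir = tempfile.mkdtemp()
archive_path = os.path.join(tmpdir, "nested_zipmodule.zip")
with zipfile.ZipFile(archive_path, "w") as zf:
    zf.writestr("zipped_helper.py", "def greet():\n    return 'hello from zip'\n")

# Putting the archive on sys.path is all zipimport needs; this is the kind
# of path management BentoML performs automatically when loading a bundle.
sys.path.insert(0, archive_path)

import zipped_helper
print(zipped_helper.greet())  # hello from zip
```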
Announcements
Monthly Community Meeting
Thank you again for everyone coming to the first community meeting this week! If you are not invited to the community meeting calendar yet, make sure to join it here: https://github.com/bentoml/BentoML/discussions/1396
Hiring
The BentoML team is hiring for multiple Software Engineer roles to help build the future of this open-source project and the business behind it. We are looking for people with experience in one of the following areas: ML infrastructure, backend systems, data engineering, SRE, full-stack development, and technical writing. Feel free to pass this along to anyone you know who might be interested; we'd really appreciate it!