BentoML-0.12.0


Detailed Changelog: v0.11.0...v0.12.0

New Features

  • Breaking Change: Default Model Worker count is now set to one #1454

    • Please use the --worker CLI argument to specify the number of workers for your deployment
    • For heavy production workloads, we recommend experimenting with different worker counts and benchmarking your BentoML API server on your target hardware to better understand model server performance
  • Breaking Change: Micro-batching layer(Marshal Server) is now enabled by default #1498

    • For Inference APIs defined with batch=True, this enables micro-batching behavior when serving; it can be disabled with the --disable-microbatch flag (see the sketch below)
    • For Inference APIs with batch=False, API requests are now queued in the Marshal server and then forwarded to the model backend server
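As a rough illustration (not taken from the release itself; the sklearn artifact and service names are placeholders), an Inference API defined with batch=True looks like this and will be served behind the micro-batching layer:

from bentoml import BentoService, api, artifacts, env
from bentoml.adapters import DataframeInput
from bentoml.frameworks.sklearn import SklearnModelArtifact

@env(infer_pip_packages=True)
@artifacts([SklearnModelArtifact('model')])  # placeholder artifact
class IrisClassifier(BentoService):

    @api(input=DataframeInput(), batch=True)
    def predict(self, df):
        # with micro-batching enabled, `df` may contain rows merged from
        # several queued HTTP requests; return one prediction per row
        return self.artifacts.model.predict(df)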
  • New: Use non-root user in BentoML's API server docker image

  • New: API/CLI for bulk delete of BentoML bundle in Yatai #1313

  • Easier dependency management for PyPI and conda

    • Support all pip install options via a user-provided requirements.txt file
    • Breaking Change: when the requirements_txt_file option is in use, all other pip package options will be ignored
    • New conda_override_channels option for using an explicit conda channel for conda dependencies (see the sketch below): https://docs.bentoml.org/en/latest/concepts.html#conda-packages
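A minimal sketch of those conda options (the channel and package names below are placeholders, assuming the conda_channels/conda_dependencies parameters described in the linked docs):

import bentoml

@bentoml.env(
    conda_channels=['conda-forge'],         # placeholder channel
    conda_dependencies=['lightgbm=3.1.1'],  # placeholder package
    conda_override_channels=True,  # resolve only from the channels listed above
)
class MyService(bentoml.BentoService):
    ...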

  • Better support for pip install options and remote Python dependencies #1421
  1. Let BentoML do it for you:
@bentoml.env(infer_pip_packages=True)
  2. Use the existing "pip_packages" API to specify a list of dependencies:
@bentoml.env(
    pip_packages=[
      'scikit-learn',
      'pandas @https://github.com/pypa/pip/archive/1.3.1.zip',
    ]
)
  3. Use a requirements.txt file to specify all dependencies:
@bentoml.env(requirements_txt_file='./requirements.txt')

In the ./requirements.txt file, all pip install options can be used:

#
# These requirements were autogenerated by pipenv
# To regenerate from the project's Pipfile, run:
#
#    pipenv lock --requirements
#

-i https://pypi.org/simple

scikit-learn==0.20.3
aws-sam-cli==0.33.1
psycopg2-binary
azure-cli
bentoml
pandas @https://github.com/pypa/pip/archive/1.3.1.zip

# extra index URLs (e.g. a private PyPI mirror with credentials) are supported:
--extra-index-url https://[username[:password]@]pypi.company.com/simple
--extra-index-url https://user:he%2F%2Fo@pypi.company.com

git+https://myvcs.com/some_dependency@sometag#egg=SomeDependency
  • API/CLI for bulk delete #1313

CLI command for delete:

# Delete all saved Bento with specific name
bentoml delete --name IrisClassifier
bentoml delete --name IrisClassifier -y # do it without confirming with user
bentoml delete --name IrisClassifier --yatai-url=yatai.mycompany.com # delete in remote Yatai

# Delete all saved Bento with specific labels
bentoml delete --labels "env=dev"
bentoml delete --labels "env=dev, user=foobar"
bentoml delete --labels "key1=value1, key2!=value2, key3 In (value3, value3a), key4 DoesNotExist"

# Delete multiple saved Bento by their name:version tag
bentoml delete --tag "IrisClassifier:v1, MyService:v3, FooBar:20200103_Lkj81a"

# Delete all
bentoml delete --all

Yatai Client Python API:

from bentoml.yatai.client import get_yatai_client

yc = get_yatai_client() # local Yatai
yc = get_yatai_client('remote.yatai.com:50051') # remote Yatai

yc.repository.delete(prune, labels, bento_tag, bento_name, bento_version, require_confirm)

"""
Params:
prune: boolean, Set true to delete all bento services
bento_tag: Bento tag
labels: string, label selector to filter bento services to delete
bento_name: string 
bento_version: string, 
require_confirm: boolean require user confirm interactively in CLI
"""
  • #1334 Customize the route of an API endpoint
from bentoml import BentoService, api, artifacts, env
from bentoml.adapters import DataframeInput

@env(infer_pip_packages=True)
@artifacts([...])
class MyPredictionService(BentoService):

    @api(route="/my_url_route/foo/bar", batch=True, input=DataframeInput())
    def predict(self, df):
        # instead of "/predict", the URL for this API endpoint will be "/my_url_route/foo/bar"
        ...
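A hedged client-side sketch of calling that custom route (assuming the API server runs locally on the dev server's default port 5000; the JSON payload is a placeholder):

import requests

# DataframeInput accepts a JSON list of records; the columns depend on your model
resp = requests.post(
    'http://localhost:5000/my_url_route/foo/bar',
    json=[{'feature_a': 1.0, 'feature_b': 2.0}],
)
print(resp.status_code, resp.text)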
  • #1416 Support custom authentication header in Yatai gRPC server
  • #1284 Add health check endpoint to Yatai web server
  • #1409 Fix Postgres disconnect issue with Yatai server
