github bentoml/BentoML v0.8.2
BentoML-0.8.2


What's New?

  • Support Debian-slim Docker images for containerizing the model server, #822 by @jackyzha0. Users can opt in with:

    @env(
        auto_pip_dependencies=True,
        docker_base_image="bentoml/model-server:0.8.2-slim-py37"
    )
  • New bentoml retrieve command for downloading a saved bundle from a remote YataiService model registry, #810 by @iancoffey

    bentoml retrieve ModelServe:20200610145522_D08399 --target_dir /tmp/modelserve
  • Added a --print-location option to the bentoml get command to print the saved bundle path, #825 by @jackyzha0

     $ bentoml get IrisClassifier:20200625114130_F3480B --print-location
     /Users/chaoyu/bentoml/repository/IrisClassifier/20200625114130_F3480B
  • Support the JSON orient parameter for DataFrame input. DataframeInput now supports all pandas JSON orient options: records, columns, values, split, index. #809, #815 by @bojiang

    For example, with orient="records":

    @api(input=DataframeInput(orient="records"))
    def predict(self, df):
         ...

    The API endpoint will expect an HTTP request with a JSON payload in the following format:

    [{"col 1":"a","col 2":"b"},{"col 1":"c","col 2":"d"}]

    Or with orient="index":

    '{"row 1":{"col 1":"a","col 2":"b"},"row 2":{"col 1":"c","col 2":"d"}}'

    See the pandas documentation on the orient option of the to_json/read_json functions for more detail: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_json.html
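The two payload shapes above encode the same table; a minimal sketch (stdlib only, no pandas required) of how the records and index orients decode to the same rows:

```python
import json

# The same 2x2 table under two pandas JSON orients.
records = '[{"col 1":"a","col 2":"b"},{"col 1":"c","col 2":"d"}]'
index = '{"row 1":{"col 1":"a","col 2":"b"},"row 2":{"col 1":"c","col 2":"d"}}'

# orient="records": a list of row objects; row labels are implicit.
rows_from_records = json.loads(records)

# orient="index": a mapping from row label to row object.
rows_from_index = list(json.loads(index).values())

# Both decode to the same sequence of rows.
assert rows_from_records == rows_from_index
```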

  • Support Azure Functions deployment (beta). A new fully automated cloud deployment option that BentoML provides in addition to AWS SageMaker and AWS Lambda. See usage documentation here: https://docs.bentoml.org/en/latest/deployment/azure_functions.html

  • ModelServer API Swagger schema improvements, including the ability to specify an example HTTP request, #807 by @Korusuke

  • Add prediction logging when deploying with AWS Lambda, #790 by @jackyzha0

  • Artifact string name validation, #817 by @AlexDut
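A hypothetical sketch of what name validation of this sort typically looks like (the regex and function name below are assumptions for illustration, not BentoML's actual implementation):

```python
import re

# Hypothetical rule: names limited to letters, digits, and underscores,
# starting with a letter or underscore (a common Python-identifier-style rule).
_NAME_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def validate_artifact_name(name: str) -> None:
    """Raise ValueError if the artifact name is not a valid identifier."""
    if not _NAME_RE.match(name):
        raise ValueError(f"Invalid artifact name: {name!r}")
```

Rejecting bad names at save time surfaces the error early, rather than at load or deployment time.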

  • Fixed micro-batching parameters (max latency and max batch size) not being applied, #818 by @bojiang

  • Fixed handling of CSV file input by following RFC 4180. #814 by @bojiang
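RFC 4180 is the behavior Python's own csv module implements: fields containing commas, double quotes, or line breaks must be quoted, and embedded quotes are escaped by doubling. A small stdlib illustration of the cases such a fix has to handle:

```python
import csv
import io

# CRLF record separators, a quoted field with an escaped quote and a comma,
# and a quoted field containing an embedded newline -- all valid per RFC 4180.
raw = 'name,comment\r\nalice,"said ""hi"", then left"\r\nbob,"line1\nline2"\r\n'

rows = list(csv.reader(io.StringIO(raw)))

assert rows[0] == ["name", "comment"]
assert rows[1] == ["alice", 'said "hi", then left']
assert rows[2] == ["bob", "line1\nline2"]
```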

  • Fixed TfTensorOutput casting floats as ints (#813), in #823 by @bojiang
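The bug class behind the last fix is worth illustrating: coercing tensor values to int before serialization silently truncates fractional predictions. A minimal stdlib sketch (not BentoML's actual code):

```python
import json

scores = [0.25, 0.9, 1.0]  # e.g. model prediction probabilities

# Buggy pattern: an int coercion truncates the fractional part.
buggy = json.dumps([int(v) for v in scores])    # '[0, 0, 1]'

# Correct: serialize the values as floats.
fixed = json.dumps([float(v) for v in scores])  # '[0.25, 0.9, 1.0]'
```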

Announcements:

  • The BentoML team has created a new mailing list for future announcements and community-related discussions. Join now here!
  • For those interested in contributing to BentoML, there are new contributing docs; be sure to check them out.
  • We are starting a bi-weekly community meeting where community members can demo new features they are building, discuss the roadmap, and gather feedback. More details will be announced soon.
