github bentoml/BentoML v0.10.0
BentoML-0.10.0


New Features & Improvements

New Python APIs for managing BentoService bundles with the Yatai client:

from bentoml.yatai.client import get_yatai_client

bento_service.save() # Save and register the bento service locally

# Push the bento service to a remote Yatai service
yc = get_yatai_client('http://staging.yatai.mycompany.com:50050')
yc.repository.push(
    f'{bento_service.name}:{bento_service.version}',
) 

# Pull bento service from remote yatai server and register locally
yc = get_yatai_client('http://staging.yatai.mycompany.com:50050')
yc.repository.pull(
    'bento_name:version',
)

# Delete from the local Yatai repository
yatai_client = get_yatai_client()
yatai_client.repository.delete('name:version')

# Delete in batch by label selector
yatai_client = get_yatai_client()
yatai_client.prune(labels='cicd=failed, framework In (sklearn, xgboost)')

# Get bento service metadata
yatai_client.repository.get('bento_name:version', yatai_url='http://staging.yatai.mycompany.com:50050')

# List bento services by label
yatai_client.repository.list(labels='label_key In (value1, value2), label_key2 Exists', yatai_url='http://staging.yatai.mycompany.com:50050')
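
The label selectors used by `prune` and `list` above follow a Kubernetes-style grammar (equality, `In (...)`, `Exists`). As a rough illustration of how such selectors are evaluated, here is a minimal, hypothetical matcher sketch; it is not BentoML's implementation:

```python
import re

def matches(labels, selector):
    """Check a label dict against a Kubernetes-style selector string.
    Supports 'k=v', 'k In (a, b)' and 'k Exists' clauses joined by
    top-level commas (commas inside parentheses belong to the In list)."""
    # Split on commas that are not inside parentheses.
    clauses = re.split(r",\s*(?![^()]*\))", selector)
    for clause in clauses:
        clause = clause.strip()
        if m := re.fullmatch(r"(\w+)\s+In\s+\(([^)]*)\)", clause):
            values = [v.strip() for v in m.group(2).split(",")]
            if labels.get(m.group(1)) not in values:
                return False
        elif m := re.fullmatch(r"(\w+)\s+Exists", clause):
            if m.group(1) not in labels:
                return False
        elif m := re.fullmatch(r"(\w+)=(\w+)", clause):
            if labels.get(m.group(1)) != m.group(2):
                return False
        else:
            return False  # unrecognized clause
    return True

print(matches({"cicd": "failed", "framework": "sklearn"},
              "cicd=failed, framework In (sklearn, xgboost)"))  # True
```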

New CLI commands for model management:
Push local bento service to remote yatai service:

$ bentoml push bento_service_name:version --yatai-url http://staging.yatai.mycompany.com:50050

Added a --yatai-url option to the following CLI commands, allowing them to interact with a remote yatai service directly:

bentoml get
bentoml list
bentoml delete
bentoml retrieve
bentoml run
bentoml serve
bentoml serve-gunicorn
bentoml info
bentoml containerize
bentoml open-api-spec
  • Model Metadata API #1179, shoutout to @jackyzha0 for designing and building this feature!
    Adds the ability to save additional metadata for any artifact type, e.g.:
    model_metadata = {
        'k1': 'v1',
        'job_id': 'ABC',
        'score': 0.84,
        'datasets': ['A', 'B'],
    }
    svc.pack("model", test_model, metadata=model_metadata)

    svc.save_to_dir(str(tmpdir))
    loaded_service = bentoml.load(str(tmpdir))
    print(loaded_service.artifacts.get('model').metadata)
  • Improved Tensorflow Support, by @bojiang

    • Make the packed model behave the same as after the model was saved and loaded again #1231
    • TfTensorOutput raise TypeError when micro-batch enabled #1251
    • Optimized auto-casting of TfSavedModelArtifact & clearer feedback
    • Improve KerasModelArtifact to work with tf2 #1295
  • Automated AWS EC2 deployment #1160 massive 3800+ line PR by @mayurnewase

  • Add MXNet Gluon support #1264 by @liusy182

  • Enable input & output data capture in Sagemaker deployment #1189 by @j-hartshorn

  • Faster docker image rebuild when only model artifacts are updated #1199

  • Support URL location prefix in yatai-service gRPC/Web server #1063 #1184

  • Support relative path for showing Swagger UI page in the model server #1207

  • Add onnxruntime gpu as supported backend #1213

  • Add option to disable swagger UI #1244 by @liusy182

  • Add label and artifact metadata display to yatai web ui #1249

  • Make bentoml module executable #1274

python -m bentoml <subcommand>
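
Being executable via `python -m` means the bentoml package ships a `__main__.py` that dispatches to the CLI. A generic sketch of that standard Python mechanism, using a throwaway `demo_cli` package rather than BentoML itself:

```python
import os
import runpy
import sys
import tempfile

# Build a throwaway package with a __main__.py to show the mechanism
# behind `python -m <package>`.
pkg_root = tempfile.mkdtemp()
os.makedirs(os.path.join(pkg_root, "demo_cli"))
open(os.path.join(pkg_root, "demo_cli", "__init__.py"), "w").close()
with open(os.path.join(pkg_root, "demo_cli", "__main__.py"), "w") as f:
    f.write("message = 'demo_cli invoked as a module'\n")

sys.path.insert(0, pkg_root)
# runpy.run_module mirrors what `python -m demo_cli` does under the hood:
# it locates demo_cli.__main__ and executes it as __main__.
namespace = runpy.run_module("demo_cli", run_name="__main__")
print(namespace["message"])
```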
  • Allow setting micro batching parameters from CLI #1282 by @jsemric
bentoml serve-gunicorn --enable-microbatch --mb-max-latency 3333 --mb-max-batch-size 3333 IrisClassifier:20201202154246_C8DC0A
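
The two micro-batching knobs trade latency for throughput: the server buffers incoming requests until either --mb-max-batch-size requests have accumulated or --mb-max-latency milliseconds have elapsed, then flushes the batch to the model. A simplified, hypothetical sketch of that flush policy, not BentoML's actual batching server:

```python
import time
from queue import Queue, Empty

def collect_batch(q, max_batch_size, max_latency_ms):
    """Collect requests into a batch, flushing when either the size cap
    or the latency budget is hit (the trade-off that --mb-max-batch-size
    and --mb-max-latency control, sketched generically)."""
    batch = []
    deadline = time.monotonic() + max_latency_ms / 1000.0
    while len(batch) < max_batch_size:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break  # latency budget exhausted: flush what we have
        try:
            batch.append(q.get(timeout=remaining))
        except Empty:
            break  # no more requests arrived within the budget
    return batch

requests = Queue()
for i in range(5):
    requests.put(i)
print(collect_batch(requests, max_batch_size=3, max_latency_ms=50))  # [0, 1, 2]
```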

Bug fixes

  • Allow deleting bento that was previously deleted with the same name and version #1211
  • Construct docker API client from env #1233
  • Pin-down SqlAlchemy version #1238
  • Avoid potential TypeError in batching server #1252
  • Fix inference API docstring override by default #1302

Documentation

  • Add examples of queries with requests for adapters #1202
  • Update import paths to reflect fastai2->fastai rename #1227
  • Add model artifact metadata information to the core concept page #1259
  • Update adapters.rst to include new input adapters #1269
  • Update quickstart guide #1262
  • Docs for gluon support #1271
  • Fix CURL commands for posting files in input adapters doc string #1307

Internal, CI, and Tests

  • Fix installing bundled pip dependencies in Azure and Sagemaker deployments #1214 (affects bentoml developers only)
  • Add Integration test for Fasttext #1221
  • Add integration test for spaCy #1236
  • Add integration test for models using tf native API #1245
  • Add tests for run_api_server_docker_container microbatch #1247
  • Add integration test for LightGBM #1243
  • Update Yatai web ui node dependencies version #1256
  • Add integration test for bento management #1263
  • Add yatai server integration tests to Github CI #1265
  • Update e2e yatai service tests #1266
  • Include additional information for EC2 test #1270
  • Refactor CI for TensorFlow2 #1277
  • Make tensorflow integration tests run faster #1278
  • Fix overridden protobuf version in CI #1286
  • Add integration test for tf1 #1285
  • Refactor yatai service integration test #1290
  • Refactor Saved Bundle Loader #1291
  • Fix flaky yatai service integration tests #1298
  • Refine KerasModelArtifact & its integration test #1295
  • Improve API server integration tests #1299
  • Add integration tests for ragged_tensor #1303

Announcements

  • We have started using Github Projects feature to track roadmap items for BentoML, you can find it here: https://github.com/bentoml/BentoML/projects/1
  • We are hiring senior engineers and a lead developer advocate to join our team, let us know if you or someone you know might be interested 👉 contact@bentoml.ai
  • Apologies for the long wait between the 0.9 and 0.10 releases; we are getting back to our bi-weekly release schedule now! We need help with documenting new features, writing release notes, and QA-ing new releases before they go out, so let us know if you'd be interested in helping out!

Thank you everyone for contributing to this release! @j-hartshorn @withsmilo @yubozhao @bojiang @changhw01 @mayurnewase @telescopic @jackyzha0 @pncnmnp @kishore-ganesh @rhbian @liusy182 @awalvie @cathy-kim @jsemric 🎉🎉🎉
