BentoML - v1.0.18

🍱 BentoML v1.0.18 brings a new way to create servers and clients natively from Python.

  • Start an HTTP or gRPC server and client asynchronously with a context manager (a gRPC variant is sketched after this list).

    import numpy as np

    from bentoml import HTTPServer

    server = HTTPServer("iris_classifier:latest", production=True, port=3000)

    # Start the server in a separate process and connect to it using a client
    with server.start() as client:
        res = client.classify(np.array([[4.9, 3.0, 1.4, 0.2]]))
  • Start an HTTP or gRPC server synchronously.

    from bentoml import HTTPServer

    server = HTTPServer("iris_classifier:latest", production=True, port=3000)
    server.start(blocking=True)
  • As always, a client can be created and connected to a running server.

    import numpy as np
    from bentoml.client import Client
    client = Client.from_url("http://localhost:3000")
    res = client.classify(np.array([[4.9, 3.0, 1.4, 0.2]]))
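
For the gRPC option mentioned above, here is a minimal sketch. It is not taken from the release notes and assumes a GrpcServer class importable from bentoml that mirrors the HTTPServer interface shown in the examples:

    # Minimal sketch, assuming bentoml exports GrpcServer with the same
    # start()/client interface as HTTPServer above.
    import numpy as np

    from bentoml import GrpcServer

    server = GrpcServer("iris_classifier:latest", production=True, port=3000)

    # Start the gRPC server in a separate process and connect with a client
    with server.start() as client:
        res = client.classify(np.array([[4.9, 3.0, 1.4, 0.2]]))

Under the same assumption, the blocking form server.start(blocking=True) should apply to the gRPC server as well.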

What's Changed

  • chore(deps): bump coverage[toml] from 7.2.2 to 7.2.3 by @dependabot in #3746
  • bugs: Fix an f-string bug in Transformers framework. by @ssheng in #3753
  • chore(deps): bump pytest from 7.2.2 to 7.3.0 by @dependabot in #3751
  • chore(deps): bump bufbuild/buf-setup-action from 1.16.0 to 1.17.0 by @dependabot in #3750
  • fix: BufferError when pushing model to BentoCloud by @aarnphm in #3737
  • chore: remove codecov dependencies by @aarnphm in #3754
  • feat: implement new serve API by @sauyon in #3696
  • examples: Add a client example to quickstart by @ssheng in #3752

Full Changelog: v1.0.17...v1.0.18
