openvinotoolkit/openvino 2020.4

Release notes: https://software.intel.com/content/www/us/en/develop/articles/openvino-relnotes.html

What's New

  • Improves performance while maintaining accuracy close to full precision (for example, the FP32 data type) by introducing support for the bfloat16 (BF16) data type for inference on 3rd generation Intel® Xeon® Scalable processors (formerly code-named Cooper Lake); see the capability-check sketch after this list.
  • Increases accuracy by extending the Post-Training Optimization Tool to support mixed-precision quantization, which allows layers to be quantized at different bit widths.
  • Broadens model compatibility by enabling the Inference Engine to read the Open Neural Network Exchange (ONNX*) model format directly.
    • To take full advantage of the Intel® Distribution of OpenVINO™ toolkit, the recommended workflow is still to feed the Inference Engine the Intermediate Representation produced by the Model Optimizer.
    • Users who already have a model converted to the ONNX format (for example, exported from PyTorch with torch.onnx) can now pass it directly to the Inference Engine to run on Intel® architecture; see the sketch after this list.
  • Adds initial support for TensorFlow* 2.2.0 for computer vision use cases.
  • Extends the Deep Learning Workbench with remote profiling, enabling users to connect to and profile multiple remote hosts and to collect and store the data in one place for further analysis.
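
A minimal sketch of how an application might check for native bfloat16 support before relying on it, using the Inference Engine Python API shipped with this release. The "BF16" capability string is reported only on supporting hardware, and this script is an illustration rather than part of the release itself.

```python
# Sketch: query the CPU plugin's optimization capabilities and look for BF16.
from openvino.inference_engine import IECore

ie = IECore()
capabilities = ie.get_metric("CPU", "OPTIMIZATION_CAPABILITIES")
print("CPU optimization capabilities:", capabilities)

if "BF16" in capabilities:
    print("Native bfloat16 inference is available on this host.")
else:
    print("Native bfloat16 inference is not reported on this host.")
```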
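A minimal sketch of the direct ONNX path described above, assuming the Inference Engine Python API of this release; the ResNet-18 model, file name, and input shape are placeholders chosen for illustration.

```python
# Sketch: export a PyTorch model to ONNX and run it directly with the
# Inference Engine -- no Model Optimizer / IR conversion step.
import numpy as np
import torch
import torchvision
from openvino.inference_engine import IECore

# Export any torch.nn.Module to ONNX; a torchvision model is used as a stand-in.
model = torchvision.models.resnet18(pretrained=True).eval()
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, "resnet18.onnx")

# Read the ONNX file directly and load it onto the CPU plugin.
ie = IECore()
net = ie.read_network(model="resnet18.onnx")
exec_net = ie.load_network(network=net, device_name="CPU")

# Run inference on random data; real code would feed preprocessed images.
input_name = next(iter(net.input_info))
result = exec_net.infer(inputs={input_name: np.random.rand(1, 3, 224, 224).astype(np.float32)})
print({name: out.shape for name, out in result.items()})
```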
