ONNX v1.15.0


ONNX v1.15.0 is now available with exciting new features! We would like to thank everyone who contributed to this release! Please visit onnx.ai to learn more about ONNX and associated projects.

Key Updates

ai.onnx opset version increased to 20 with the following changes:

  • New Operators (ai.onnx):

    • ImageDecoder: a new operator for decoding images, intended for use in preprocessing models
    • RegexFullMatch: a new operator for regex matching, commonly used in feature preprocessing
    • StringConcat: takes two string tensors as input and returns the elementwise concatenation of the strings in each tensor
    • StringSplit: takes a string tensor as input and splits each element based on a delimiter attribute and a maxsplit attribute
    • AffineGrid: generates a 2D or 3D flow field (sampling grid), given a batch of affine matrices theta
    • Gelu: applies the Gaussian error linear unit function, or its approximation, to the input
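To illustrate what the new Gelu operator computes, here is a minimal plain-Python sketch of the exact GELU function and the tanh approximation (in the ONNX operator, the variant is selected via the `approximate` attribute); this is an illustration of the math, not the ONNX implementation itself:

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # The common tanh approximation of GELU.
    inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)
    return 0.5 * x * (1.0 + math.tanh(inner))
```

For large positive inputs GELU approaches the identity, and for large negative inputs it approaches zero, which is why it behaves like a smooth ReLU.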
ai.onnx.ml opset version increased to 4 with the following changes:

  • Operator Updates (ai.onnx.ml):
    • LabelEncoder adds keys_as_tensor and values_as_tensor attributes

New functionality:

  • Enable empty list of values as attribute PR#5559
  • Update diff backend node tests for auto update doc PR#5604
  • Enable pylint checks with Ruff and remove pylint from lintrunner PR#5589
  • Treat inf/-inf as float literals PR#5528
  • Create the onnxtxt serialization format PR#5524
  • Support JSON as a serialization target PR#5523
  • Support for parsing and printing empty list value as attribute PR#5516
  • Add auto update doc pipeline to help developers update docs PR#5450
  • Implement GELU as function op PR#5277
  • Integrate function-inlining with version-conversion PR#5211
  • Extend function type inference to handle missing optional parameters PR#5169
  • Create repr functions for OpSchema PR#5117
  • Utility to inline model-local functions PR#5105
  • Faster reference implementation for operator Conv based on im2col PR#5069
  • Support textproto as a serialization format PR#5112
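The im2col trick behind the faster Conv reference implementation (PR#5069) rewrites convolution as a matrix product: each sliding window of the input is flattened into a row, so the whole convolution becomes one dense multiply. A minimal single-channel, stride-1, no-padding sketch (not the actual ONNX code):

```python
def im2col(img, kh, kw):
    # Flatten each kh x kw patch of a 2-D input into one row.
    h, w = len(img), len(img[0])
    rows = []
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            rows.append([img[i + di][j + dj]
                         for di in range(kh) for dj in range(kw)])
    return rows

def conv2d_im2col(img, kernel):
    # Convolution as a dot product between each patch row and the
    # flattened kernel.
    kh, kw = len(kernel), len(kernel[0])
    flat_k = [v for row in kernel for v in row]
    vals = [sum(p * k for p, k in zip(patch, flat_k))
            for patch in im2col(img, kh, kw)]
    out_w = len(img[0]) - kw + 1
    # Reshape the flat result back into the output feature map.
    return [vals[r * out_w:(r + 1) * out_w]
            for r in range(len(vals) // out_w)]
```

Trading memory (the duplicated patches) for a single large matrix multiply is what makes this formulation fast in practice.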

ONNX now supports serializing to JSON and Text Proto, as well as the ONNX text representation.

Users can now serialize the model proto to a text format by using a supported file extension or by supplying the format argument to save_model.

For example:

# model: onnx.ModelProto
onnx.save_model(model, "model.json")

will save the model as a JSON file.
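The idea of choosing a serializer from the file extension, with an explicit format argument taking precedence, can be sketched as follows (the registry and serializer names here are hypothetical, for illustration only):

```python
import os

# Hypothetical serializers keyed by format name; real ones would emit
# protobuf bytes, JSON, or textproto.
SERIALIZERS = {
    "protobuf": lambda m: f"<binary proto of {m}>",
    "json": lambda m: f'{{"model": "{m}"}}',
    "textproto": lambda m: f"model: {m}",
}

# Map file extensions to format names; unknown extensions fall back
# to the binary protobuf default.
EXTENSION_TO_FORMAT = {
    ".onnx": "protobuf",
    ".json": "json",
    ".textproto": "textproto",
}

def serialize(model, path, fmt=None):
    # An explicit format argument overrides the file extension.
    if fmt is None:
        ext = os.path.splitext(path)[1]
        fmt = EXTENSION_TO_FORMAT.get(ext, "protobuf")
    return SERIALIZERS[fmt](model)
```

This dispatch pattern keeps the save API a single entry point while letting each format live behind its own serializer.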

Shape inference enhancements

  • [Spec] output_shape for ConvTranspose should not have batch and channels PR#5400
  • Infer rank where reshape shape is inferred PR#5327

Bug fixes and infrastructure improvements

  • Do not use LFS64 on non-glibc Linux PR#5669
  • [Web] Use tensor_dtype_to_np_dtype instead of deprecated function PR#5593
  • Reject absolute path when saving external data PR#5566
  • Support Python editable builds PR#5558
  • Test onnxruntime 1.15 with opset 19/IR 9 and fix test source distribution PR#5376
  • Support float 8 initializers in ReferenceEvaluator PR#5295
  • Fix check_tensor to work with large models on UNIX PR#5286
  • Fix check_tensor to work with large models on Windows PR#5227
  • Transpose scalar shape inference PR#5204
  • Enable RUFF as a formatter PR#5176
  • Correct AveragePool kernel shape in dilation test case PR#5158
  • Fix type constraints of Reshape(19) PR#5146
  • Add github action to check urls are valid PR#5434
  • Introduce optional cpplint in CI PR#5396
  • Test the serialization API with custom serializers PR#5315
  • [CI] Use ONNX Hub directly in test_model_zoo CI PR#5267
  • Clean up setup.py in favor of pyproject.toml PR#4879

Documentation updates

  • Merge the two contributing docs and create instructions for updating an op PR#5584
  • [Doc] Update README.md regarding Protobuf update and fix typo in Slice-13 spec PR#5435
  • Generate both onnx and onnx-ml operator docs when ONNX_ML=1 PR#5381
  • Publish md files under docs/ to the documentation site PR#5312
  • Update OpSchema docs to include new methods and classes PR#5297
  • Fix missing examples in documentation for ai.onnx.ml PR#5228
  • Modify OneHot operator explanation PR#5197
  • Update CIPipelines.md PR#5157
  • Extend python API documentation PR#5156
  • Update sphinx to create markdown pages for operators PR#5137

Installation

You can upgrade to the latest release using pip install onnx --upgrade or build from source following the README instructions.

python setup.py develop deprecation

Direct invocation of setup.py is deprecated, per https://setuptools.pypa.io/en/latest/deprecated/commands.html. To build ONNX, users should instead use:

# Editable installation
# Before: python setup.py develop
# Now
pip install -e .

# Build wheel
# Before: python setup.py bdist_wheel
# Now
pip install --upgrade build
python -m build .

Contributors

Thanks to these individuals for their contributions since the last release:
@adityagoel4512 @AlexandreEichenberger @andife @AtanasDimitrovQC @BowenBao @cbourjau @ClifHouck @guoyuhong @gramalingam @ilya-lavrenov @jantonguirao @jbachurski @jcwchen @justinchuby @leso-kn @linkerzhang @liqunfu @prasanthpul @slowlyideal @smk2007 @snnn @take-cheeze @xadupre @yuanyao-nv @zhenhuaw-me
