opendatalab/MinerU: mineru-2.6.3-released

What's Changed

  • 2025/10/31 2.6.3 Release
    • Added support for a new backend, vlm-mlx-engine, enabling MLX-accelerated inference for the MinerU2.5 model on Apple Silicon devices. Compared to the vlm-transformers backend, vlm-mlx-engine delivers a 100%–200% speed improvement (a usage sketch follows this list).
    • Bug fixes: #3849, #3859
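
A minimal usage sketch for the new backend, assuming MinerU 2.6.3 is installed on an Apple Silicon machine and that the `mineru` CLI's `-p` (input path), `-o` (output directory), and `-b` (backend) options behave as in earlier 2.x releases; the input file name is a placeholder:

```python
# Sketch only: invoke the MinerU CLI with the new MLX-accelerated backend.
# Assumptions: MinerU 2.6.3 on Apple Silicon; CLI options -p/-o/-b as in the
# 2.x documentation; "document.pdf" is a placeholder input file.
import subprocess

subprocess.run(
    [
        "mineru",
        "-p", "document.pdf",     # input PDF to parse (placeholder)
        "-o", "output",           # output directory
        "-b", "vlm-mlx-engine",   # new backend introduced in this release
    ],
    check=True,
)
```
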
| Parsing Backend | pipeline (Accuracy¹ 82+) | vlm-transformers (Accuracy¹ 90+) | vlm-mlx-engine (Accuracy¹ 90+) | vlm-vllm-engine / vlm-vllm-async-engine (Accuracy¹ 90+) | vlm-http-client (Accuracy¹ 90+) |
| --- | --- | --- | --- | --- | --- |
| Backend Features | Fast, no hallucinations | Good compatibility, but slower | Faster than transformers | Fast, compatible with the vLLM ecosystem | Suitable for OpenAI-compatible servers⁵ |
| Operating System | Linux² / Windows / macOS | Linux² / Windows / macOS | macOS³ | Linux² / Windows⁴ | Any |
| CPU inference support | ✅ | ✅ | ❌ | ❌ | Not required |
| GPU Requirements | Volta or later architectures, 6 GB VRAM or more, or Apple Silicon | Volta or later architectures, 6 GB VRAM or more, or Apple Silicon | Apple Silicon | Volta or later architectures, 8 GB VRAM or more | Not required |
| Memory Requirements | Minimum 16 GB, 32 GB recommended | Minimum 16 GB, 32 GB recommended | Minimum 16 GB, 32 GB recommended | Minimum 16 GB, 32 GB recommended | 8 GB |
| Disk Space Requirements | 20 GB or more, SSD recommended | 20 GB or more, SSD recommended | 20 GB or more, SSD recommended | 20 GB or more, SSD recommended | 2 GB |
| Python Version | 3.10-3.13 | 3.10-3.13 | 3.10-3.13 | 3.10-3.13 | 3.10-3.13 |

¹ Accuracy is the End-to-End Evaluation Overall score on OmniDocBench (v1.5), measured with the latest MinerU version.
² Linux support covers only distributions released in 2019 or later.
³ MLX requires macOS 13.5 or later; macOS 14.0 or later is recommended.
⁴ Windows vLLM support is provided via WSL2 (Windows Subsystem for Linux).
⁵ Any server compatible with the OpenAI API, such as a local or remote model service deployed with an inference framework like vLLM, SGLang, or LMDeploy.
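
For the http-client scenario in footnote 5, a hedged two-part sketch: exposing the MinerU2.5 model through vLLM's OpenAI-compatible server, then parsing against it from a machine with no local GPU. The model identifier, port, and the `-u` server-URL option are assumptions drawn from the MinerU documentation and may differ in your deployment:

```python
# Sketch only, not a verified recipe.
# Assumptions: the Hugging Face model id "opendatalab/MinerU2.5-2509-1.2B",
# port 30000, and a -u (server URL) option on the mineru CLI; check the
# MinerU docs for the exact names in your version.
import subprocess

# 1) On the GPU server: launch an OpenAI-compatible endpoint with vLLM.
server = subprocess.Popen(
    ["vllm", "serve", "opendatalab/MinerU2.5-2509-1.2B", "--port", "30000"]
)

# 2) On any client machine: parse a document against the remote server.
subprocess.run(
    [
        "mineru",
        "-p", "document.pdf",            # placeholder input file
        "-o", "output",                  # output directory
        "-b", "vlm-http-client",         # no local GPU or model weights needed
        "-u", "http://127.0.0.1:30000",  # assumed server-URL option
    ],
    check=True,
)
```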

New Contributors

Full Changelog: mineru-2.6.2-released...mineru-2.6.3-released
