PaddleOCR ONNX Models

🔥 ONNX format models converted from PaddleOCR for easy deployment and testing

📖 Model Description

This repository contains ONNX format models converted from PaddleOCR, a practical ultra-lightweight OCR system. These models are optimized for production deployment and cross-platform compatibility.

📦 Model Files

  • PP-OCRv5_server_det_infer.onnx (84 MB): Text detection model, locates text regions in images
  • PP-OCRv5_server_rec_infer.onnx (81 MB): Text recognition model, recognizes text content
  • UVDoc_infer.onnx (30 MB): Document rectification model, corrects document perspective
  • PP-LCNet_x1_0_doc_ori_infer.onnx (6.5 MB): Document orientation detection model
  • PP-LCNet_x1_0_textline_ori_infer.onnx (6.5 MB): Text line orientation detection model
  • PP-OCRv5_server_rec_infer.yml (145 KB): Recognition model configuration file

Total size: ~208 MB
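
The .onnx files are the runnable models; the .yml file is a plain-text YAML configuration for the recognition model. Once the files are downloaded (see Quick Start below), it can be inspected with PyYAML, for example to check the recognition pre/post-processing settings. This is only a sketch; the exact keys depend on the export:

import yaml  # pip install pyyaml

# Load the recognition model's configuration and list its top-level sections
with open("models/PP-OCRv5_server_rec_infer.yml", "r", encoding="utf-8") as f:
    rec_config = yaml.safe_load(f)
print(list(rec_config.keys()))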

🚀 Quick Start

Installation

pip install huggingface_hub onnxruntime numpy pillow

Download Models

from huggingface_hub import hf_hub_download
import os

def download_paddleocr_models():
    """Download all PaddleOCR ONNX models"""
    model_files = [
        "PP-OCRv5_server_det_infer.onnx",
        "PP-OCRv5_server_rec_infer.onnx", 
        "UVDoc_infer.onnx",
        "PP-LCNet_x1_0_doc_ori_infer.onnx",
        "PP-LCNet_x1_0_textline_ori_infer.onnx",
        "PP-OCRv5_server_rec_infer.yml"
    ]
    
    cache_dir = "models"
    os.makedirs(cache_dir, exist_ok=True)
    
    for file in model_files:
        print(f"Downloading {file}...")
        hf_hub_download(
            repo_id="marsena/paddleocr-test",
            filename=file,
            local_dir=cache_dir
        )
    print("All models downloaded!")

# Download models
download_paddleocr_models()
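
After downloading, you can sanity-check each ONNX file by opening it with ONNX Runtime and printing its input/output signature. This is a quick sketch; the exact tensor names and shapes depend on the exported graphs, and the .yml configuration file is skipped because it is not an ONNX model:

import onnxruntime as ort

# Open each downloaded ONNX model and print its input/output names and shapes
onnx_files = [
    "PP-OCRv5_server_det_infer.onnx",
    "PP-OCRv5_server_rec_infer.onnx",
    "UVDoc_infer.onnx",
    "PP-LCNet_x1_0_doc_ori_infer.onnx",
    "PP-LCNet_x1_0_textline_ori_infer.onnx",
]

for name in onnx_files:
    session = ort.InferenceSession(f"models/{name}", providers=["CPUExecutionProvider"])
    inputs = [(i.name, i.shape) for i in session.get_inputs()]
    outputs = [(o.name, o.shape) for o in session.get_outputs()]
    print(f"{name}\n  inputs:  {inputs}\n  outputs: {outputs}")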

Basic Usage

import onnxruntime as ort
import numpy as np
from PIL import Image

# Load detection model
det_session = ort.InferenceSession("models/PP-OCRv5_server_det_infer.onnx")

# Load recognition model  
rec_session = ort.InferenceSession("models/PP-OCRv5_server_rec_infer.onnx")

# Your OCR pipeline implementation here...
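
As a concrete starting point, the sketch below runs just the detection model on a single image. It assumes the typical PP-OCR DB-style preprocessing (longest side limited to 960 px, height and width rounded to multiples of 32, ImageNet mean/std normalization); the input tensor name, normalization constants, and output layout should be verified against the model and the PaddleOCR configuration, and sample.jpg is a placeholder path:

import onnxruntime as ort
import numpy as np
from PIL import Image

det_session = ort.InferenceSession("models/PP-OCRv5_server_det_infer.onnx")

def preprocess(image_path, limit=960):
    """Resize so the longer side is at most `limit` and both sides are multiples of 32,
    then apply channel-wise ImageNet-style normalization (assumed PP-OCR detection preprocessing)."""
    img = Image.open(image_path).convert("RGB")
    w, h = img.size
    scale = min(limit / max(w, h), 1.0)
    new_w = max(int(round(w * scale / 32)) * 32, 32)
    new_h = max(int(round(h * scale / 32)) * 32, 32)
    img = img.resize((new_w, new_h))
    x = np.asarray(img, dtype=np.float32) / 255.0
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    x = (x - mean) / std
    return x.transpose(2, 0, 1)[np.newaxis]  # NCHW, batch size 1

x = preprocess("sample.jpg")
input_name = det_session.get_inputs()[0].name
prob_map = det_session.run(None, {input_name: x})[0]
print(prob_map.shape)  # expected: (1, 1, H, W) text probability map

# Next steps: binarize prob_map (e.g. threshold around 0.3), extract box contours,
# crop each region, and feed the crops to the recognition model.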

🏷️ Model Tags

  • Framework: ONNX
  • Task: Computer Vision, OCR
  • Language: Multi-language support
  • Domain: Text Detection, Text Recognition

🔧 Technical Details

Conversion Process

These models were converted from PaddlePaddle format to ONNX format for broader compatibility:

  1. Source: Original PaddleOCR models from PaddlePaddle Hub
  2. Conversion: PaddlePaddle → ONNX format (see the example command after this list)
  3. Optimization: Model optimization for inference speed
  4. Validation: Output consistency verification
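
For reference, this kind of conversion is usually performed with the paddle2onnx tool on an exported PaddlePaddle inference model. The command below is only a sketch: the directory and file names are placeholders, and the exact flags and opset version depend on the PaddleOCR export and the paddle2onnx version you use.

# Example: convert an exported PaddleOCR detection model to ONNX (paths are placeholders)
paddle2onnx --model_dir ./PP-OCRv5_server_det_infer \
            --model_filename inference.pdmodel \
            --params_filename inference.pdiparams \
            --save_file PP-OCRv5_server_det_infer.onnx \
            --opset_version 11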

System Requirements

  • Runtime: ONNX Runtime (see the execution-provider sketch after this list)
  • Python: 3.7+
  • Memory: Minimum 2GB RAM recommended
  • Platform: Cross-platform (Windows, Linux, macOS)
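
ONNX Runtime selects an execution provider per platform. The minimal sketch below prefers the CUDA provider when it is available (which requires the onnxruntime-gpu package) and falls back to CPU otherwise:

import onnxruntime as ort

# List the execution providers supported by this onnxruntime build
available = ort.get_available_providers()
print(available)  # e.g. ['CPUExecutionProvider'] or ['CUDAExecutionProvider', 'CPUExecutionProvider']

# Prefer CUDA when present, otherwise run on CPU
providers = (["CUDAExecutionProvider", "CPUExecutionProvider"]
             if "CUDAExecutionProvider" in available
             else ["CPUExecutionProvider"])

session = ort.InferenceSession("models/PP-OCRv5_server_det_infer.onnx", providers=providers)
print(session.get_providers())  # providers actually used by this session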

📄 License

These models are released under the Apache License 2.0, consistent with the original PaddleOCR project.

Original PaddleOCR License

Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

🙏 Acknowledgments

  • Original Project: PaddleOCR by PaddlePaddle Team
  • Framework: PaddlePaddle
  • Conversion Tools: ONNX ecosystem

📚 Citation

If you use these models in your research, please cite the original PaddleOCR paper:

@misc{paddleocr2020,
    title={PaddleOCR: Awesome multilingual OCR toolkits},
    author={PaddlePaddle Authors},
    year={2020},
    howpublished={\url{https://github.com/PaddlePaddle/PaddleOCR}}
}

❓ Issues & Support

For issues related to the ONNX conversion or deployment of these files, open a discussion in this repository. For questions about the original models, training, or PaddleOCR itself, please refer to the upstream PaddleOCR project.

Note: This is a community contribution for easier deployment of PaddleOCR models. For production use, please ensure compliance with your specific requirements and test thoroughly.
