# YOLOv11l: UI Elements Detection
This model is a fine-tuned version of [Ultralytics/YOLO11](https://huggingface.co/Ultralytics/YOLO11), trained to detect UI elements in macOS application screenshots. It is part of the Screen2AX project, a research effort focused on generating accessibility metadata using computer vision.
## Task Overview
- Task: Object Detection
- Target: Individual UI elements
- Supported Labels: `AXButton`, `AXDisclosureTriangle`, `AXImage`, `AXLink`, `AXTextArea`
This model detects common interactive components typically surfaced in accessibility trees on macOS.
## Dataset
- Training data: [MacPaw/Screen2AX-Element](https://huggingface.co/datasets/MacPaw/Screen2AX-Element)
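For a quick look at the training data, the dataset can be pulled from the Hub with the standard `datasets` library (a minimal sketch; the split names and record layout are assumptions, so check the dataset card for the exact schema):

```python
from datasets import load_dataset

# Download Screen2AX-Element from the Hugging Face Hub.
# NOTE: split names and field layout are assumptions; consult the
# dataset card for the actual schema.
ds = load_dataset("MacPaw/Screen2AX-Element")
print(ds)  # shows available splits and features
```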
## How to Use
### Install Dependencies
```bash
pip install huggingface_hub ultralytics
```
### Load the Model and Run Predictions
```python
from huggingface_hub import hf_hub_download
from ultralytics import YOLO

# Download the model weights from the Hugging Face Hub
model_path = hf_hub_download(
    repo_id="MacPaw/yolov11l-ui-elements-detection",
    filename="ui-elements-detection.pt",
)

# Load the model and run prediction on a screenshot
model = YOLO(model_path)
results = model.predict("/path/to/your/image")

# Display the annotated result
results[0].show()
```
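To consume the detections programmatically rather than visually, you can iterate over the returned `Results` object; `boxes`, `cls`, `conf`, `xyxy`, and `names` below are the standard Ultralytics interface:

```python
# Extract label, confidence, and pixel coordinates for each detection.
for box in results[0].boxes:
    label = model.names[int(box.cls)]       # e.g. "AXButton"
    score = float(box.conf)                 # confidence in [0, 1]
    x1, y1, x2, y2 = box.xyxy[0].tolist()   # top-left / bottom-right corners
    print(f"{label} ({score:.2f}): ({x1:.0f}, {y1:.0f}) -> ({x2:.0f}, {y2:.0f})")
```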
## License
This model is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0), as inherited from the original YOLOv11 base model.
## Related Projects

- [Screen2AX](https://arxiv.org/abs/2507.16704): vision-based approach for automatic macOS accessibility generation
## Citation
If you use this model in your research, please cite the Screen2AX paper:
```bibtex
@misc{muryn2025screen2axvisionbasedapproachautomatic,
      title={Screen2AX: Vision-Based Approach for Automatic macOS Accessibility Generation},
      author={Viktor Muryn and Marta Sumyk and Mariya Hirna and Sofiya Garkot and Maksym Shamrai},
      year={2025},
      eprint={2507.16704},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2507.16704},
}
```
## MacPaw Research

Learn more at [research.macpaw.com](https://research.macpaw.com).
## Evaluation Results

All metrics are self-reported.

| Metric       | Value |
|--------------|-------|
| Accuracy@0.5 | 0.654 |
| Precision    | 0.491 |
| Recall       | 0.434 |
| F1           | 0.438 |
| mAP@0.5      | 0.466 |
| mAP@0.5:0.95 | 0.313 |
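To reproduce numbers like these yourself, Ultralytics' built-in validation is the usual route (a sketch; the `data.yaml` path is a placeholder and assumes you have exported the dataset to YOLO format):

```python
from ultralytics import YOLO

# Validate the fine-tuned checkpoint against a local YOLO-format dataset.
# NOTE: "screen2ax-element/data.yaml" is a hypothetical path; this assumes
# the dataset has been exported to the YOLO layout with a data.yaml file.
model = YOLO("ui-elements-detection.pt")
metrics = model.val(data="screen2ax-element/data.yaml")

print(metrics.box.map50)  # mAP@0.5
print(metrics.box.map)    # mAP@0.5:0.95
```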