Blog | Dataset | Model | Demo | Eval Logs

🤗 HuggingFace Blog | Slack | WeChat

OpenResearcher-30B-A3B Overview

OpenResearcher-30B-A3B is an agentic large language model designed for long-horizon deep research. It is fine-tuned from NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16 on the 96K-trajectory OpenResearcher dataset, whose trajectories span 100+ turns. The dataset is derived by distilling GPT-OSS-120B with native browser tools. More information can be found on the OpenResearcher dataset card.
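For intuition, a single training trajectory might look like the record sketched below. The schema is purely illustrative (the field names are assumptions, not the actual dataset format); see the dataset card for the real structure.

```python
# Purely illustrative trajectory record; field names are assumptions,
# not the actual OpenResearcher dataset schema (see the dataset card).
example_trajectory = {
    "question": "Which paper first proposed ...?",
    "turns": [
        {"role": "assistant", "tool_call": {"name": "search", "query": "..."}},
        {"role": "tool", "content": "top-10 search results ..."},
        {"role": "assistant", "tool_call": {"name": "open", "url": "..."}},
        # ... 100+ interleaved reasoning / tool-use turns ...
        {"role": "assistant", "content": "Final answer: ..."},
    ],
}
```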

The model achieves 54.8% accuracy on BrowseComp-Plus, surpassing GPT-4.1, Claude-Opus-4, Gemini-2.5-Pro, DeepSeek-R1, and Tongyi-DeepResearch.

Figure: OpenResearcher teaser.

Deep Research Benchmark Results

Figure: Deep research benchmark results.

Evaluate OpenResearcher-30B-A3B

We evaluate OpenResearcher-30B-A3B across a range of deep research benchmarks, including BrowseComp-Plus, BrowseComp, GAIA, and xbench-DeepSearch. Please find more details on GitHub.
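For illustration only, here is a minimal sketch of how accuracy-style results might be tallied from saved model outputs. This is not the official evaluation harness; the file name and JSONL schema (`answer` / `prediction` fields) are assumptions, and the actual scripts live in the GitHub repository.

```python
# Hypothetical scoring sketch, not the official OpenResearcher eval harness.
# Assumes a JSONL file of {"answer": ..., "prediction": ...} records; the
# real answer-matching logic is in the GitHub repository.
import json

def accuracy(path: str) -> float:
    """Exact-match accuracy over a JSONL results file."""
    with open(path) as f:
        records = [json.loads(line) for line in f]
    correct = sum(
        r["prediction"].strip().lower() == r["answer"].strip().lower()
        for r in records
    )
    return correct / len(records)

# Example: score saved BrowseComp-Plus outputs (file name is hypothetical).
print(f"BrowseComp-Plus accuracy: {accuracy('browsecomp_plus_results.jsonl'):.1%}")
```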

Quick Start

We provide a quick-start guide on GitHub that demonstrates how to use OpenResearcher-30B-A3B for deep research; a minimal loading example is sketched below.
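The sketch below loads the model with Hugging Face transformers and runs a single-turn generation. It assumes the checkpoint works with the standard `AutoModelForCausalLM` chat interface; the full multi-turn agentic loop with browser tools is covered in the GitHub quick-start.

```python
# Minimal single-turn sketch, assuming the standard transformers chat
# interface; the full tool-using research loop is in the GitHub quick-start.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenResearcher/OpenResearcher-30B-A3B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists BF16 tensors
    device_map="auto",
)

messages = [{"role": "user", "content": "What is long-horizon deep research?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For research sessions spanning 100+ turns, serving the model behind an inference engine such as vLLM is typically more practical than raw `generate` calls.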

Citation

@misc{li2025openresearcher,
  title={OpenResearcher: A Fully Open Pipeline for Long-Horizon Deep Research Trajectory Synthesis},
  author={Zhuofeng Li and Dongfu Jiang and Xueguang Ma and Haoxiang Zhang and Ping Nie and Yuyu Zhang and Kai Zou and Jianwen Xie and Yu Zhang and Wenhu Chen},
  year={2025},
  howpublished={\url{https://www.notion.so/OpenResearcher-A-Fully-Open-Pipeline-for-Long-Horizon-Deep-Research-Trajectory-Synthesis-2f7e290627b5800cb3a0cd7e8d6ec0ea}},
  note={Notion Blog}
}