---
license: apache-2.0
emoji: π
pinned: true
---
## Hi there 👋

Welcome to the official Hugging Face organisation homepage for inclusionAI.
This organisation hosts a series of open-source projects from Ant Group dedicated to working towards Artificial General Intelligence (AGI).
Here you can find Large Language Models (LLMs), Reinforcement Learning (RL) systems, frameworks for model training and inference, and other AGI-related tools and applications.
### Get Involved
Our work is guided by the principles of fairness, transparency, and collaboration, and we are dedicated to creating models that reflect the diversity of the world we live in.
Whether you're a researcher, developer, or simply someone passionate about AI, we invite you to join us in our mission to create AI that benefits everyone.
- **Explore Our Models**: Check out our latest models and datasets on the inclusionAI Hub.
- **Contribute**: Interested in contributing? Visit our [GitHub](https://github.com/inclusionAI) repository to get started.
- **Join the Conversation**: Connect with us on [Twitter](https://x.com/ant_oss) and [Discord](https://discord.gg/2X4zBSz9c6) to stay updated on our latest projects and initiatives.
## Our Models
- [**Ling**](https://huggingface.co/collections/inclusionAI/ling-67c51c85b34a7ea0aba94c32): Ling is an MoE LLM provided and open-sourced by InclusionAI (a minimal loading sketch follows this list).
- [**Ming**](https://huggingface.co/collections/inclusionAI/ming-680afbb62c4b584e1d7848e8): Ming-Omni is a unified multimodal model capable of processing images, text, audio, and video, while demonstrating strong proficiency in both speech and image generation.
- [**Ring**](https://huggingface.co/collections/inclusionAI/ring-67e7e41bba868546ac32b260): Ring is a reasoning MoE LLM provided and open-sourced by InclusionAI, derived from Ling.
- [**GroveMoE**](https://huggingface.co/collections/inclusionAI/grovemoe-68a2b58acbb55827244ef664): GroveMoE is an open-source family of LLMs developed by the AGI Center, Ant Research Institute.
- ...
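For quick orientation, here is a minimal, illustrative sketch of loading one of these checkpoints with the 🤗 Transformers library. The model ID `inclusionAI/Ling-lite` is an assumed example; substitute any checkpoint from the collections above, and note that some repositories may require `trust_remote_code=True`.

```python
# Illustrative sketch (not an official recipe): load an assumed inclusionAI
# checkpoint from the Hugging Face Hub and run a short chat-style generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inclusionAI/Ling-lite"  # assumed example; pick any checkpoint from the org

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",       # use the dtype stored in the checkpoint
    device_map="auto",        # place weights on available GPU(s)/CPU automatically
    trust_remote_code=True,   # some repos ship custom modeling code
)

messages = [{"role": "user", "content": "Introduce the Ling model family in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
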
## What's New
- [2025/8/29] 💻 [inclusionAI/Qwen3-32B-AWorld](https://huggingface.co/inclusionAI/Qwen3-32B-AWorld)
- [2025/8/18] 💻 [inclusionAI/GroveMoE-Base](https://huggingface.co/inclusionAI/GroveMoE-Base)
- [2025/8/18] 💻 [inclusionAI/GroveMoE-Inst](https://huggingface.co/inclusionAI/GroveMoE-Inst)
- [2025/8/18] 💻 [inclusionAI/Rubicon-Preview](https://huggingface.co/inclusionAI/Rubicon-Preview)
- [2025/8/18] 🖼️ [inclusionAI/UI-Venus-Navi-72B](https://huggingface.co/inclusionAI/UI-Venus-Navi-72B)
- [2025/8/16] 🖼️ [inclusionAI/UI-Venus-Ground-7B](https://huggingface.co/inclusionAI/UI-Venus-Ground-7B)
- [2025/8/16] 🖼️ [inclusionAI/UI-Venus-Navi-7B](https://huggingface.co/inclusionAI/UI-Venus-Navi-7B)
- [2025/8/16] 🖼️ [inclusionAI/UI-Venus-Ground-72B](https://huggingface.co/inclusionAI/UI-Venus-Ground-72B)
- [2025/8/16] 🖼️ [inclusionAI/GUI-G2-7B](https://huggingface.co/inclusionAI/GUI-G2-7B)
- [2025/8/15] 🖼️ [inclusionAI/GUI-G2-3B](https://huggingface.co/inclusionAI/GUI-G2-3B)