Alibaba Tongyi Qianwen QwQ-32B Reasoning Model Tops Global Open-Source Community Rankings
Hugging Face, the world’s largest AI open-source community, has updated its foundation model leaderboard, with Alibaba’s newly released and open-sourced Tongyi Qianwen reasoning model, QwQ-32B, claiming the top spot.

Tongyi Qianwen QwQ-32B, a foundation model with 32 billion parameters, delivers a qualitative leap in mathematics, coding, and general capabilities.

It achieves overall performance comparable to DeepSeek-R1 with fewer parameters, and breaks new ground by enabling a high-performance reasoning model to run locally on consumer-grade graphics cards, sharply reducing deployment costs.
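
For readers who want a sense of what local deployment looks like, below is a minimal sketch using the Hugging Face Transformers library. The repository name "Qwen/QwQ-32B", the loading options, and the sample prompt are illustrative assumptions, not official guidance from Alibaba or Hugging Face; consult the model card for recommended settings.

# Minimal sketch: loading QwQ-32B from Hugging Face and generating a reply.
# Assumes the repo ID "Qwen/QwQ-32B"; fitting the model on a single consumer
# GPU typically also requires quantized weights, which are not shown here.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/QwQ-32B"  # assumed Hugging Face repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let Transformers pick a supported precision
    device_map="auto",    # spread weights across available devices
)

messages = [{"role": "user", "content": "How many prime numbers are below 30?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

In practice, running a 32-billion-parameter model on a single consumer graphics card generally relies on quantized weights (for example 4-bit builds), trading some precision for a much smaller memory footprint.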

Since 2023, BABA-W (09988.HK) (BABA.US) has open-sourced more than 200 models worldwide. To date, the Qwen family of foundation models has spawned more than 100,000 derivative models across AI open-source communities in China and overseas, making it the world’s largest family of open-source models.
AAStocks Financial News