A dense AI model with 32B parameters that excels at coding and math and is small enough for local deployment. Compact, efficient, and powerful ...
Alibaba’s QwQ-32B is a 32-billion-parameter AI designed for mathematical reasoning and coding. Unlike massive models, it ...
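Because the model is released as open weights, local deployment is straightforward in principle. Below is a minimal sketch of what running it locally with the Hugging Face transformers library might look like; the repository name "Qwen/QwQ-32B", the chat-template call, and the prompt are assumptions based on how open Qwen checkpoints are typically published, not an official recipe.

```python
# Minimal sketch of running QwQ-32B locally with Hugging Face transformers.
# Assumes the open-weights checkpoint is published as "Qwen/QwQ-32B" and that
# enough GPU memory is available (or a quantized variant is used instead).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/QwQ-32B"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # spread layers across available GPUs/CPU
)

messages = [{"role": "user", "content": "How many primes are there below 30?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```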
While DeepSeek-R1 operates with 671 billion parameters, QwQ-32B achieves comparable performance with a much smaller footprint ...
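To make the footprint difference concrete, a rough back-of-envelope comparison of raw weight memory (parameters × bytes per parameter) can be sketched as follows. The byte-per-parameter figures are illustrative assumptions for FP16 and 4-bit quantization, and the calculation ignores activations, KV cache, and the fact that DeepSeek-R1 is a mixture-of-experts model with only a fraction of its 671B parameters active per token.

```python
# Back-of-envelope weight-memory comparison (illustrative assumptions only).
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold the weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

for name, params in [("QwQ-32B", 32e9), ("DeepSeek-R1", 671e9)]:
    fp16 = weight_memory_gb(params, 2.0)  # 16-bit weights
    q4 = weight_memory_gb(params, 0.5)    # ~4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GB at FP16, ~{q4:.0f} GB at 4-bit")
```

Under these assumptions, QwQ-32B fits in roughly 64 GB at FP16 (about 16 GB quantized), while storing all of DeepSeek-R1's weights requires well over a terabyte at FP16.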
Alibaba (BABA) is stepping up its efforts in the AI race. The company has launched an upgraded version of its AI assistant app, Quark, now ...
Tech giant Alibaba, which has pledged to invest heavily in artificial intelligence, says its new reasoning model rivals ...
Alibaba is positioned to dominate China's AI market with its groundbreaking, highly efficient QwQ-32B model, surpassing ...
This remarkable outcome underscores the effectiveness of reinforcement learning (RL) when applied to robust foundation models pre-trained on extensive ...
Alibaba released and open-sourced its new reasoning model, QwQ-32B, featuring 32 billion parameters. Despite being ...
Alibaba developed QwQ-32B through two training stages. The first stage focused on teaching the model math and coding ...
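The reported math-and-coding stage relies on outcome-based feedback: answers are checked directly rather than scored by a learned reward model. The sketch below illustrates what such verifier-style rewards could look like; the function names, the exact checks, and the binary reward scheme are assumptions for illustration, not Alibaba's published training code.

```python
# Hypothetical sketch of outcome-based rewards for an RL stage on math and
# coding tasks: reward 1.0 only if the final answer matches (math) or the
# generated program passes its test script (coding).
import subprocess


def math_reward(model_answer: str, reference_answer: str) -> float:
    """Exact-match check on the model's final answer."""
    return 1.0 if model_answer.strip() == reference_answer.strip() else 0.0


def code_reward(program: str, test_script: str, timeout_s: int = 10) -> float:
    """Run the candidate program plus its tests in a subprocess."""
    try:
        result = subprocess.run(
            ["python", "-c", program + "\n" + test_script],
            capture_output=True,
            timeout=timeout_s,
        )
        return 1.0 if result.returncode == 0 else 0.0
    except subprocess.TimeoutExpired:
        return 0.0
```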
The launch is its latest effort to gain an edge amid growing competition on the AI application front, further intensified ...
QwQ-32B, an AI model rivaling OpenAI and DeepSeek with 98% lower compute costs. A game-changer in AI efficiency, boosting Alibaba’s ...