| MODEL | PROVIDER | PARAMETERS | SCORE | RELEASE DATE |
|---|---|---|---|---|
| DeepSeek-V3 outperforms other open-source models such as Qwen2.5-72B and Llama-3.1-405B in multiple evaluations and matches the performance of top-tier closed-source models like GPT-4 and Claude-3.5-Sonnet. | DeepSeek | 671B | 5.4 | March 24, 2025 |
| o4-mini in high mode, which trades a longer response time for stronger reasoning. | OpenAI | N/A | 6.2 | April 16, 2025 |
| The latest open-source MoE model released by Tencent. | Tencent | 80B | 5.1 | June 27, 2025 |
| The latest model launched by Moonshot AI, built on an MoE architecture, featuring enhanced coding capabilities and superior performance on general agent tasks. | Moonshot | 1000B | 5.5 | July 11, 2025 |
| The Bailing large language model is a general-purpose LLM trained on trillions of tokens that has completed the generative AI filing process; this entry corresponds to version Bailing-Pro-20250225. | Ant Group | N/A | 4.4 | February 25, 2025 |
| Spark X1, the reasoning model released by iFLYTEK; in addition to leading domestic performance on mathematical tasks, its performance on general tasks such as reasoning, text generation, and language understanding is benchmarked against the OpenAI o series and DeepSeek R1. | iFLYTEK | N/A | 5.4 | July 20, 2025 |
| MiniMax-Text-01 is a powerful language model with 456 billion total parameters, of which 45.9 billion are activated per token. To better unlock its long-context capabilities, it adopts a hybrid architecture combining Lightning Attention, Softmax Attention, and Mixture-of-Experts (MoE); a rough sketch of this total-vs-activated parameter split follows the table. | MiniMax | 456B | 3.8 | January 15, 2025 |
| The new GLM-4.5 reasoning model series released by Zhipu AI. | Zhipu AI | 110B | 5.6 | July 29, 2025 |
| The world's first open-weight, large-scale hybrid-attention reasoning model, released by MiniMax. | MiniMax | 456B | 5.5 | June 17, 2025 |
| Qwen3 is the latest generation of large language models in the Qwen series, offering a comprehensive suite of dense and mixture-of-experts (MoE) models. | Alibaba | 235B | 5.7 | July 22, 2025 |
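
As the MiniMax-Text-01 row notes, an MoE model can report far more total parameters than any single token actually uses, because each token is routed to only a few experts plus the always-on shared layers. The minimal sketch below illustrates that arithmetic; the parameter split (shared weights, expert count, experts per token) is assumed purely for illustration and is not the published MiniMax-Text-01 configuration, only chosen so the totals land near the 456B / 45.9B figures quoted above.

```python
# Minimal sketch of MoE "total vs. activated" parameter accounting.
# All numbers below are assumptions for illustration, not MiniMax-Text-01's
# published configuration; they are picked so the totals roughly match the
# 456B total / 45.9B activated figures quoted in the table.

def moe_param_counts(shared_params, n_experts, expert_params, top_k):
    """Return (total, activated-per-token) parameter counts for a routed MoE stack.

    shared_params : weights every token passes through (attention, embeddings, ...)
    n_experts     : routed experts available across the MoE layers
    expert_params : parameters held by one expert (summed over all MoE layers)
    top_k         : experts each token is routed to
    """
    total = shared_params + n_experts * expert_params
    activated = shared_params + top_k * expert_params
    return total, activated

total, activated = moe_param_counts(
    shared_params=18.6e9,   # assumed always-on share (attention + embeddings)
    n_experts=32,           # assumed number of routed experts
    expert_params=13.67e9,  # assumed per-expert size across all MoE layers
    top_k=2,                # assumed experts consulted per token
)
print(f"total ≈ {total / 1e9:.0f}B, activated per token ≈ {activated / 1e9:.1f}B")
# total ≈ 456B, activated per token ≈ 45.9B
```

The takeaway is only that the PARAMETERS column reports total model size, while per-token compute scales with the much smaller activated count.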