MiniMax-M1-80k
Parameter count
456B
Organization
MiniMax
License type
Open source
Release date
June 17, 2025
Model Introduction
The world's first open-weight, large-scale hybrid-attention reasoning model, released by MiniMax.
Language comprehension ability
Understands complex contexts and generates logically coherent sentences, though tone control is occasionally imprecise.
7.0
Knowledge coverage
Possesses core knowledge of mainstream disciplines, but has limited coverage of cutting-edge interdisciplinary fields.
8.2
Reasoning ability
Struggles to maintain coherent reasoning chains, often producing inverted causality or calculation errors.
5.2
Related models
MiniMax-Text-01 MiniMax-Text-01 is a powerful language model with 456 billion total parameters, of which 45.9 billion are activated per token. To better unlock the long context capabilities of the model, MiniMax-Text-01 adopts a hybrid architecture that combines Lightning Attention, Softmax Attention and Mixture-of-Experts (MoE).
abab6.5 abab6.5 is an API model produced by MiniMax, with the version number abab6.5. The abab6.5 series is a trillion-parameter Mixture-of-Experts (MoE) large language model. abab6.5 is suited to complex scenarios such as application-problem calculations and scientific computation, while abab6.5s is suited to general scenarios.
abab6.5s-chat abab6.5s-chat is an API model produced by MiniMax in the abab6.5 series, a trillion-parameter Mixture-of-Experts (MoE) large language model family. abab6.5 is suited to complex scenarios such as application-problem calculations and scientific computation, while abab6.5s is suited to general scenarios.
abab7-chat-preview The abab7-preview model, produced by MiniMax, is an API model that shows significant improvements over the abab6.5 series in capabilities such as handling long texts, mathematics, and writing.
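The MiniMax-Text-01 description above mentions that only a fraction of the model's parameters (45.9B of 456B) are activated per token, which is the defining property of a Mixture-of-Experts layer: a router selects a few experts per token and the rest stay idle. The following is a minimal toy sketch of top-k MoE routing; all sizes, names, and the routing details are illustrative assumptions, not MiniMax's actual configuration.

```python
import numpy as np

# Toy sketch of top-k MoE routing (illustrative only; sizes and routing
# details are assumptions, not MiniMax-Text-01's real architecture).
rng = np.random.default_rng(0)

d_model, n_experts, top_k = 16, 8, 2
x = rng.normal(size=d_model)                      # one token's hidden state
router_w = rng.normal(size=(n_experts, d_model))  # router projection
experts = rng.normal(size=(n_experts, d_model, d_model))  # expert weights

logits = router_w @ x                      # router score for each expert
chosen = np.argsort(logits)[-top_k:]       # indices of the top-k experts
weights = np.exp(logits[chosen])
weights /= weights.sum()                   # softmax over the chosen experts only

# Only the chosen experts run for this token; the others contribute nothing.
y = sum(w * (experts[i] @ x) for w, i in zip(weights, chosen))

print(f"activated experts: {sorted(chosen.tolist())}, "
      f"fraction of expert params used: {top_k / n_experts:.2f}")
```

With 2 of 8 experts active, only a quarter of the expert parameters participate in each token's forward pass, which is the same mechanism that lets a 456B-parameter model activate only ~46B parameters per token.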
Related documents
TikTok's Ban Crisis Nears Resolution with Potential New App and Sale. TikTok Sale Nears Completion as New US Version Prepares for Launch. Despite the TikTok divest-or-ban legislation taking effect in January, the platform has maintained US operations with only a brief one-day shutdown. *The Information* now reports that…
Amazon Discontinues Shared Prime Free Shipping Benefits Outside Households. Amazon Ends Prime Sharing Program. Amazon is eliminating the popular feature that allowed Prime members to extend their free shipping benefits to non-household members. According to updated support documentation, this sharing capability will terminate…
HMD Scales Back US Operations, Ending Nokia Phone Revival. HMD Global, the Finnish company that revitalized Nokia-branded mobile devices through a licensing agreement over the past decade, has announced a significant reduction in its US market presence. The company appears to have halted all direct sales of…
Global Startups Must Navigate AI Policy: Key Strategies to Know
Google Leaks Details of Upcoming Android Design Language: Material 3 Expressive. Google Prepares to Unveil Next-Gen Android Design System at I/O. Google is set to introduce a significant evolution of its Android design language at the upcoming Google I/O developer conference, as revealed through a published event schedule and an ac…