
MiniMax-Text-01

Model parameter quantity: 456B
Affiliated organization: MiniMax
License Type: Open Source
Release time: January 15, 2025
Model Introduction
MiniMax-Text-01 is a 456-billion-parameter model combining Lightning Attention, Softmax Attention, and Mixture-of-Experts (MoE). Using advanced parallel strategies, it achieves a training context window of 1 million tokens and can handle up to 4 million tokens during inference, showcasing top-tier long-context performance.
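As a practical illustration, here is a minimal Python sketch of loading the open-source weights through the Hugging Face transformers library and running a short generation. The repository id MiniMaxAI/MiniMax-Text-01, the trust_remote_code flag, and the generation settings are assumptions made for illustration, and a model of this size requires multi-GPU sharding in practice; consult the official model card before running.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo id; verify against the official model card.
REPO_ID = "MiniMaxAI/MiniMax-Text-01"

tokenizer = AutoTokenizer.from_pretrained(REPO_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    REPO_ID,
    trust_remote_code=True,  # custom hybrid-attention / MoE architecture
    torch_dtype="auto",      # keep the checkpoint's native precision
    device_map="auto",       # shard across available GPUs; 456B needs many
)

prompt = "Summarize the trade-offs between linear and softmax attention."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))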
Language comprehension ability (score 6.8): Often makes semantic misjudgments, leading to obvious logical disconnects in responses.
Knowledge coverage scope (score 8.5): Possesses core knowledge of mainstream disciplines, but has limited coverage of cutting-edge interdisciplinary fields.
Reasoning ability (score 5.8): Unable to maintain coherent reasoning chains, often causing inverted causality or miscalculations.
Related models
MiniMax-Text-01 MiniMax-Text-01 is a powerful language model with 456 billion total parameters, of which 45.9 billion are activated per token. To better unlock the model's long-context capabilities, MiniMax-Text-01 adopts a hybrid architecture that combines Lightning Attention, Softmax Attention, and Mixture-of-Experts (MoE); a minimal sketch of this total-versus-activated parameter split follows this list.
MiniMax-M1-80k The world's first open-weight, large-scale hybrid-attention reasoning model, released by MiniMax.
abab6.5 abab6.5 is an API model produced by MiniMax. The abab6.5 series is a trillion-parameter Mixture-of-Experts (MoE) large language model. abab6.5 is suited to complex scenarios, such as word-problem calculations and scientific computation, while abab6.5s is suited to general scenarios.
abab6.5s-chat A chat API model in the abab6.5 series, MiniMax's trillion-parameter Mixture-of-Experts (MoE) large language model; abab6.5s is suited to general scenarios.
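To make the total-versus-activated parameter split described above concrete, here is a small, self-contained top-k mixture-of-experts routing sketch in PyTorch. The hidden sizes, expert count, and top-k value are illustrative assumptions, not MiniMax's actual configuration; the point is only that every expert contributes to the total parameter count while each token runs through just k of them.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    # Toy top-k MoE feed-forward layer. All experts contribute to the
    # total parameter count, but each token is routed through only k
    # of them, which is the "activated" share.
    def __init__(self, d_model=64, d_ff=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        gate = F.softmax(self.router(x), dim=-1)         # routing probabilities
        top_w, top_i = gate.topk(self.k, dim=-1)         # choose k experts per token
        top_w = top_w / top_w.sum(dim=-1, keepdim=True)  # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = top_i[:, slot] == e               # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += top_w[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

moe = TinyMoE()
per_expert = sum(p.numel() for p in moe.experts[0].parameters())
router_p = sum(p.numel() for p in moe.router.parameters())
print("total params:", router_p + 8 * per_expert)         # every expert counts
print("activated per token:", router_p + 2 * per_expert)  # only k=2 experts run
print(moe(torch.randn(5, 64)).shape)                      # torch.Size([5, 64])

At MiniMax-Text-01's scale the same idea yields 456 billion total parameters with roughly 45.9 billion activated per token, about a tenth of the total.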
Relevant documents
Google Introduces AI-Powered Tools for Gmail, Docs, and Vids At its annual I/O 2025 developer conference, Google introduced transformative AI enhancements coming to its Workspace suite, fundamentally changing how users interact with Gmail, Docs, and Vids. …
AWS Launches Bedrock AgentCore: Open-Source Platform for Enterprise AI Agent Development Amazon Web Services (AWS) is betting big on AI agents transforming business operations, introducing Amazon Bedrock AgentCore, an enterprise-grade platform …
Akaluli AI Voice Recorder Enhances Productivity & Focus Efficiently In our hyper-connected work environments, maintaining focus during crucial conversations has become increasingly challenging. The Akaluli AI Voice Recorder presents an innovative solution to this modern dilemma by seamlessly capturing, transcribing, …
Spotify increases Premium subscription costs in markets outside the US Spotify is implementing subscription price hikes across multiple international markets just days after reporting underwhelming financial performance. The streaming giant confirmed Monday that Premium users throughout Europe, South Asia, the Middle East …
Cairn RPG: Easy-to-Learn Tabletop System for New Players Want an exciting gateway into tabletop RPGs that won't overwhelm newcomers? Picture organizing an entire adventure with ten complete beginners in just fifteen minutes, starting from character creation to diving into gameplay with a fresh system. …