Model Introduction
Ministral-8B-Instruct-2410 is an instruct fine-tuned language model that significantly outperforms existing models of similar size, released under the Mistral Research License.
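For reference, the sketch below shows one way to query the model as a chat assistant through Hugging Face Transformers. It is a minimal sketch, assuming the checkpoint is published under the ID mistralai/Ministral-8B-Instruct-2410 and that the installed transformers release supports this architecture; the prompt text, dtype, and device settings are illustrative and should be adapted to your environment.

```python
# Minimal sketch: chatting with Ministral-8B-Instruct-2410 via Hugging Face Transformers.
# Assumption: the checkpoint ID "mistralai/Ministral-8B-Instruct-2410" is available to you
# (it is gated under the Mistral Research License) and your transformers version supports it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Ministral-8B-Instruct-2410"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision so the 8B model fits on a single GPU
    device_map="auto",
)

# Instruct-tuned models expect chat-formatted prompts, so apply the model's chat template.
messages = [
    {"role": "user", "content": "Summarize the Mistral Research License in one sentence."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Greedy decoding; enable sampling or adjust max_new_tokens as needed.
output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```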
Language comprehension ability (score: 4.3): Often makes semantic misjudgments, leading to obvious logical disconnects in its responses.
Knowledge coverage scope (score: 4.7): Has significant knowledge blind spots, often producing factual errors and repeating outdated information.
Reasoning ability (score: 4.0): Unable to maintain coherent reasoning chains, often inverting causality or making calculation errors.