Model Introduction
Mixtral 8x22B is a sparse Mixture-of-Experts (SMoE) model that activates only 39B of its 141B total parameters per token, giving it strong cost efficiency for its size.
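For intuition, below is a minimal PyTorch sketch of top-2 sparse MoE routing, the mechanism that keeps only a fraction of the total parameters active for any given token. The class name, layer sizes, and looping style are illustrative assumptions for readability, not Mixtral's actual configuration or implementation; Mixtral 8x22B routes each token to 2 of 8 experts per layer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy sparse MoE layer: 8 experts, top-2 routing (illustrative only)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is a small feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (tokens, d_model)
        logits = self.router(x)                 # (tokens, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the top-k experts run for each token; the rest stay idle,
        # which is why active parameters are a fraction of the total.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

layer = SparseMoELayer()
tokens = torch.randn(4, 64)
print(layer(tokens).shape)  # torch.Size([4, 64])
```

Mixtral 8x22B applies the same principle at much larger scale: every layer stores all eight experts' weights (141B parameters in total), but each token passes through only two of them (39B active).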
| Dimension | Assessment | Score |
| --- | --- | --- |
| Language comprehension | Frequently misreads semantics, producing responses with clear logical disconnects. | 4.1 |
| Knowledge coverage | Covers basic encyclopedic knowledge, but lacks depth and is hampered by outdated information. | 6.0 |
| Reasoning ability | Struggles to maintain coherent reasoning chains, often inverting cause and effect or miscalculating. | 3.0 |