Model Introduction
The Doubao-1.5-pro model uses a Mixture-of-Experts (MoE) architecture and, through an integrated training-inference design, seeks the best balance between model capability and inference performance. The specific version used here is doubao-1.5-pro-32k 250115.
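The source does not describe Doubao-1.5-pro's internals, but the general idea behind a sparse MoE layer can be sketched generically: a gating network scores the experts, only the top-k experts are run for a given input, and their outputs are combined with renormalized gate weights. The linear gate, the toy experts, and `top_k=2` below are illustrative assumptions, not details of Doubao's actual design.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k experts by gate score and combine
    their outputs, weighted by renormalized gate probabilities.
    (Illustrative sketch; not Doubao's actual routing.)"""
    # Gate logits: one score per expert from a simple linear gate.
    logits = [sum(w_i * x_i for w_i, x_i in zip(w, x)) for w in gate_weights]
    probs = softmax(logits)
    # Keep only the top_k experts; the rest are skipped entirely,
    # which is what makes MoE inference sparse and cheap.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in top)
    out = [0.0] * len(x)
    for i in top:
        y = experts[i](x)
        out = [o + (probs[i] / norm) * y_j for o, y_j in zip(out, y)]
    return out, top

# Toy experts: each just scales the input by a different factor.
experts = [lambda v, s=s: [s * v_i for v_i in v] for s in (1.0, 2.0, 3.0, 4.0)]
gate_weights = [[0.1, 0.0], [0.9, 0.0], [0.0, 0.2], [0.0, 0.8]]
y, chosen = moe_forward([1.0, 1.0], experts, gate_weights, top_k=2)
```

For the input `[1.0, 1.0]`, the gate logits are `[0.1, 0.9, 0.2, 0.8]`, so only experts 1 and 3 run; the other two contribute no compute at all, which is the efficiency argument for MoE at large parameter counts.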
Language comprehension ability (5.6): Often makes semantic misjudgments, leading to obvious logical disconnects in responses.

Knowledge coverage scope (8.9): Possesses core knowledge of mainstream disciplines, but has limited coverage of cutting-edge interdisciplinary fields.

Reasoning ability (8.1): Can perform logical reasoning of more than three steps, though efficiency drops when handling nonlinear relationships.