
Qwen2.5-72B-Instruct

Model parameter quantity: 72B
Affiliated organization: Alibaba
License type: Open Source
Release time: September 18, 2024
Model Introduction
Like Qwen2, the Qwen2.5 language models support context lengths of up to 128K tokens and can generate up to 8K tokens. They also maintain multilingual support for over 29 languages, including Chinese, English, French, Spanish, Portuguese, German, Italian, Russian, Japanese, Korean, Vietnamese, Thai, Arabic, and more.
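As a hedged illustration of how an instruct model like this is typically called, the sketch below uses Hugging Face transformers with the public repository ID Qwen/Qwen2.5-72B-Instruct; the prompt text and generation parameters are illustrative assumptions, not values taken from this page.

```python
# Minimal sketch: chatting with Qwen2.5-72B-Instruct via Hugging Face transformers.
# Assumes the `transformers` and `accelerate` packages and enough GPU memory for
# a 72B model (or a quantized variant); parameter values are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-72B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen2.5 context-length limits."},
]
# apply_chat_template formats the conversation with Qwen's chat markup.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The page notes generation is capped at 8K tokens; stay well under that here.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```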
Language comprehension ability: 5.9
Often makes semantic misjudgments, leading to obvious logical disconnects in responses.
Knowledge coverage scope: 8.5
Possesses core knowledge of mainstream disciplines, but has limited coverage of cutting-edge interdisciplinary fields.
Reasoning ability: 6.5
Struggles to maintain coherent reasoning chains, often inverting cause and effect or making calculation errors.
Related models
Qwen2.5-7B-Instruct: The 7B variant of this model, sharing the same up-to-128K-token context window, 8K-token generation limit, and multilingual support for over 29 languages.
Qwen3-32B (Thinking): Qwen3 is the latest generation of large language models in the Qwen series, offering a comprehensive suite of dense and mixture-of-experts (MoE) models.
Qwen1.5-72B-Chat, Qwen1.5-7B-Chat, Qwen1.5-14B-Chat: Qwen1.5 is the beta version of Qwen2, maintaining its architecture as a decoder-only transformer with SwiGLU activation, RoPE, and multi-head attention. It comes in nine model sizes, has enhanced multilingual and chat capabilities, and supports a context length of 32,768 tokens. All models have system prompts enabled for roleplaying, and the code runs natively in transformers (a usage sketch follows this list).
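To make the system-prompt claim concrete, this sketch inspects the chat-template string a Qwen1.5 chat model is trained on; the public repository ID Qwen/Qwen1.5-7B-Chat is assumed, and the roleplay persona text is an illustrative assumption, not content from this page.

```python
# Sketch: what Qwen1.5's chat template produces for a roleplay system prompt.
# Only the tokenizer is needed to inspect the formatted prompt.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen1.5-7B-Chat")
messages = [
    {"role": "system", "content": "You are a terse 19th-century ship's navigator."},
    {"role": "user", "content": "What heading should we take at dawn?"},
]
# tokenize=False returns the raw ChatML-style string the model consumes,
# with the system turn placed before the user turn.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```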