Qwen-Max-0428
Model parameter quantity: N/A
Affiliated organization: Alibaba
License type: Closed source
Release time: April 28, 2024
Model Introduction
Qwen-Max is an API-only model produced by Alibaba. This page describes the 0428 version, released on April 28, 2024.
Language comprehension ability: 6.2
Often makes semantic misjudgments, leading to obvious logical disconnects in responses.
Knowledge coverage scope: 7.6
Possesses core knowledge of mainstream disciplines, but has limited coverage of cutting-edge interdisciplinary fields.
Reasoning ability: 5.2
Unable to maintain coherent reasoning chains, often causing inverted causality or miscalculations.
Related models
Qwen2.5-7B-Instruct Like Qwen2, the Qwen2.5 language models support up to 128K tokens and can generate up to 8K tokens. They also maintain multilingual support for over 29 languages, including Chinese, English, French, Spanish, Portuguese, German, Italian, Russian, Japanese, Korean, Vietnamese, Thai, Arabic, and more.
Qwen3-32B (Thinking) Qwen3 is the latest generation of large language models in the Qwen series, offering a comprehensive suite of dense and mixture-of-experts (MoE) models.
Qwen1.5-72B-Chat Qwen1.5 is the beta version of Qwen2, maintaining its architecture as a decoder-only transformer model with SwiGLU activation, RoPE, and multi-head attention mechanisms. It offers nine model sizes and has enhanced multilingual and chat model capabilities, supporting a context length of 32,768 tokens. All models have enabled system prompts for roleplaying, and the code supports native implementation in transformers.
Qwen1.5-7B-Chat Qwen1.5 is the beta version of Qwen2, maintaining its architecture as a decoder-only transformer model with SwiGLU activation, RoPE, and multi-head attention mechanisms. It offers nine model sizes and has enhanced multilingual and chat model capabilities, supporting a context length of 32,768 tokens. All models have enabled system prompts for roleplaying, and the code supports native implementation in transformers.
Qwen1.5-14B-Chat Qwen1.5 is the beta version of Qwen2, maintaining its architecture as a decoder-only transformer model with SwiGLU activation, RoPE, and multi-head attention mechanisms. It offers nine model sizes and has enhanced multilingual and chat model capabilities, supporting a context length of 32,768 tokens. All models have enabled system prompts for roleplaying, and the code supports native implementation in transformers.
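The Qwen1.5 chat models above support system prompts and run natively in the Hugging Face transformers library. As a minimal sketch of what that chat interface looks like at the prompt level, the function below builds a prompt in the ChatML-style format these chat models commonly use; the exact template is an assumption here, and in practice `tokenizer.apply_chat_template` in transformers handles this for you.

```python
# Sketch: assembling a ChatML-style prompt for a Qwen1.5-style chat model.
# Assumption: the <|im_start|>/<|im_end|> delimiters shown here; in real use,
# prefer tokenizer.apply_chat_template from the transformers library.

def build_chatml_prompt(messages):
    """Render a list of {'role', 'content'} dicts into a ChatML-style string,
    ending with the assistant header so the model continues from there."""
    rendered = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    return rendered + "<|im_start|>assistant\n"

# Example: a system prompt (e.g. for roleplaying) followed by a user turn.
prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Qwen1.5 in one sentence."},
])
print(prompt)
```

The returned string would then be tokenized and passed to the model; with `apply_chat_template` the same structure is produced directly from the messages list.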
Related articles
AI-Powered Coloring Pages: Create Stunning Designs with Ease Discover an innovative AI platform that transforms the creation of captivating coloring pages. Perfect for artists, educators, or enthusiasts, this tool offers an intuitive interface and robust features.
Revamp Your Home: AI-Driven Decor with Pinterest & ChatGPT Struggling to redesign your home with countless options? Merge artificial intelligence with Pinterest's visual inspiration to create your ideal space. This guide reveals how to blend Pinterest's imagery with ChatGPT.
AI-Powered Wizard of Oz to Dazzle on Las Vegas Sphere's Massive Screen Sphere Entertainment recently unveiled plans for an immersive rendition of The Wizard of Oz tailored for its distinctive Las Vegas venue, with new insights revealing how Google and Magnopus are leveraging AI for the production.
OpenAI Explores 'Sign in with ChatGPT' for Third-Party Apps OpenAI is investigating options for users to access third-party applications using their ChatGPT credentials, according to a webpage released on Tuesday. The company is actively seeking feedback on the feature.
AI-Powered Legal Practice: Trends and Strategies for Solo and Small Firms Artificial intelligence is revolutionizing the legal industry, offering transformative tools for solo practitioners and small law firms. Staying ahead requires mastering AI technologies and continuous learning.