Guillaume Lample - AI Leaders and Innovators | Profiles, Key Milestones & Projects - xix.ai
Guillaume Lample

Co-founder, Mistral AI
Year of birth: 1988
Nationality: French

Key milestones

2016: Joined Facebook AI

Worked on language models at Facebook AI

2023: Founded Mistral AI

Co-founded Mistral AI to develop open-source models

2024: Launch of Mistral Large

Led the launch of Mistral Large, a competitive LLM

AI products

Mistral-Large-Instruct-2411 is an advanced dense Large Language Model (LLM) with 123B parameters and state-of-the-art reasoning, knowledge, and coding capabilities, extending Mistral-Large-Instruct-2407 with improved long-context handling, function calling, and system prompt support.

With 22 billion parameters, Mistral Small v24.09 offers customers a convenient mid-point between Mistral NeMo 12B and Mistral Large 2, providing a cost-effective solution that can be deployed across various platforms and environments.

The Ministral-8B-Instruct-2410 Language Model is an instruct fine-tuned model significantly outperforming existing models of similar size, released under the Mistral Research License.

Mistral-Large-Instruct-2407 is an advanced dense Large Language Model (LLM) with 123B parameters and state-of-the-art reasoning, knowledge, and coding capabilities.

Mistral-Large is an API model from Mistral; the series includes three models, Small, Medium, and Large, with Large being the largest.
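
As an illustration of how an API-only model such as Mistral Large is typically queried, here is a minimal sketch using the mistralai Python SDK; the client class, method names, and the model alias "mistral-large-latest" are assumptions based on Mistral's public documentation and may differ between SDK versions.

import os
from mistralai import Mistral

# Assumed client and method names from the mistralai Python SDK (v1-style);
# check the current SDK documentation for the exact interface.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="mistral-large-latest",  # assumed alias for the latest Mistral Large
    messages=[{"role": "user", "content": "Summarize what a dense 123B-parameter LLM is."}],
)
print(response.choices[0].message.content)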

Mistral NeMo offers a large context window of up to 128k tokens. Its reasoning, world knowledge, and coding accuracy are state-of-the-art in its size category. As it relies on standard architecture, Mistral NeMo is easy to use and a drop-in replacement in any system using Mistral 7B.

The Mistral-7B-Instruct-v0.3 Large Language Model (LLM) is an instruct fine-tuned version of the Mistral-7B-v0.3.
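
For the open-weights instruct models listed above, a minimal sketch of local use with the Hugging Face transformers library might look like the following; the repository id "mistralai/Mistral-7B-Instruct-v0.3" and the generation settings are illustrative assumptions rather than an official recipe.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hugging Face repository id assumed from the model name above.
model_id = "mistralai/Mistral-7B-Instruct-v0.3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumes a GPU with bfloat16 support
    device_map="auto",
)

# Instruct models expect chat-formatted input; apply_chat_template builds it.
messages = [{"role": "user", "content": "Explain instruction fine-tuning in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))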

Personal profile

Co-founded Mistral AI, developing efficient open-source models such as Mistral.
