Mixtral 8x7B Instruct

Basic Information

Organization: Mistral AI
Parameters: 46.7B total (~12.9B active per token)
License: Apache 2.0
Last Update: 2024-01-08

Description

Mistral AI's instruction-tuned sparse Mixture of Experts (SMoE) model, fine-tuned from the Mixtral 8x7B base model for chat and instruction following

Features

  • Instruction-tuned sparse MoE (8 experts, 2 active per token)
  • Improved chat abilities over the Mixtral 8x7B base model
  • Strong multilingual support (English, French, German, Spanish, Italian)
  • Efficient compute usage: only ~13B of ~47B parameters active per token
  • Open-source deployment under Apache 2.0 (see the loading sketch below)
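
Because the weights are Apache 2.0 licensed, the model can be run locally. Below is a minimal loading-and-generation sketch; it assumes the Hugging Face transformers library and the public mistralai/Mixtral-8x7B-Instruct-v0.1 checkpoint, neither of which is stated in this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Half precision keeps the ~47B total parameters within reach of a multi-GPU
# node; device_map="auto" shards layers across the available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain Mixture of Experts in two sentences."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```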

Use Cases

  • Chatbots (see the prompt-format sketch after this list)
  • Virtual assistants
  • Content generation
  • Natural-language processing tasks such as summarization and translation
  • Educational tools
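
For chat use, Mixtral Instruct expects Mistral's [INST] instruction template. A minimal formatter sketch follows; the helper name build_prompt is ours, and in practice the template is best produced by the tokenizer's apply_chat_template rather than built by hand:

```python
def build_prompt(turns: list[tuple[str, str | None]]) -> str:
    """Format (user, assistant) turns into Mixtral Instruct's template.

    The instruct model is tuned on prompts shaped like:
    <s>[INST] user [/INST] assistant</s>[INST] user [/INST]
    """
    prompt = "<s>"
    for user_msg, assistant_msg in turns:
        prompt += f"[INST] {user_msg} [/INST]"
        if assistant_msg is not None:
            # Close each completed assistant turn with the end-of-sequence token.
            prompt += f" {assistant_msg}</s>"
    return prompt

print(build_prompt([("What is MoE?", "A sparse layer of expert MLPs."),
                    ("How many experts run per token?", None)]))
```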

Architecture Details

Type: Sparse Mixture of Experts (8 experts, top-2 routing; see the sketch below)
Context Window: 32,000 tokens
Training Data Size: 5 trillion tokens
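
The top-2 routing is what makes the architecture compute-efficient: each token is processed by only 2 of the 8 expert MLPs, so active parameters (~13B) are far fewer than total parameters (~47B). A minimal sketch of such a layer, with illustrative sizes rather than the real model dimensions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    """Sparse MoE layer: a router picks 2 of num_experts MLPs per token."""

    def __init__(self, dim: int, num_experts: int = 8, hidden: int = 2048):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Score every expert, keep the top 2 per token.
        logits = self.gate(x)                           # (tokens, num_experts)
        weights, idx = torch.topk(logits, k=2, dim=-1)  # top-2 per token
        weights = F.softmax(weights, dim=-1)            # renormalize over the 2
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            for slot in range(2):
                mask = idx[:, slot] == e  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Only the selected experts run for each token; the other MLPs are skipped.
tokens = torch.randn(4, 512)
print(Top2MoE(dim=512)(tokens).shape)  # torch.Size([4, 512])
```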