Mixtral 8x7B

Basic Information

Organization
Mistral AI
Parameters
46.7B total (12.9B active per token)
License
Apache 2.0
Last Update
2023-12-15

Detailed Information

Description

Sparse mixture-of-experts model that matches or outperforms Llama 2 70B and GPT-3.5 on most standard benchmarks while activating only about 13B parameters per token

Features

  • Sparse Mixture of Experts architecture (see the routing sketch after this list)
  • Efficient compute: only 2 of the 8 experts run per token, so roughly 13B of the 46.7B parameters are active per forward pass
  • Multilingual: English, French, Italian, German, and Spanish
  • Strong code generation
  • Versatility across tasks
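
To make the first feature concrete, below is a minimal sketch of top-2 expert routing, the mechanism at the heart of a sparse Mixture-of-Experts layer. The 8-expert count and top-2 selection match Mixtral's published design; the numpy implementation, toy hidden size, and single-matrix experts are illustrative assumptions, not the model's actual SwiGLU expert blocks.

import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Mixtral uses 8 experts per MoE layer
TOP_K = 2         # only 2 experts run for each token
HIDDEN = 16       # toy hidden size; the real model is far wider

# Toy experts: one weight matrix each (the real experts are SwiGLU FFNs).
experts = [rng.normal(size=(HIDDEN, HIDDEN)) for _ in range(NUM_EXPERTS)]
router = rng.normal(size=(HIDDEN, NUM_EXPERTS))   # gating network

def moe_layer(x):
    """Send each token through its top-2 experts, mixing by gate weight."""
    logits = x @ router                               # (tokens, NUM_EXPERTS)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]     # top-2 expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        gates = logits[t, top[t]]
        gates = np.exp(gates - gates.max())
        gates /= gates.sum()                          # softmax over the 2 gates
        for g, e in zip(gates, top[t]):
            out[t] += g * (x[t] @ experts[e])         # weighted expert output
    return out

tokens = rng.normal(size=(4, HIDDEN))                 # 4 toy token vectors
print(moe_layer(tokens).shape)                        # -> (4, 16)

Because only the two selected experts execute per token, inference cost tracks the roughly 13B active parameters rather than the full 46.7B, which is where the efficiency claim comes from.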

Use Cases

  • Development
  • Research
  • Content creation
  • Data analysis
  • Education

Architecture Details

Type
Sparse Mixture of Experts (8 experts per layer, top-2 routing)
Context Window
32,000 tokens
Training Data Size
Not publicly disclosed by Mistral AI
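
As a usage illustration, here is a hedged quick-start sketch built on the Hugging Face transformers library. The model id is the public Mixtral-8x7B-Instruct-v0.1 checkpoint; the dtype and device settings are assumptions about available hardware, and device_map="auto" additionally requires the accelerate package.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # public Instruct checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 weights still need on the order of 90 GB of GPU memory
    device_map="auto",          # shard layers across available GPUs (needs accelerate)
)

prompt = "Explain mixture-of-experts models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))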