MistralAI Mixtral 8x7B
Mixtral 8x7B Instruct is a sparse Mixture-of-Experts model with roughly 47 billion total parameters, fine-tuned for chat and instruction following. It offers efficient, scalable performance for developers and enterprises building conversational AI solutions.
Provider: MistralAI · Open weights (Apache 2.0) · No API
ELO: 1192
Context: 32.8K
LLM Specifications
Context Length: 32.8K
Max Output: 16.4K
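A minimal sketch of validating a request against the limits listed above, assuming 32.8K and 16.4K denote 32,768 and 16,384 tokens respectively (a common convention; the listing does not state exact values). The function name and logic are illustrative, not part of any official SDK.

```python
CONTEXT_LENGTH = 32_768  # 32.8K context window (assumed exact value)
MAX_OUTPUT = 16_384      # 16.4K output cap (assumed exact value)

def fits_context(prompt_tokens: int, requested_output: int) -> bool:
    """Check that a request respects both the output cap and the total window."""
    return (requested_output <= MAX_OUTPUT
            and prompt_tokens + requested_output <= CONTEXT_LENGTH)

print(fits_context(20_000, 10_000))  # True: 30,000 total fits in the window
print(fits_context(30_000, 5_000))   # False: 35,000 total exceeds the window
```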
Pricing
Input Cost: $0.08 / 1M tokens
Output Cost: $0.24 / 1M tokens
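As a quick worked example of the per-token pricing above, the following hypothetical helper estimates the USD cost of a single request ($0.08 per million input tokens, $0.24 per million output tokens):

```python
INPUT_COST_PER_M = 0.08   # USD per 1M input tokens (from the pricing above)
OUTPUT_COST_PER_M = 0.24  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_COST_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_COST_PER_M

# A 10,000-token prompt producing a 2,000-token reply costs about $0.00128.
print(f"${estimate_cost(10_000, 2_000):.5f}")
```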
Performance Metrics
LMArena ELO: 1,192
Supported Formats
Text