MistralAI Mixtral 8x22B
Mistral's instruct fine-tuned Mixtral 8x22B is a multilingual large language model optimized for coding, reasoning, and mathematical tasks. A sparse mixture-of-experts model, it offers a 64k-token context window and activates only 39B of its 141B total parameters per token, keeping deployment cost-effective. It is aimed at developers, startups, and enterprises seeking advanced NLP capabilities.
Provider: MistralAI · License: Proprietary · No API
ELO: 1227
Context: 65.5K
LLM Specifications
Context Length: 65.5K tokens
Pricing
Input Cost: $0.90 / 1M tokens
Output Cost: $0.90 / 1M tokens
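For reference, here is a minimal Python sketch of how these rates translate into per-request cost. The token counts in the example are hypothetical, not taken from this listing.

```python
# Listed rates for Mixtral 8x22B: $0.90 per 1M tokens, input and output alike.
RATE_PER_TOKEN = 0.90 / 1_000_000  # USD per token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed rates."""
    return (input_tokens + output_tokens) * RATE_PER_TOKEN

# Hypothetical example: a 10,000-token prompt with a 2,000-token completion.
print(f"${estimate_cost(10_000, 2_000):.4f}")  # -> $0.0108
```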
Performance Metrics
LMArena ELO: 1,227
Supported Formats
Text