Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model from Mistral AI, tuned for chat and instruction use. It incorporates 8 experts (feed-forward networks) per layer, of which 2 are routed to for each token, for a total of 47 billion parameters.
Instruct model fine-tuned by Mistral. #moe
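To make the routing idea concrete, here is a minimal NumPy sketch of top-2 expert routing over 8 feed-forward experts. The dimensions, ReLU experts, and the `moe_layer` helper are illustrative stand-ins under stated assumptions, not Mixtral's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def moe_layer(x, gate_w, experts, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    x:       (tokens, d_model) activations
    gate_w:  (d_model, n_experts) router weights
    experts: list of callables, one feed-forward network per expert
    """
    logits = x @ gate_w                                           # (tokens, n_experts)
    topk_idx = np.argsort(logits, axis=-1)[:, -top_k:]            # top-k expert ids per token
    topk_logits = np.take_along_axis(logits, topk_idx, axis=-1)   # (tokens, k)
    weights = softmax(topk_logits, axis=-1)                       # renormalized gate weights

    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for w, e in zip(weights[t], topk_idx[t]):
            out[t] += w * experts[e](x[t])                        # weighted sum of expert outputs
    return out

# Toy setup: 8 experts, each a tiny ReLU feed-forward net (illustrative only).
rng = np.random.default_rng(0)
d_model, d_ff, n_experts = 16, 32, 8

def make_expert():
    w1 = rng.normal(scale=0.1, size=(d_model, d_ff))
    w2 = rng.normal(scale=0.1, size=(d_ff, d_model))
    return lambda h: np.maximum(h @ w1, 0.0) @ w2

experts = [make_expert() for _ in range(n_experts)]
gate_w = rng.normal(scale=0.1, size=(d_model, n_experts))

tokens = rng.normal(size=(4, d_model))            # 4 example token vectors
print(moe_layer(tokens, gate_w, experts).shape)   # (4, 16)
```

Because only 2 of the 8 experts run per token, the active parameter count per token is much smaller than the 47B total, which is the main efficiency argument for the sparse MoE design.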
Mistral: Mixtral 8x7B Instruct – Recent Activity and Usage Stats
Recent activity on Mixtral 8x7B Instruct (total usage per day on OpenRouter): 26.4M prompt tokens, 2.3M completion tokens.
Prompt tokens measure input size. Reasoning tokens show internal thinking before a response. Completion tokens reflect total output length.
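A short sketch of where these counts come from when calling the model, assuming OpenRouter's OpenAI-compatible chat completions endpoint; the endpoint URL, model slug, and environment variable name are assumptions for illustration, not taken from the page above.

```python
import os
import requests

# Assumed OpenAI-compatible endpoint and model slug (illustrative).
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mixtral-8x7b-instruct",
        "messages": [
            {"role": "user", "content": "Explain mixture-of-experts in one sentence."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
usage = resp.json().get("usage", {})

# prompt_tokens counts the input; completion_tokens counts the generated output.
print("prompt tokens:    ", usage.get("prompt_tokens"))
print("completion tokens:", usage.get("completion_tokens"))
```

Summing these per-request counts across a day is what produces daily usage figures like the ones shown above.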