Mixtral 8x7B Instruct v0.1

A sparse Mixture-of-Experts model built from 7B experts, with stronger capabilities than Mistral 7B. It uses 12B active parameters out of 45B total and supports multiple languages, code, and a 32k context window.
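
To make the "active vs. total parameters" distinction concrete, below is a minimal PyTorch sketch of Mixtral-style top-2 expert routing: a router scores all experts per token, only the two highest-scoring experts actually run, and their outputs are blended. It assumes 8 experts with 2 active per token (Mixtral's published configuration); the names `gate`, `experts`, and `moe_forward` are illustrative and do not correspond to the real implementation.

```python
import torch
import torch.nn.functional as F

# Toy dimensions for illustration only; the real model is far larger.
num_experts, top_k, hidden = 8, 2, 16

# Each expert stands in for a feed-forward block; the gate is the router.
experts = [torch.nn.Linear(hidden, hidden) for _ in range(num_experts)]
gate = torch.nn.Linear(hidden, num_experts, bias=False)

def moe_forward(x):
    # x: (tokens, hidden). The router scores every expert for every token.
    logits = gate(x)                                      # (tokens, num_experts)
    weights, chosen = torch.topk(logits, top_k, dim=-1)   # keep the top-2 experts
    weights = F.softmax(weights, dim=-1)                  # renormalize over those two
    out = torch.zeros_like(x)
    for slot in range(top_k):
        for e in range(num_experts):
            mask = chosen[:, slot] == e
            if mask.any():
                # Only the selected experts run, so each token touches
                # just a fraction of the model's total parameters.
                out[mask] += weights[mask, slot, None] * experts[e](x[mask])
    return out

print(moe_forward(torch.randn(4, hidden)).shape)  # torch.Size([4, 16])
```

With 2 of 8 experts active per token, per-token compute scales with the active parameters rather than the full parameter count, which is why the model runs at roughly the cost of a 12B dense model despite holding 45B parameters in total.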
