Mixtral 8x7B Instruct v0.1
A sparse Mixture-of-Experts model (8 experts of 7B each) with stronger capabilities than Mistral 7B. Uses about 13B active parameters per token out of 47B total. Supports multiple languages and code, with a 32k-token context window.
Try out this model on the Teratalker platform.
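The "active parameters" figure comes from sparse expert routing: a gating network scores all experts per token, but only the top-2 experts actually run, so most of the model's weights sit idle on any given token. A minimal sketch of that routing idea, with toy dimensions and random weights (not Mixtral's real architecture or sizes):

```python
import numpy as np

def top2_route(logits):
    """Pick the 2 highest-scoring experts and softmax their scores."""
    idx = np.argsort(logits)[-2:][::-1]          # indices of the top-2 experts
    w = np.exp(logits[idx] - logits[idx].max())  # numerically stable softmax
    return idx, w / w.sum()

def moe_layer(x, gate_w, experts):
    """Route each token through its top-2 experts and mix their outputs."""
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                  # process one token at a time
        idx, w = top2_route(gate_w @ x[t])       # gate scores all experts
        for i, wi in zip(idx, w):
            out[t] += wi * (experts[i] @ x[t])   # only 2 of N experts execute
    return out

# Toy setup: 8 experts, 8-dim tokens (illustrative sizes only)
rng = np.random.default_rng(0)
d, n_experts, n_tokens = 8, 8, 4
x = rng.standard_normal((n_tokens, d))
gate = rng.standard_normal((n_experts, d))
experts = rng.standard_normal((n_experts, d, d))
y = moe_layer(x, gate, experts)
```

With 8 experts and top-2 routing, roughly a quarter of the expert weights are touched per token, which is why the active-parameter count is far below the total.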
Organizations supported:
OpenAI
Meta
Anthropic
AWS
Cohere
AI21 Labs
Mistral AI
Stability AI
Language models:
Gpt-4
Gpt-4o
Gpt-4 turbo
Gpt-4o mini
Gpt-3.5 turbo
Sdxl-1-0-v1-0
Mistral-7B-instruct-v0-2
Mistral-large-2402
Mixtral-8x7B-instruct-v0-1
Jurassic-2-mid-v1
Jurassic-2-ultra-v1
Command-v14-7
Command-light-v14-7
Command-r-v1
Titan-text-g1-express-v1
Titan-text-g1-lite-v1
Claude-v2-1
Claude-v2
Claude-instant-v1-2
Llama-2-chat-13B-v1
Llama-2-chat-70B-v1
Llama-3-70B-instruct-v1
Llama-3-8B-instruct-v1
Fanatical about AI? We are looking for like-minded people. Please join our Discord:
Teratalker Discord
Privacy Policy
Terms & Conditions
© 2024 Teratalker. All rights reserved.