Databricks: DBRX 132B Instruct

Databricks · Text Generation
About DBRX 132B Instruct

DBRX is a new open source large language model developed by Databricks. With 132B total parameters, it outperforms established open source LLMs such as Llama 2 70B and [Mixtral-8x7b](/models/mistralai/mixtral-8x7b) on standard industry benchmarks for language understanding, programming, math, and logic.

It uses a fine-grained mixture-of-experts (MoE) architecture, with 36B parameters active on any given input, and was pre-trained on 12T tokens of text and code data. Compared to other open MoE models like Mixtral-8x7B and Grok-1, DBRX is fine-grained, meaning it uses a larger number of smaller experts: 16 experts with 4 active per token, versus 8 experts with 2 active in Mixtral and Grok-1.
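As a concrete illustration of what "fine-grained" means here, the sketch below implements top-k expert routing in PyTorch using DBRX's reported configuration of 16 experts with 4 active per token. The class name and layer sizes are invented toy values for the example, not DBRX's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FineGrainedMoE(nn.Module):
    """Toy fine-grained MoE layer: many small experts, few active per token.

    The 16-expert / top-4 routing mirrors what Databricks describes for
    DBRX; d_model and d_ff are arbitrary small values for illustration.
    """

    def __init__(self, d_model=512, d_ff=1024, n_experts=16, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                           nn.Linear(d_ff, d_model))
             for _ in range(n_experts)]
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        # Score every expert per token, keep the top-k, and renormalize
        # so the selected experts' weights sum to 1.
        gates = F.softmax(self.router(x), dim=-1)       # (n_tokens, n_experts)
        weights, idx = gates.topk(self.top_k, dim=-1)   # (n_tokens, top_k)
        weights = weights / weights.sum(dim=-1, keepdim=True)

        # Each token's output is the weighted sum of its k selected experts;
        # all other experts' parameters stay inactive for that token.
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out
```

The routing flexibility is where the fine-grained design pays off: choosing 4 of 16 experts gives C(16,4) = 1,820 possible expert combinations per token, versus C(8,2) = 28 for an 8-expert/2-active design like Mixtral's — the "65x more combinations" figure Databricks cites.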

See the launch announcement and benchmark results [here](https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm).

#moe

Specifications

Provider: Databricks
Context Length: 32,768 tokens
Input Types: text
Output Types: text
Category: Other
Added: 3/29/2024
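Since the weights are open, you can also run the instruct model directly. Below is a minimal sketch using Hugging Face transformers with the databricks/dbrx-instruct checkpoint; the prompt and generation settings are arbitrary examples, older transformers versions need trust_remote_code=True, and the full model requires very large amounts of GPU memory, so treat this as illustrative rather than a deployment recipe.

```python
# Minimal sketch: chat with DBRX Instruct via Hugging Face transformers.
# Assumes the databricks/dbrx-instruct checkpoint; the full model is far
# too large for a single consumer GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("databricks/dbrx-instruct")
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dbrx-instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # shard the model across available GPUs
)

messages = [{"role": "user",
             "content": "Explain mixture-of-experts in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:],
                       skip_special_tokens=True))
```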

Use DBRX 132B Instruct and 200+ more models

Access all the best AI models in one platform. No API keys, no switching between apps.