About LFM 40B MoE
LFM 40B MoE is Liquid's 40.3B-parameter Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamical systems.
LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.
See the [launch announcement](https://www.liquid.ai/liquid-foundation-models) for benchmarks and more info.
Specifications
- Provider: Liquid
- Context Length: 32,768 tokens
- Input Types: text
- Output Types: text
- Category: Other
- Added: 9/30/2024
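
The listing specifies text input, text output, and a 32,768-token context window. Below is a minimal sketch of a text-in / text-out request; the endpoint URL, model identifier, and OpenAI-style payload shape are assumptions for illustration, not part of this listing, so substitute the values from your provider's documentation.

```python
# Minimal sketch of a text-in / text-out request to LFM 40B MoE.
# The endpoint URL, API key, model identifier, and payload fields below are
# assumptions (an OpenAI-style chat-completions format), not documented here.
import requests

API_URL = "https://example.com/v1/chat/completions"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                              # hypothetical credential

payload = {
    "model": "lfm-40b-moe",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize the idea behind Liquid Foundation Models."}
    ],
    # Prompt plus completion must fit within the 32,768-token context window.
    "max_tokens": 512,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```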