
LiquidAI: LFM2-24B-A2B

About LFM2-24B-A2B

LFM2-24B-A2B is the largest model in the LFM2 family of hybrid architectures designed for efficient on-device deployment. Built as a 24B-parameter Mixture-of-Experts model with only 2B parameters active per token, it delivers high-quality generation while keeping inference costs low. The model fits within 32 GB of RAM, making it practical to run on consumer laptops and desktops without sacrificing capability.
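The 32 GB figure can be sanity-checked with back-of-envelope arithmetic. The precision choices below are assumptions for illustration (the source does not state the weight format): at fp16 (2 bytes/parameter) the full 24B weights would exceed 32 GB, so the claim is consistent with a quantized deployment such as 8-bit weights.

```python
# Rough weight-memory estimate for a 24B-parameter MoE model.
# Illustrative arithmetic only; real deployments add KV-cache and
# runtime overhead on top of the raw weight footprint.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory (GiB) needed to hold the model weights."""
    return n_params * bytes_per_param / 1024**3

TOTAL_PARAMS = 24e9   # all experts combined
ACTIVE_PARAMS = 2e9   # parameters activated per token (compute cost)

fp16_gb = weight_memory_gb(TOTAL_PARAMS, 2)  # ~44.7 GiB: too big for 32 GB
int8_gb = weight_memory_gb(TOTAL_PARAMS, 1)  # ~22.4 GiB: fits in 32 GB RAM

print(f"fp16: {fp16_gb:.1f} GiB, int8: {int8_gb:.1f} GiB")
```

Note that the MoE design decouples the two numbers: all 24B parameters must reside in memory, but only the ~2B active per token are multiplied through on each forward pass, which is where the low inference cost comes from.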

Specifications

Provider: Liquid
Context Length: 32,768 tokens
Input Types: text
Output Types: text
Category: Other
Added: 2/25/2026
